Wednesday, Nov 5, 2014

Still more on the concept/value of a "science of science journalism"

From correspondence with a friend:

You mentioned you were eager to learn what I had in mind about how to use the science of science communication to improve science journalism.  I'm sure you can guess what I'd say: you tell me -- & I'll measure!  

We've talked about this philosophy, of course.  I think all the professions that traffic in the dissemination of what's known by science can benefit from the use of science's signature methods to improve their craft.  Not b/c those methods furnish a substitute for the exercise of professional judgment or craft sense; but b/c they are suited for generating information -- and inspiring informative action -- that those with professional judgment would recognize as valuable.  

These methods are uniquely suited for doing that, I think, on questions that experienced professionals themselves recognize as having competing plausible answers. In that situation, there will be no progress through more & more talk, in the form, say, of perennial panels that rehash the opposing positions yr after yr at professional conferences, as predictably entertaining as those are!

What's needed are appropriate tests -- ones designed to generate observations the nature of which will give the professionals at least some more reason than they had previously for crediting one or another of their competing surmises.

Those tests are unlikely to definitively resolve any particular disputed issue! 

But they can be expected to infuse new information into science journalists' own continuous process of professional self-assessment.  They can be expected, too, to inspire particular practitioners to try something new in their work, generating outcomes that can themselves supply a basis for additional reflective assessment.

As a result, the ongoing critical engagement of science journalists with their own craft norms will unfold in a manner that these professionals will themselves find more satisfying.

But if you ask me what to do, then you are not fully grasping what I'm saying!  

I am not of your profession; I don't have your craft sense, your professional judgment.  

There are some things that I can do, using my own craft judgment and skill to the best of my ability, that will give you relevant information.  I can do an experiment, for example, designed to pit two of your plausible conjectures against each other & generate the information that would give you more reason to view one or the other as more likely true than you previously had.  

But you must tell me what the plausible conjectures are. 

You must tell me whether the design I have crafted is such that it really will generate a result that those w/ professional judgment would regard as supporting the sort of inference I'm describing. 

And most important of all, once we are done with that experiment, you must tell me what you think can be done in the real world -- the particulars of which were stripped away in our study so we could be confident we knew what was happening and why -- to reproduce the effect observed in the lab. At which point, I will again help by measuring: that is, by applying my knowledge to figure out how to fit to your real-world activity some apparatus for collecting observations, on the basis of which you will have more reason than you otherwise would have had to think that what you are doing is or isn't working.

After that -- or better still over the course of the process, at the various stages at which there are observations to share -- you will go to the professional conference & describe what you have been up to. And everyone will talk about what can be learned. Professional judgment will continue to evolve in the way that it always has -- in response to members' reflective engagement with their shared experiences -- but now with the benefit of this additional input on a disputed issue that had been resisting resolution with the information previously at hand.

This is what it is like to have a genuine evidence-based culture within a profession. 

To have people from outside your profession do stylized studies & then purport to tell you what to do is not. Not seeing that is, I think, one reason that science journalists report getting little value from events like the Sackler Science of Science Communication symposia. You actually should be dissatisfied if researchers who do what I do -- conduct studies designed to explore the relative significance of alternative mechanisms thought to be of consequence for one or another aspect of science communication -- tell you "here's what to do"; b/c they don't know how to connect the relevant research to practice & shouldn't pretend to (really really really shouldn't; I think it is in fact unethical for them to peddle “how to” advice manuals and the like to science communicators -- rather than being clear on the need for evidence-based practice “all the way down”).

But in turn, you shouldn't expect that sort of counsel from them!  You have the situation sense that is essential to figuring out how to translate the relevant lab studies into practices that might plausibly link up with what the studies have identified to be the relevant mechanisms; at which point, there is again a role to be played by those who measure. 

So -- don't ask what the science of science communication can do for you; instead ask, "What can I do with the science of science communication for myself?"

I am saying only that making this sort of evidence-based practice a part of the professional culture of science journalism -- along with all the other professions that traffic in disseminating what's known by science -- will make the evolution of its members' professional judgment better by those members' own lights.

That's a hypothesis!  I'm happy to help anyone in these professions test it.

--Dan

p.s. I’ve addressed this before; there is a groundhog-day quality to discussing the need for & character of the “science of science communication."  But that’s okay, b/c you actually can sometimes make the same day a bit better or more complete than it was last time -- & can hyperlink to things that still seem to make sense.

