Monday, June 5, 2017

Scicomm-centerism: Another “selecting on the dependent variable” saga

Okay, this is an argument that I tried to make on my Saturday panel at the World Science Festival & that went over like a fourth-dimension lead balloon. Surely, this is b/c of my own limitations as a science communicator (studying and doing are very different things!).

Cultural polarization affects science but is not about science!

This point—which I invite the 14 billion readers of this blog to help reduce to a better, more descriptive, more evocative sentence—is of vital importance to science communication because it forecloses many explanations of public science controversies and prescriptions for how they should be addressed.

As is usually the case with concepts like these, the best way to make the point is to demonstrate it.

So consider the CCP study reported in They Saw a Protest (2012).

There, subjects, instructed to play the role of mock jurors in a civil trial, watched a film of a political protest that the police had broken up. The protesters claimed they were engaged in peaceful if vigorous debate protected by the First Amendment; the police, in contrast, asserted that the demonstrators had crossed the line into intimidation and coercion, which are not “free speech” for purposes of the U.S. Constitution.

There was an experimental component. We told half the subjects that the protest occurred at an abortion clinic, and that the demonstrators opposed the rights established by Roe v. Wade and its progeny. The other half were told that the protest occurred outside a military recruitment center, and that the demonstrators were criticizing the policy of excluding openly gay and lesbian citizens from the military.

We saw big effects.

Subjects with different cultural values reported seeing different things (protesters blocking pedestrians and screaming in their faces vs. vigorous but peaceful exhortations) if they were assigned to the same condition and thus thought they were watching the same sort of protest.

At the same time, subjects with the same values disagreed with one another on those same facts, and on the proper disposition of the case (an order enjoining the police from interfering with future protests), if they were assigned to different experimental conditions and thus thought they were watching different kinds of protests (anti-abortion vs. nondiscriminatory military recruitment).

These are the same groups that are divided over issues like climate change, nuclear power, fracking etc.

This is a powerful demonstration of how cultural cognition can generate culturally polarized reactions to facts.  But obviously the trigger of such conflict had nothing to do with science—nothing to do, that is, with specific science issues or with one or another group’s position on any such issue.

Because the mechanisms at work in the study (identity-protective cognition, in particular) are the same ones at work in debates over what is known about nuclear power, climate change, fracking, the HPV vaccine, etc., the study strongly suggests that scholars and activists who are centering their attention exclusively on comprehension of science are making a grievous error.

That is, if what is clearly not a disputed-science conflict provokes exactly the species of motivated reasoning that divides cultural groups on science, then it is implausible to believe that anything intrinsic to science—e.g., the “uncertainty” of science, “trust in science,” “acceptance of the authority of science” etc.—drives cultural polarization on science issues, or that remedies designed specifically to address those kinds of barriers to comprehension of science will have any effect.

This is another instance of the errors of inference one can make if one selects on the dependent variable—that is, populates his or her set of observations with ones that presuppose the truth of the hypothesis being tested.  Here, science communication scholars and practitioners are formulating their explanations of, and prescriptions for, science conflicts without reference to whether the cognitive and social dynamics in question apply in any non-science setting.

What I’m saying here does not imply there aren't solutions to public conflicts over science—only that the solutions that treat science conflicts as unique or as uniquely focused on mechanisms of comprehension of science are bound to be mistaken.

Indeed, once one recognizes that many non-science cultural conflicts exhibit exactly the same sorts of biases in factual perceptions, one's range of potential explanations and remedies is likely to widen, and in any case to become much more accurate and effective—because one will then have access to the empirical work of scholars who’ve been studying pertinent dynamics of cultural polarization outside the setting of science controversies.

What are those researchers discovering?

That’s open to debate, of course. But in my view, one of their most important insights is the “law of social proof”—the principle that individuals will generally conform their behavior to that of others with whom they identify and whom they understand to be informed, socially competent actors.

In any case, I’m less concerned with identifying exactly what casting aside the science-controversy blinders will teach us than I am with helping science communicators to understand why they should cast the blinders aside. 

If they did that--if they jettisoned the “scicomm centerism” that is now shaping their work—what they’d be enabled to see would vastly enrich their craft.


Reader Comments (21)

RIP proofreading. And from a Yale blog.

June 5, 2017 | Unregistered CommenterTony Criswell

Dan -

=={ if what is clearly not a disputed-science conflict provokes exactly the species of motivated reasoning that divides cultural groups on science, then it is implausible to believe that anything intrinsic to science—e.g., the “uncertainty” of science, “trust in science,” “acceptance of the authority of science” etc.—drives cultural polarization on science issues, or that remedies designed specifically to address those kinds of barriers to comprehension of science will have any effect. }==


I have asked you many times for quite a while to explain what measures you use to distinguish a "science-communication" problem from communicative patterns that play out (in, it seems to me, a more or less identical way) in any variety of other contexts where there is polarization in association with identity-orientation. I can't recall you ever giving me an answer to those questions (at least not one I could understand).

Along a similar line, it seems likely to me that the contributing influence of "scientific curiosity" you describe is actually a particular manifestation of the influence of attributes more deeply rooted in human cognition and psychology - such as the influence of personal experience or culture of origin.

June 5, 2017 | Unregistered CommenterJoshua

=={ if what is clearly not a disputed-science conflict provokes exactly the species of motivated reasoning that divides cultural groups on science, then it is implausible to believe that anything intrinsic to science—e.g., the “uncertainty” of science, “trust in science,” “acceptance of the authority of science” etc.—drives cultural polarization on science issues, or that remedies designed specifically to address those kinds of barriers to comprehension of science will have any effect. }==

Which is why I have been pounding on the principles of conflict resolution and participatory democracy as vehicles for approaching polarization around climate change. I also think that there is evidence about 'meta-cognition' which applies as well - which is why I believe that focusing on the mechanism of motivated reasoning in and of itself is a way to mitigate the influence of motivated reasoning. Even if it doesn't make motivated reasoning go away, it provides a tool for recognizing and mitigating its influence.

June 5, 2017 | Unregistered CommenterJoshua

@Joshua

1. Can you point out the previous instances in which you raised the issue? I don't recall them, so I can't say whether I've shifted positions or simply become less obtuse.

2. "more deeply rooted" --- than what?

June 5, 2017 | Registered CommenterDan Kahan

Dan -

A couple of examples,... from this thread:

Also - from a theoretical perspective, why do you distinguish "science-based" communication from any other form of communication? What does "science-based communication" mean? What are the factors that differentiate "science-based" communication - as something other than just a (more or less) arbitrary selection of communication topic? I feel like I've asked you that question before. Maybe you've answered it but I didn't understand your answer?

and

....but I still have basically the same question. What differentiates evidence-based science communication from any other brand of evidence-based communication - say evidence-based communication about gun control or the merits/demerits of Obamacare? Why do you single out evidence-based science communication? Merely because that is your point of interest, or because you believe it is somehow a different species than other domains of evidence-based communication?

June 5, 2017 | Unregistered CommenterJoshua

Cultural polarization affects science but is not about science!

Maybe, sloganized: science is caught in the cross-hairs of polarization.

June 5, 2017 | Unregistered CommenterMichael pershan

It would be interesting to see what "objective" measures show. Perhaps people given no information about the setting for the protests? Or even better, a computer that could score the events of interest. Of course, then, it would be necessary to define a boundary between appropriate and inappropriate behavior, an ambiguity that might contribute to the effect. Reminiscent of the much older study of students from two schools looking for violations in a movie of a football game between their respective teams (was it Yale vs. Princeton?). Would also be interesting to see a similar study with situations where there is a well-accepted position by experts (e.g., climate change). All biases are probably not equal.

June 5, 2017 | Unregistered CommenterJim Clark

And btw -

you say:

=={ That is, if what is clearly not a disputed-science conflict provokes exactly the species of motivated reasoning that divides cultural groups on science, then it is implausible to believe that anything intrinsic to science—e.g., the “uncertainty” of science, “trust in science,” “acceptance of the authority of science” etc.—drives cultural polarization on science issues, or that remedies designed specifically to address those kinds of barriers to comprehension of science will have any effect. }==

You seem to separate out science curiosity from the domain of other science-related factors that are "intrinsic to science." I don't understand that. Why wouldn't it just be curiosity that happens to play out - for some people - in a science-related domain? If so, then wouldn't the same root, a more deeply rooted characteristic of curiosity (or even another attribute that seems to manifest as curiosity), play out for other people in other domains?

June 5, 2017 | Unregistered CommenterJoshua

I think there are some important differences between this Rashomon-like experiment vs. those about science issues. It might still be the case that what is going on is "exactly the species of motivated reasoning that divides cultural groups on science". However, that's a strong assumption to make without consideration of the differences first.

Some key differences:

1. people generally consider themselves competent in judgement of others in social circumstances, but not competent (or at least much less competent) in judgement of scientific results (Dunning-Kruger effects aside)

2. large affect is apparent in such social cases but not in science issues - such that dispassionate judgement abilities may be reduced

3. science is an attempt to remove bias and direct itself towards objective truths - how this is understood could impact judgement of scientific results

June 5, 2017 | Unregistered CommenterJonathan

Jonathan -

I'm struggling to understand your comment.

=={ 1. people generally consider themselves competent in judgement of others in social circumstances, but not competent (or at least much less competent) in judgement of scientific results (Dunning-Kruger effects aside) }==

I'm trying to figure out how you think that plays into the phenomenon at hand. Are you saying that a more widespread insecurity about evaluating science would lead to a likelihood that something intrinsic to science (or how people relate to science) drives cultural polarization on science issues?


=={ 2. large affect is apparent in such social cases but not in science issues - such that dispassionate judgement abilities may be reduced }==

I'm confused by that statement - perhaps I'm not sure what you're meaning by "affect" there. Are you saying that an affect of insecurity is more apparent in social cases than it is in science issues...meaning that dispassionate judgement abilities are reduced for science issues?


=={ 3. science is an attempt to remove bias and direct itself towards objective truths - how this is understood could impact judgement of scientific results }==

How does that play out w/r/t the question of whether something intrinsic to science drives polarization on science-related issues?

June 5, 2017 | Unregistered CommenterJoshua

Joshua,

I'm suggesting with these 3 differences that I would expect less polarization on scientific results than on social situation cases. And, as a result, the scientific result divide should be easier to remedy. Perhaps much easier. I gather from Dan's comments that he thinks any research that attempts to solve the scientific divide will fail if it doesn't consider that divide to be merely a symptom of the wider ranging social divide. You seem to think the same. I'm not so sure. I think the focus in the scientific cases is on what cognitive processes are preventing what is already a substantial attempt (due to these differences) to establish an objective judgment all can agree with from doing so. That's different from not making any attempt at objectivity and leaving the complete judgment up to the subjects. I'm not saying that there is something intrinsic to science driving the polarization - exactly the opposite - I'm saying that there is a great deal intrinsic to science that should prevent, or at least substantially lower, polarization.

On 1: It might be the case that the folk belief in their own competence on social situation judgments vs. scientific result judgments is such that they are more inclined to base those social judgments on intuitive non-reflective (system 1) cognition. With scientific results, I would expect much reliance on system 2. That for example shows up in Dan's classic result with high OSI individuals dividing more on science than low OSI ones. Would you expect high emotional intelligence individuals to divide more than low emotional intelligence ones in the social cases?

On 2: the affect I'm referring to is in the experience of watching the film. Viewers are going to see lots of emotional content and in real time (subject to the situation cue prior to the film) decide who they identify with vs. oppose in the film. That emotional content combined with the identification is likely to skew their subconscious selection of what events to commit to memory, as well as the severity of events they recall.

On 3: The scientific result cases are judgments of someone else's - an expert's - analysis, and where the scientific model of analysis should indicate that an attempt has been made to bypass any experimenter biases.

Let's formulate a test of the social case that's closer to the scientific cases we've discussed. Suppose that subjects are given reports by judges (social case experts) that evaluate potential criminal activity at the abortion clinic or recruitment center. Instead of viewing the films themselves, they read comments and evaluations by those judges (that viewed the films) that list the events and evaluate the behaviors of people. This might not completely address the above differences (judges might be viewed as more politically motivated than scientists, the situations are still social where people feel more confident in their own judgments, etc.), but I would expect a smaller identity-protecting effect than in the direct film viewing test.

Science is perhaps the institution where one would expect the least impact from an identity-protecting effect. Suggesting that one can't hope to solve the remaining identity-protective effect in science without addressing the wider one in subjective social situations seems to me to be rather defeatist.

June 5, 2017 | Unregistered CommenterJonathan

@Joshua-- I would regard any communication that involves presenting empirical evidence as within bounds of "science communication." Accordingly, gun control debate is pervaded by "science communication" issues, as is debate over consequences of Obamacare

What people see protestors doing in a video does not present any science communication issue.

June 5, 2017 | Registered CommenterDan Kahan

@JimClark--

The protest study alludes to the 1950s study you had in mind, which was entitled "They saw a game."

For "saw a game" in science communication, consider, e.g., Kahan, D.M., Jenkins-Smith, H. & Braman, D. Cultural Cognition of Scientific Consensus. J. Risk Res. 14, 147-174 (2011).

June 5, 2017 | Registered CommenterDan Kahan

Jonathan -

Let me try to unpack that a bit...


=={ I think the focus in the scientific cases is on what cognitive processes are preventing what is already a substantial attempt (due to these differences) to establish an objective judgment all can agree with from doing so. That's different from not making any attempt at objectivity and leaving the complete judgment up to the subjects. }==

I'm a bit confused about who the actors are in those two scenarios. Who is doing the focusing in the scientific cases? Is that in contrast to some other actor who is not making an attempt at objectivity in the "social cases"?


=={ On 1: It might be the case that the folk belief in their own competence on social situation judgments vs. scientific result judgments is such that they are more inclined to base those social judgments on intuitive non-reflective (system 1) cognition. With scientific results, I would expect much reliance on system 2. }==

Do you think that there is evidence that the average person approaches scientific questions in a manner that is more reflective than their approach to other (social) evidence-based questions? I would question that. In fact, I have often found with students that when they think they're starting to approach questions where scary science-associated reasoning might be involved (scary if they don't view themselves as particularly capable of "doing science"), they are more inclined to shut down their analytical processes and reach reflexively towards more intuitive forms of opinion-formation.

=={ Would you expect high emotional intelligence individuals to divide more than low emotional intelligence ones in the social cases? }==

Maybe. I have always been somewhat curious about Dan's confident conclusions of causality, and direction of causality, when examining the positive correlation between "science knowledge" or "cognitive reasoning" and polarization on science-related issues. IMO, it may well be that people who are more inclined towards polarization are those who develop the science knowledge and reasoning skills that he is measuring with a very specific set of metrics. I am less inclined to think what he's measuring is reflective of some domain-independent attributes (i.e., that some people have higher-functioning cognitive skills across all domains as opposed to higher-functioning cognitive skills within particular domains).

Let's imagine taking artists who feel very competent in their ability to assess art...wouldn't they be more polarized in their assessments of the quality of art? How about talented basketball players, might they not be more polarized in their views as to whether Lebron or Kobe was a better player? Musicians on the genius of Bob Marley or Bob Dylan?

=={ On 2: the affect I'm referring to is in the experience of watching the film. Viewers are going to see lots of emotional content and in real time (subject to the situation cue prior to the film) decide who they identify with vs. oppose in the film. That emotional content combined with the identification is likely to skew their subconscious selection of what events to commit to memory, as well as the severity of events they recall. }==

I would certainly agree with that, but I'm a bit lost as to how that ties back to why motivated reasoning in science-related questions is not directly related to underlying aspects of how people are "motivated" more generally in assessing situations in line with their ideological orientation.

=={ On 3: The scientific result cases are judgments of someone else's - an expert's - analysis, and where the scientific model of analysis should indicate that an attempt has been made to bypass any experimenter biases. }==

But the judgement of the expert's expertise rests with the non-expert. They have no way to assess expertise independent of their own "motivated" processes.

=={ Suppose that subjects are given reports by judges (social case experts) that evaluate potential criminal activity at the abortion clinic or recruitment center. Instead of viewing the films themselves, they read comments and evaluations by those judges (that viewed the films) that list the events and evaluate the behaviors of people. This might not completely address the above differences (judges might be viewed as more politically motivated than scientists, the situations are still social where people feel more confident in their own judgments, etc.), but I would expect a smaller identity-protecting effect than in the direct film viewing test. }==

I would also. It is certainly a relevant question, IMO. But again, I'm not clear how that makes the case that there is something intrinsically different about the proclivity towards motivation when evaluating evidence-based scientific communication in comparison to evidence-based communication in non-science related questions.

Sorry for being so obtuse.

=={ Science is perhaps the institution where one would expect the least impact from an identity-protecting effect. }==

Not sure I understand, but I would say that, due to the application of the scientific method, scientists engaged in science activities are likely to manifest less "motivation"-biased reasoning than non-scientists engaged in reasoning where the scientific method is not explicitly applied, or even than scientists engaged in reasoning where the scientific method is not explicitly applied. Nonetheless, I don't see anything suggesting that the communication of evidence-based information as part of opinion formation in a science realm is less likely to induce "motivated" opinion formation than communication of evidence-based information in a non-science-related realm.

=={ Suggesting that one can't hope to solve the remaining identity-protective effect in science without addressing the wider one in subjective social situations seems to me to be rather defeatist. }==

I'm still largely confused as to exactly what we're talking about... but it seems to me that I'm talking about the communication of evidence in the process of opinion formation in different realms (science vs. non-science) and you're talking about engagement in scientific method-based reasoning (a process that can apply to non-scientific realms). I am not defeatist about the positive effects of the scientific method in reducing the biasing influence of identity-associated "motivations" (although I certainly don't see applying the scientific method as some sort of easy inoculation).

Does anything I wrote there make any sense? (and sorry for the long comment, I agree that less is more in blog comment writing, although I'm terrible at applying that rule of thumb). Obviously, feel free to only address one small portion of what I wrote (if anything at all).

June 5, 2017 | Unregistered CommenterJoshua

Dan -

I guess I don't understand what criteria you use to determine a binary distinction between "science communication" and non-science communication.

June 5, 2017 | Unregistered CommenterJoshua

@Joshua--

Nor do I understand yours.

This is science communication. The NAS report conveys information derived by science's distinctive method of disciplined observation, measurement, and inference.

People relating what they gleaned by brute sense impression is not science communication; science's distinctive methods have nothing to do with the information conveyed.

Pretty simple, no?

June 6, 2017 | Registered CommenterDan Kahan

@Jonathan--

Lay people's factual judgments are guided by culture-infused affective reactions--whether they are forming assessments of a movie of protestors (or football referee's calls) or making judgments about a science study on risks of climate change, on deterrent effects of concealed-carry laws, or on safety of nuclear waste disposal.

June 6, 2017 | Registered CommenterDan Kahan

Dan -

=={ Pretty simple, no? }==

Perhaps so for normal (or smart) people, but it doesn't seem that simple to me.

For example:

You say... Accordingly, gun control debate is pervaded by "science communication" issues, as is debate over consequences of Obamacare.

And I see something like this:

"[t]o me, while reading that Section of the bill, it became so evident that there would be a panel of bureaucrats who would decide on levels of health care, decide on those who are worthy or not worthy of receiving some government-controlled coverage ... Since health care would have to be rationed if it were promised to everyone, it would therefore lead to harm for many individuals not able to receive the government care. That leads, of course, to death." ...

or this:

"... spoke a lot about the rationing of care that was going to be a part of Obamacare, and, you know, I was about laughed out of town for bringing to light what I call death panels, because there's going to be faceless bureaucrats who will—based on cost analysis and some subjective idea on somebody's level of productivity in life—somebody is going to call the shots as to whether your loved one will be able to receive health care or not. To me, death panel. I called it like I saw it, and people didn't like it."

And I see someone who believes that she is engaged in weighing evidence in an analytical process. I would guess that Sarah thinks that she is conveying information derived by distinctive methods of disciplined observation, measurement, and inference. Now I might not really have much confidence in the quality of her evidence-evaluation, but I kind of get the impression that you wouldn't characterize Sarah as engaging in "science communication" there, and I don't see a hard line between what she's engaged in and what you are considering to be "science communication."

I get that it might seem that I'm doing something akin to counting angels on the head of a...but I really am confused about how you differentiate "science communication" from evidence-based communication in any variety of contexts that I would guess you would not call "science communication."

Do you suppose if you did an fMRI of Sarah's brain when she's talking about "death panels" you'd show engagement in different regions of the brain compared to when a physicist is discussing string theory?

And I'm not just trying to argue by contrasting with absurd examples from cartoonish characters. For example - how about this?: Would that be "science communication"? If so, and Sarah's death panels aren't, then what is the objectively defined difference?

June 6, 2017 | Unregistered CommenterJoshua

sorry... inextricable...not intractable.

June 6, 2017 | Unregistered CommenterJoshua

oops. Wrong thread on that last one.

June 6, 2017 | Unregistered CommenterJoshua

Dan and others -

I realize that the definition of "science communication" seems simple to Dan, but I was hoping for more clarification to help me understand... it looks like it isn't forthcoming from Dan.

So I'll appeal to smart readers who might stumble by.

I still don't understand what a coherent definition of "science communication" is...particularly in such a way as to lend insight as how it might be associated with a somewhat unique mode of reasoning.

For the example I gave above, it seems to me that Sarah is evaluating evidence in order to reach conclusions, and communicating about her conclusions. Is that "science communication?"

Even if other people might find her process of evaluating evidence to be poor, it seems to me that what matters is her conceptualization of what she is doing. If we expect to describe attributes associated with her reasoning, it would seem to me that what matters is her view of the process she's engaged in.

So if she is engaged in "science communication" there, then what might distinguish her "communication" there from any other person communicating opinions related to an issue of controversy?

Or maybe Sarah isn't engaged in "science communication" there - in which case I'm hoping someone could lay out a clear set of criteria spelling out the differences between what she's engaged in and what scientists are engaged in when they discuss, say, whether or not GMOs are harmful.

What makes this particularly interesting to me is that, IMO, there was a recent example of Sarah displaying a tendency towards controlling for "motivated reasoning" when her response to Trump's "crony capitalism" is contrasted with the responses of many other leading Republicans in the whole Carrier situation.

June 9, 2017 | Unregistered CommenterJoshua
