
Wednesday, August 3, 2016

Science curiosity vs. politically motivated reasoning: An experimental steel cage match!

So today I’ll finally tell you what we did in the information-seeking experiment featured in our new paper “Science Curiosity and Political Information Processing.”

It was pretty darn simple.

We assigned subjects to one of two conditions. In each, subjects were presented with two news story headlines: a “climate realist” one, which announced that scientists had uncovered evidence consistent with human-caused climate change; and a “climate skeptical” one, which announced that scientists had uncovered evidence that qualified or called into question the human contribution to climate change.

The difference in the conditions concerned the relative novelty of the opposing pieces of scientific evidence being featured in the respective headlines.

Thus, in Condition 1—“Skeptical unsurprising, Realist surprising”—the respective newspaper headlines were “Scientists Find Still More Evidence that Global Warming Actually Slowed in Last Decade” and “Scientists Report Surprising Evidence: Arctic Ice Melting Even Faster Than Expected.”

In contrast, in Condition 2—“Skeptical surprising, Realist unsurprising”—the respective headlines read, “Scientists Report Surprising Evidence: Ice Increasing in Antarctic, Not Currently Contributing To Sea Level Rise” and “Scientists Find Still More Evidence Linking Global Warming to Extreme Weather.”

Subjects were instructed to “pick the story most interesting to you,” and told they’d be asked some questions after they finished reading it.
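If it helps to see the design laid out explicitly, here is a minimal sketch (in Python, purely illustrative, not code or materials from the study) of the two conditions as pairings of stance and novelty:

```python
# Illustrative layout of the two conditions described above, as pairings of
# stance ("realist"/"skeptical") and novelty ("surprising"/"unsurprising").
# This is just a sketch of the design, not anything from the study itself.

conditions = {
    "Condition 1 (Skeptical unsurprising, Realist surprising)": [
        ("skeptical", "unsurprising",
         "Scientists Find Still More Evidence that Global Warming Actually Slowed in Last Decade"),
        ("realist", "surprising",
         "Scientists Report Surprising Evidence: Arctic Ice Melting Even Faster Than Expected"),
    ],
    "Condition 2 (Skeptical surprising, Realist unsurprising)": [
        ("skeptical", "surprising",
         "Scientists Report Surprising Evidence: Ice Increasing in Antarctic, Not Currently Contributing To Sea Level Rise"),
        ("realist", "unsurprising",
         "Scientists Find Still More Evidence Linking Global Warming to Extreme Weather"),
    ],
}

# Each subject sees one condition and picks one story; the outcome of interest
# is whether the chosen story is counterattitudinal given the subject's politics.
for name, stories in conditions.items():
    print(name)
    for stance, novelty, headline in stories:
        print(f"  [{stance}/{novelty}] {headline}")
```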

Aversion to “counterattitudinal” information—that is, information that is contrary to one’s political outlooks—is one of the hallmarks of politically motivated reasoning.  When given the option, partisans tend to seek out information that is consistent with their predispositions rather than information that is contrary to them (Hart, Albarracín et al. 2009).

That’s exactly what we observed among subjects who were relatively low in science curiosity.

Among subjects who were relatively high in science curiosity, however, we saw the opposite effect.  Thus, relatively right-leaning science-curious subjects—who tended to be climate skeptical—nevertheless preferred the novel or “surprising” realist news story over the unsurprising skeptical story.

Likewise, relatively left-leaning science-curious subjects—who tended to be climate concerned—preferred the surprising skeptical story over the unsurprising realist one.

The effect sizes, moreover, were quite large: moderately science-curious subjects were on average 32 percentage points (± 19 at the 0.95 level of confidence) more likely to select the story contrary to their political predispositions than were moderately science-incurious ones.
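To make a figure like that concrete, here is a minimal sketch, using invented counts rather than the study's data, of how a difference in counterattitudinal story selection and a rough 95% interval for it can be computed:

```python
import math

# Invented counts, for illustration only -- not the study's data.
# "hi" = relatively science-curious subjects, "lo" = relatively incurious ones.
hi_contrary, hi_n = 28, 50   # chose the counterattitudinal story / total
lo_contrary, lo_n = 12, 50

p_hi = hi_contrary / hi_n
p_lo = lo_contrary / lo_n
diff = p_hi - p_lo

# Normal-approximation standard error for a difference in proportions,
# and a 95% confidence interval (z ~= 1.96).
se = math.sqrt(p_hi * (1 - p_hi) / hi_n + p_lo * (1 - p_lo) / lo_n)
margin = 1.96 * se

print(f"difference = {diff:.0%} +/- {margin:.0%} (95% CI)")
```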

We were motivated to investigate this hypothesis by an unexpected observation in our “science of science filmmaking” studies.  As subjects’ science curiosity increased, their perceptions of contentious risks tended to move in the same direction regardless of their political outlooks. Moreover, high-curiosity subjects seemed to resist the normal tendency of individuals to polarize as their proficiency in science comprehension increased.

We surmised that these individuals might be indulging their appetite for surprise by more readily examining evidence that contravened their political predispositions.  Being exposed to a greater volume of “counterattitudinal data,” they’d form views that were more uniform, and less prone to polarization conditional on science comprehension.
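To make the surmise concrete, here is a toy simulation (invented for illustration; not a model from the paper) in which agents who rarely read counterattitudinal evidence stay polarized while agents who usually read it converge:

```python
import random

random.seed(1)

# Toy simulation of the surmise above: agents update a belief from a stream of
# evidence, but "incurious" agents mostly skip counterattitudinal items while
# "curious" agents read nearly everything. Invented for illustration only.

TRUE_SIGNAL = 0.7   # fraction of evidence items pointing in the "realist" direction

def run_agent(prior, p_read_contrary, n_items=200):
    belief = prior          # running estimate of the realist share of the evidence
    read = []
    for _ in range(n_items):
        item = 1 if random.random() < TRUE_SIGNAL else 0   # 1 = realist, 0 = skeptical
        congenial = (item == 1) == (belief >= 0.5)
        # Counterattitudinal items are read only with probability p_read_contrary.
        if congenial or random.random() < p_read_contrary:
            read.append(item)
            belief = (prior + sum(read)) / (1 + len(read))
    return belief

for label, p_contrary in [("incurious", 0.2), ("curious", 0.9)]:
    left = run_agent(prior=0.9, p_read_contrary=p_contrary)    # starts climate-concerned
    right = run_agent(prior=0.1, p_read_contrary=p_contrary)   # starts climate-skeptical
    print(f"{label:>9}: left {left:.2f}, right {right:.2f}, gap {abs(left - right):.2f}")
```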

The experiment results supported this hypothesis.

Does this “prove” that science curiosity negates politically motivated reasoning?

No!

It’s a mistake to think empirical evidence ever proves anything.

What it does, if it is the product of a valid design, is furnish more reason than one otherwise would have had for crediting one competing account of some phenomenon over another. 

As I explained yesterday, the hypothesis that science curiosity offsets politically motivated reasoning is a plausible conjecture—but so is the hypothesis that science curiosity, like other cognitive elements of science comprehension, magnifies this biased form of information processing.

On the scale that registers the strength of the evidence for these respective hypotheses, the experiment result puts an increment of weight down on the side of the first hypothesis.

How much weight?

Well, you can decide that!
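For readers who like the Bayesian framing spelled out: treat the experiment as supplying a likelihood ratio and multiply your own prior odds by it. A minimal sketch, with a placeholder likelihood ratio rather than an estimate from the paper:

```python
# Minimal sketch of updating on a study result, Bayesian style.
# The likelihood ratio below is a placeholder, not an estimate from the paper.

def update_odds(prior_odds, likelihood_ratio):
    """Posterior odds = prior odds x likelihood ratio."""
    return prior_odds * likelihood_ratio

prior_odds = 1.0        # indifferent between "curiosity offsets" and "curiosity magnifies"
likelihood_ratio = 3.0  # how much more probable the result is under "curiosity offsets"

posterior_odds = update_odds(prior_odds, likelihood_ratio)
print(f"posterior odds = {posterior_odds:.1f}, "
      f"posterior probability = {posterior_odds / (1 + posterior_odds):.2f}")
```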

But if you are curious about our own views, read the paper: It catalogs our qualifications and sources of residual uncertainty—and outlines a set of questions for further investigation.

We’re really curious to see if this result stands up to even more critical testing!

References

Hart, W., Albarracín, D., Eagly, A.H., Brechan, I., Lindberg, M.J. & Merrill, L. Feeling validated versus being correct: a meta-analysis of selective exposure to information. Psychological Bulletin 135, 555-588 (2009).

Kahan, D.M., Landrum, A.R., Carpenter, K., Helft, L. & Jamieson, K.H. Science Curiosity and Political Information Processing. Advances in Political Psychology  (in press).


Reader Comments (6)

Dan -

==> Thus, relatively right-leaning science-curious subjects—who tended to be climate skeptical—nevertheless preferred the novel or “surprising” realist news story over the unsurprising skeptical story. ==>

Perhaps part of the reason they preferred that framing is that calling the information "surprising" connotes that the more mainstream/standard view contradicts the article, i.e., that previous evidence supports a conclusion (in this case, that ice melting is relatively slow) which in turn is aligned with their own orientation.

==> Likewise, relatively left-leaning science-curious subjects—who tended to be climate concerned—preferred the surprising skeptical story over the unsurprising realist one. ==>

Perhaps they preferred a story suggesting that Antarctic ice increasing is in contrast to the preexisting science (runs counter to the evidence previously assembled, and quite possibly is an outlier) because it would suggest that their previous view (that Antarctic ice was decreasing, or increasing at a relatively slower rate) was reasonable given the evidence available at the time. In that way, even if their previous view was wrong based on new evidence, it could be argued that they had good, evidence-based reasons for formulating that previous view.

Of course, that doesn't explain the discrepancy you found when comparing the "scientifically curious" to their counterparts.

Out of my own "curiosity," did you test for an association between strength of ideological/partisan identification and the trait you refer to as scientific curiosity?

August 3, 2016 | Unregistered CommenterJoshua

@Joshua-- right: on your theory, one wouldn't expect a discrepancy between curious & noncurious, whereas one would if the subjects understood the headlines as we intended (you say "in part," but if in part there were the interpretations or motivations you surmise, it would actually have reduced our effect size on curiosity & counterattitudinal information, so it's not a confound).

There is a small ideology effect for science curiosity, yes. It was featured in a previous post, as was the connection between SCS & belief in human-caused climate change.

August 3, 2016 | Registered CommenterDan Kahan

Dan -

==> it would actually have reduced our effect size on curiosity & counterattitudinal information, so it's not a confound ==>

Reduced it from what it would have been otherwise?

I suspect that I'm not as convinced as you about the consistency in how theories and evidence can explain human behavior. Not to disparage your work in formulating those theories and collecting that evidence...

At any rate, I try to think of how I might react as a way of wondering about what might be going on for others...

Were I to read something like..."scientists are surprised to find that the climate isn't warming as quickly as previously predicted..." I think I might respond in a few ways. One might be to immediately doubt the veracity of the claim. Since I can't really evaluate the science, I might look critically at who the scientists were to try to identify ideological markers on their part (to help me in reconciling their view that runs counter to the view that I'm relatively more strongly identified with).

Another way that I might respond is with a bit of relief. Reading such information might give me a "face-saving" way to reconcile the science that I've determined is more probably correct (without actually being able to evaluate the science...hmmm) with the arguments that I've read that present conflicting evidence. I am given a way forward that doesn't require me to flat out reject that conflicting evidence on the basis of heuristics and probabilities that I'm not entirely confident about. I would have something of a "middle way" forward, where I can say, OK, maybe some of the conflicting evidence wasn't complete hokum. Maybe those people presenting that evidence, some of whom didn't exactly seem like complete nutters and some of whom seem to be pretty smart and knowledgeable, were right to some degree, and the scientists I had more confidence in were also right in their interpretation of the previous evidence...but new evidence has come to light, and the scientists I had greater confidence in were open-minded enough and dedicated enough to the scientific method to re-evaluate their views on the basis of evidence. And therefore, my own sense of identity is reinforced in a positive manner: The people I'm more aligned with are open-minded and dedicated to science, and the people I identify less with may have been somewhat right about the science, but they were wrong and off base in their kooky theories about how the scientists I'm aligned with are either stupid or corrupt or just blinded to the real science because of their "religious faith in AGW" dictated by their ideological biases.

FWIW (and I suspect not much) I'll also note that if you go to someplace like WUWT, you will commonly find much disdain and ridicule pretty much whenever scientists publish a report that the effects or likely impact of ACO2 emissions are "worse than we thought" (meaning, that scientists were surprised to find that they underestimated the magnitude of impact). Likewise, if there are reports that scientists have said something to the effect that they were surprised to find that new information suggests that they previously overestimated the impact of ACO2 emissions, they are similarly likely to respond with disdain on the order of, "See, the edifice is crumbling, but of course they won't admit it," or "Finally, it's become so obvious that they're wrong that even they have to admit it," or "those scientists must be very brave to face the wrath of the majority of climate scientists who aren't open to evidence and won't admit their errors," etc.

Of course, WUWT fans are likely to be outliers, but I would suggest that they also are likely to be relatively "scientifically curious" people.

And of course, my speculation about how I might react in certain hypothetical circumstances is likely biased in all kinds of ways, as well as not likely to be particularly instructive for generalizing about the behavior of others....

August 4, 2016 | Unregistered CommenterJoshua

@Joshua--

You are suggesting reasons why partisans in general might want to read surprising information contrary to their predispositions. In general, partisans don't do that; they engage in biased search for & exposure to, as well as biased assimilation or weighting of, evidence on contested issues.

But if, consistent w/ your view, there were something about the way we worded the headlines in this study in particular that made partisans more interested than usual in reading surprising contrary information, then our design would have been working at cross-purposes w/ our hypothesis--viz., that only unusually science-curious partisans would be inclined to stifle the normal motivation to engage in confirmatory information exposure. That would be sad for us, but it would be the design equivalent of selecting an uphill course to try to run a marathon PR: bad idea, but if you still run your PR, all the more impressive.

But I don't myself feel any more impressed w/ the result. I think it is what it is: a case where only curious subjects did what partisans normally don't do-- viz., expose themselves to information contrary to their predispositions.

That said, *as I said* in the post, every empirical study result should be viewed as a solitary Likelihood Ratio in a permanently cycling Bayesian updating process. Every reasoning person should decide for him- or herself how much weight to give this result & revise their views accordingly. Then there should be additional studies that variously try to secure the footing of or pull the rug out from this one.

That to me sounds like loads of fun.

August 5, 2016 | Registered CommenterDan Kahan

"You are suggesting reasons why partisans in general might want to read surprising information contrary to their presidpositions. In general, partisans don't do that; they engage in biased search for & exposure to, as well as biased assimilation or weighting of, evidence on contested issues."

I consider myself a partisan, and yet I'm always keen to read surprising information contrary to my predispositions. What am I to make of that?

August 10, 2016 | Unregistered CommenterNiV

@NiV-- There are two possibilities: (a) you are science curious; (b) you are noise in my data. I could estimate the relative likelihood using the data from the paper-- that is, I could estimate the respective likelihoods that a science incurious person would choose to read surprising information contrary to his or her priors & that a science curious one would. It looks like the likelihood ratio is about 3-- that is, Pr(read contrary evidence|science curious)/Pr(read contrary evidence|science incurious) is in the neighborhood of 3. But I can try to figure that out more precisely (of course, how precise an estimate one can make is constrained by the data & admits of estimation too).

One could then combine that information w/ one's priors about you. Mine are that you are a bit unusual.

August 10, 2016 | Registered CommenterDan Kahan
