

"Enough already" vs. "can I have some more, pls": science comprehension & cultural polarization

The motivation for this post is to respond to commentators—@Joshua & @HalMorris—who wonder, reasonably, whether there’s really much point in continuing to examine the relationship between cultural cognition & like mechanisms, on the one hand, and one or another element of science comprehension (cognitive reflection, numeracy, “knowledge of basic facts,” etc.), on the other.

They acknowledge that evidence that cultural polarization grows in step with proficiency in critical reasoning is useful for, say, discrediting positions like the “knowledge deficit” theory (the view that public conflicts over policy-relevant science are a consequence of public unfamiliarity with the relevant evidence) and the “asymmetry thesis” (the position that attributes such conflicts to forms of dogmatic thinking distinctive of “right wing” ideology).

But, they ask, haven’t all those who are amenable to being persuaded by evidence on these points gotten the message by now?

I agree that the persistence of the “knowledge deficit” view, and to a lesser extent of the “asymmetry thesis” (which I do think is weakly supported, but not nearly so unworthy of being entertained as “knowledge deficit” arguments), likely doesn’t justify sustained efforts at this point to probe the relationship between cultural cognition and critical reasoning.

But I disagree that those are the only reasons for continuing with—indeed, intensifying—such research.

On the contrary, I think focusing on science comprehension is critical to understanding cultural cognition; to forming an accurate moral assessment of it; and to identifying appropriate responses for managing its potential to interfere with free and reasoning citizens’ attainment of their ends, both individual and collective (Kahan 2015a, 2015b).

I should work out more systematically how to convey the basis of this conviction.

But for now, consider these “two conceptions” of cultural cognition and rationality. Maybe doing so will foreshadow the more complete account—or better still, provoke you into helping me to work this issue out in a way that satisfies us both.

1. Cultural cognition as bounded rationality. Persistent public conflict over societal risks (e.g., climate change, nuclear waste disposal, private gun possession, HPV immunization of schoolgirls, etc.) is frequently attributed to overreliance on heuristic, “System 1” as opposed to conscious, effortful “System 2” information processing (e.g., Weber 2006; Sunstein 2005). But in fact, the dynamics that make up the standard “bounded rationality” menagerie—from the “availability effect” to “base rate neglect,” from the “affect heuristic” to the “conjunction fallacy”—apply to people of all manner of political predispositions, and thus don’t on their own cogently explain the most salient feature of public conflicts over societal risks: that people are not simply “confused” about the facts on these issues but systematically divided on them on political grounds.

One account of cultural cognition views it as the dynamic that transforms the mechanisms of “bounded rationality” into fonts of political polarization (Kahan, Slovic, Braman & Gastil 2006; Kahan 2012). Cultural predispositions thus determine the valence of the sensibilities that govern information processing in the manner contemplated by the “affect heuristic” (Peters, Burraston & Mertz 2004; Slovic & Peters 1998). The same goes for the “availability effect”: the stake individuals have in forming “beliefs” that express and reinforce their connection to cultural groups determines what sorts of risk-relevant facts they notice, what significance they assign them, and how readily they recall them (Kahan, Jenkins-Smith & Braman 2011). The motivation to form identity-congruent beliefs likewise drives biased search and biased assimilation of information (Kahan, Braman, Cohen, Gastil & Slovic 2010), not only on existing contested issues but on novel ones (Kahan, Braman, Slovic, Gastil & Cohen 2009).

2. Cultural cognition as expressive rationality. Recent scholarship on cultural cognition, however, seems to complicate if not in fact contradict this account!

By treating politically motivated reasoning—of which “cultural cognition” is one operationalization (Kahan in press b)—as in effect a “moderator” of other, more familiar cognitive biases, the “bounded rationality” conception implies that cultural cognition is a consequence of over-reliance on heuristic information processing (e.g., Lodge & Taber 2013; Sunstein 2006). If this understanding is correct, then we should expect cultural cognition to be mitigated by proficiency in the sorts of reasoning dispositions essential to conscious, effortful “System 2” information processing.

But in fact, a growing body of evidence suggests that System 2 reasoning dispositions magnify rather than reduce cultural cognition! Experiments show that individuals high in cognitive reflection and numeracy use their distinctive proficiencies to discern what the significance of crediting complex information is for positions associated with their cultural or political identities (Kahan 2013; Kahan, Peters, Dawson & Slovic 2013).

As a result, they more consistently credit information that is in fact identity-affirming and discount information that is identity-threatening. If this is how individuals reason outside of lab conditions, then we should expect to see that individuals highest in the capacities and dispositions necessary to make sense of quantitative information will be the most politically polarized on facts that have become invested with identity-defining significance. And we do see that—on climate change, nuclear power, gun control, and other issues (Kahan 2015a; Kahan, Peters et al. 2012).
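To make that logic concrete, here is a toy simulation (purely illustrative: the agents, the evidence-screening rule, and all parameters are invented, not drawn from any of the studies cited here) in which agents use whatever reasoning proficiency they have to work out whether a piece of evidence is identity-congruent before crediting it. The polarization gap between opposing identity groups widens as proficiency rises:

```python
import random

random.seed(0)

def perceived_risk(identity, proficiency, n_items=2000):
    """Toy agent: `identity` is +1 or -1; `proficiency` in [0, 1] is the
    agent's skill at discerning what crediting a piece of evidence would
    imply for its group's position. All parameters are invented."""
    total = 0.0
    for _ in range(n_items):
        evidence = random.choice([-1, 1])              # direction of the evidence
        discerned = random.random() < 0.5 + 0.5 * proficiency
        if discerned and evidence != identity:
            total += 0.2 * evidence                    # discount identity-threatening info
        else:
            total += evidence                          # otherwise credit it at face value
    return total / n_items

for p in (0.1, 0.5, 0.9):
    gap = perceived_risk(+1, p) - perceived_risk(-1, p)
    print(f"proficiency={p:.1f}  polarization gap={gap:.2f}")
```

The point of the sketch is only that nothing "irrational" at the individual level is needed to generate the observed pattern: the gap grows with proficiency because skill is being deployed in the service of identity-congruent belief formation, not despite it.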

This work supports an alternative “expressive” conception of cultural cognition. On this account, cultural cognition is not a consequence of “bounded rationality.” It is a way of engaging information that is rationally suited to forming the affective dispositions by which individuals reliably express their group allegiances (cf. Lessig 1995; Akerlof & Kranton 2000).

“Expressing group allegiances” is not just one thing ordinary people do with information on societally contested risks. It is pretty much the only thing they do. The personal “beliefs” ordinary people form on issues like climate change or gun control or nuclear power don’t otherwise have any impact on them. Ordinary individuals just don’t matter enough, as individuals, for anything they do based on their view of the facts on these issues to affect the level of risk they are exposed to or the policies that get adopted to abate it (Kahan 2013, in press a). In contrast, it is in fact critical to ordinary people’s well-being—psychic, emotional, and material—to evince attitudes that convey their commitment to their identity-defining groups in the myriad everyday settings in which they can be confident those around them will be assessing their character in this way (Kahan in press b).

* * * * *

At one point I thought the first conception of cultural cognition was right. Indeed, it didn’t even occur to me, early on, that the second conception existed!

But now I believe the second view is almost certainly right—and that no account that fails to recognize that cultural cognition is integral to individual rationality can possibly make sense of it, or successfully manage the influences that create the conflict between expressive rationality and collective rationality that gives rise to cultural polarization over policy-relevant facts.

If that’s right, then in fact the continued focus on the interaction of cultural cognition and critical reasoning proficiencies will remain essential.

So is it right? Maybe not; but the only way to figure that out, too, is to keep probing this interaction.


Akerlof, G. A., & Kranton, R. E. (2000). Economics and Identity. Quarterly Journal of Economics, 115(3), 715-753.

Kahan, D. M. (2012). Cultural Cognition as a Conception of the Cultural Theory of Risk. In R. Hillerbrand, P. Sandin, S. Roeser & M. Peterson (Eds.), Handbook of Risk Theory: Epistemology, Decision Theory, Ethics and Social Implications of Risk (pp. 725-760). Springer.

Kahan, D. M. (2013). Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making, 8, 407-424.

Kahan, D. M. (2015a). Climate-Science Communication and the Measurement Problem. Advances in Political Psychology, 36, 1-43.

Kahan, D. M. (2015b). What is the “science of science communication”? Journal of Science Communication, 14(3), 1-12.

Kahan, D. M. (in press a). The expressive rationality of inaccurate perceptions of fact. Behavioral & Brain Sciences.

Kahan, D. M. (in press b). The Politically Motivated Reasoning Paradigm. Emerging Trends in the Social & Behavioral Sciences.

Kahan, D. M., Braman, D., Cohen, G., Gastil, J., & Slovic, P. (2010). Who Fears the HPV Vaccine, Who Doesn’t, and Why? An Experimental Study of the Mechanisms of Cultural Cognition. Law and Human Behavior, 34, 501-516.

Kahan, D. M., Braman, D., Slovic, P., Gastil, J., & Cohen, G. (2009). Cultural Cognition of the Risks and Benefits of Nanotechnology. Nature Nanotechnology, 4, 87-91.

Kahan, D. M., Jenkins-Smith, H., & Braman, D. (2011). Cultural Cognition of Scientific Consensus. Journal of Risk Research, 14, 147-174.

Kahan, D. M., Peters, E., Dawson, E., & Slovic, P. (2013). Motivated Numeracy and Enlightened Self-Government. Cultural Cognition Project Working Paper No. 116.

Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2, 732-735.

Kahan, D. M., Slovic, P., Braman, D., & Gastil, J. (2006). Fear of Democracy: A Cultural Evaluation of Sunstein on Risk. Harvard Law Review, 119, 1071-1109.

Lessig, L. (1995). The Regulation of Social Meaning. University of Chicago Law Review, 62, 943-1045.

Lodge, M., & Taber, C. S. (2013). The Rationalizing Voter. Cambridge University Press.

Peters, E. M., Burraston, B., & Mertz, C. K. (2004). An Emotion-Based Model of Risk Perception and Stigma Susceptibility: Cognitive Appraisals of Emotion, Affective Reactivity, Worldviews, and Risk Perceptions in the Generation of Technological Stigma. Risk Analysis, 24, 1349-1367.

Slovic, P., & Peters, E. (1998). The importance of worldviews in risk perception. Risk Decision and Policy, 3, 165-170.

Sunstein, C. R. (2005). Laws of Fear: Beyond the Precautionary Principle. Cambridge University Press.

Sunstein, C. R. (2006). Misfearing: A reply. Harvard Law Review, 119(4), 1110-1125.

Weber, E. (2006). Experience-Based and Description-Based Perceptions of Long-Term Risk: Why Global Warming Does Not Scare Us (Yet). Climatic Change, 77, 103-120.


Reader Comments (10)

Dan, since you've ranged so widely and boldly, from constitutional law to challenging the likes of Kahneman and Tversky (rightly, and I applaud it) on the completeness of their taxonomies of error, maybe you'll consider how much of this failure to converge is due to the prevalence of very systematic and amazingly thorough disinformation, some of it "below the radar" -- i.e., escaping the notice of academia and the MSM (Mostly Sane Media). E.g., I wrote a piece which begins:

Thursday, July 3, 2014
Myths About Saul Alinsky (and Obama)
Lately, right wing sources have been circulating a fictitious set of 8 "levels of control" or "How to create a social state" that Saul Alinsky was supposed to have written, which led off with
1) Healthcare – Control healthcare and you control the people

This myth is so thoroughly digested and accepted that search engines will spit it out as the answer to "What are the 8 levels of control as outlined by Saul Alinsky?" (last time checked: 2014-08-18).
Yet it is very easily shown to be a total fiction.

Much of this serves a double purpose, namely to spread ideas conducive to certain political behavior; but also once they are widely believed and discussed among a knowing cognoscenti they seem to discredit the MSM, academia, liberal politicians, etc. -- i.e. people who never mention these very important facts, and would call them lies if anyone mentioned them (since they are lies).

I've called this a "Corollary of the Big Lie principle": the bigger the lie, the more it seems damning of the MSM, who are desperately working to cover such things up!!

Between smallish blogs, perhaps as lightly trafficked as yours (but less well informed), which become "club houses" for like-minded people, and anonymous email, there is great trafficking in lies so obviously false that Rush Limbaugh won't touch them (though he no doubt knows about them, and does more or less state the conclusions they point to -- i.e., "Obama isn't like one of us"); yet I'd wager easily 10-20% of the U.S. population is highly exposed to them and takes them seriously. The anonymous email channel is explored in
11/2011 "My Not-really-right-wing Mom and her adventures in Email-Land"

There are things that academia typically won't touch because they don't seem weighty enough. E.g., as a very serious amateur scholar of early 19c American history, I achieved some minor fame, purely among academics, by finding a niche of information helpful in getting insights even for such scholars, but which just wouldn't do for a journal article: scanning chapter-length excerpts from forgotten printed sources of the time, and adding a bit of commentary, in the form of an email newsletter in the 90s.

You seem to be the sort of person who will go places where journal articles don't necessarily follow.

One thing I think could help deal with all this is if anyone would track, week to week, just how much credulity a certain absurdity commands (Obama is a Muslim; safety=more guns, ...) and see whether there is a correlation with what's being circulated "below the radar". The 2nd and 3rd tier blogs are perfectly out in the open, and quite willing to spread easily debunkable lies. The thing is, it calls for polling from week to week on the same questions -- not, I think, the sort of thing pollsters are likely to do unless for real-time troubleshooting on behalf of some presidential campaign.

I've thought about tapping into Mechanical Turk myself for such a project but it will probably still be beyond my means.
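The analysis itself would be the cheap part once the two weekly series exist. A minimal sketch of the lagged-correlation idea (every number, series, and name below is invented purely for illustration; nothing comes from an actual poll or blog crawl):

```python
from statistics import mean

def corr(xs, ys):
    """Pearson correlation of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def lagged_corr(circulation, credulity, max_lag=4):
    """Correlate this week's circulation with credulity `lag` weeks later."""
    out = {}
    for lag in range(max_lag + 1):
        xs = circulation[:len(circulation) - lag] if lag else circulation
        out[lag] = corr(xs, credulity[lag:])
    return out

# hypothetical weekly series: share of sampled 2nd/3rd-tier blog posts pushing
# a claim, and share of weekly poll respondents crediting it
circulation = [5, 9, 14, 22, 30, 28, 25, 20, 16, 12, 10, 8]
credulity = [10, 10, 11, 13, 17, 22, 26, 27, 25, 22, 19, 17]

for lag, r in sorted(lagged_corr(circulation, credulity).items()):
    print(f"lag {lag} wk: r = {r:+.2f}")
```

If credulity really does follow "below the radar" circulation, the correlation should peak at a lag of a few weeks rather than at zero -- which is exactly the kind of signature weekly polling on fixed questions would let you look for.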

January 3, 2016 | Unregistered CommenterHal Morris

My prev. comment wasn't really "Enough already!". I'm just passionate and restless about this, and when I see someone poking at it in a creative way I want to explore the possibilities.

You've gone at it from so many different angles, so I have a long way to go to get a sense of all the work of Cultural Cognition and attempt to engage with it.

January 4, 2016 | Unregistered CommenterHal Morris

@HalMorris-- don't worry! It would be fine even if that is what you had meant.
Don't expect all the angles to add up. The way I see things today isn't the way I've always seen them (& I'll be pretty disappointed if after any significant period of time I still see things as I do now).

Your proposed project is a good one. But one thing that is complicated is figuring out what it *means* for someone to say they "believe" Obama is a Muslim or was born in Kenya etc. I might be wrong, but I think figuring out what they mean by that is *easier* than figuring out what they mean when they say they don't believe in climate change. Or do, for that matter.

A lot of what looks like "misinformation," I suspect, isn't that at all; it's the investment of propositions of fact with *social meanings* -- creating a world in which people have a reason to form "beliefs" of one sort or another in order to *accurately* convey who they are & whose side they are on in the competition for social status among opposing cultural groups.

That doesn't make the "communicative activity" in question any less evil... Indeed, if all that those who engage in this sort of activity were doing was confusing people on facts, they'd be harmless (people know very little about the 'facts' on all sorts of matters on which they appropriately align themselves w/ what's known by science).

January 4, 2016 | Registered CommenterDan Kahan

Effective propagandists have had a working knowledge of identity protective cognition maybe since prehistory.

Your overdetermined reasoning and call for aporetic and conciliatory reasoning from the bench are very good in some circumstances. But then, what did Rwandan Hutus think it meant if they assented to calling Tutsis "cockroaches", etc? What about the Hutu who made an exception of every particular Tutsi he knew? What it means may be somewhat fluid until a social movement passes a tipping point, at which point a major part of the "meaning" is the murders that the mob is carrying out.

What does it mean that you want to know the meaning? Can you operationalize that? I mean it as a serious question.

January 4, 2016 | Unregistered CommenterHal Morris

In another forum, an article starts with "What is above all needed is to let the meaning choose the word, and not the other way around. -George Orwell"

Much of it is about words/phrases that one cannot really use unless one believes a certain way; thus we are divided not just by belief, but linguistically.

January 4, 2016 | Unregistered CommenterHal Morris

I think of a joke about Israelis...

A North Korean, an Ethiopian, an Australian, an American, and an Israeli are having a discussion while attending an international conference, and a reporter comes up to them and asks:

Excuse me, but what's your opinion about the meat shortage in third world countries?

The North Korean responds with "What's an opinion?" The Ethiopian asks "What's meat?" The Australian asks "What's a meat shortage?" The American responds with "What's the third world?" And the Israeli responds with "What's 'excuse me'?"

I refer to that joke because I come from a culture where vociferous polarization was very much a common part of normal interaction (something that turned out to be an "interesting" foundational attribute when I moved to and worked in Yankee/WASP predominated New England), and which serves as a motivation to seek out evidence and information to reinforce polarized views.

In that culture, increased knowledge about a variety of topics was associated with a greater drive to develop "literacy" in scientific as well as other areas, with a greater tendency to feel comfortable with conflict, and with a tendency to feel comfortable seeking out evidence to reinforce identity formation in association with specific political, ideological, and cultural orientations.

In other words, in my experience, the association between greater knowledge/advanced education in a variety of areas, including science, and polarization arose not so much because a stronger knowledge base reinforced a sense of identity (and accordingly, identity polarization with those who had different views), but because a cultural attribute reinforced the tendency to use evidence-gathering as a way to further reinforce a sense of identity.

This is something that I've had trouble with pretty much whenever I read you write what seems to me to be a description of a mechanism whereby (you theorize?) greater scientific literacy causes greater polarization. What I have a problem with is that you seem to me to be describing not a cross-sectional, fixed-in-time, static association (which is supported by your data) but a longitudinal, causal mechanism (which isn't supported by your data). So then, I am left wondering about the causality, the direction of causality, and whether/what various factors might play a mediating and/or moderating role.

I also think of the highly educated Japanese that I've worked with who have always struck me as having surprisingly light political/ideological orientation, in comparison to my experience with highly educated Americans. Why is it, if my anecdotal observations would pan out empirically, that an association between education and ideological polarization is stronger among Americans than among Japanese?

I also think of how people who are knowledgeable about basketball may be likely to be people who play more basketball, or at least are more interested in basketball, and thus more highly polarized in their opinions about whether Wilt ranks higher on the all-time basketball player list than Jordan or Magic or Russell or Bird....

I also think about the evidence that "skeptics" have slightly more sophisticated knowledge about climate change than "realists." Of course, "skeptics" like to argue that the reason for that pattern is that the more someone knows about climate change, the more likely they are to be "skeptical" (while amusingly/conveniently ignoring the evidence that suggests that an association between ideological predisposition and views on climate change is a much stronger predictor of views on climate change than level of knowledge). But I have to wonder if the reason for that pattern isn't that people with an ideological orientation that predisposes them to be "skeptical" about climate change have more motivation to research evidence (and to actively filter that evidence so as to reinforce their biases) than "realists" who are inclined to accept the prevalent view among "experts".

So all of that is obviously mere anecdotal reasoning, but it is also part of the reason why, while I think that the association you describe between scientific literacy and polarization on climate change is interesting, I am dubious about how far exploring that pattern of association will go in explaining the causality behind motivated reasoning, and I am also quite convinced that the existence of that pattern of association is very small in magnitude relative to the general tendency of humans to filter evidence so as to reinforce their identity orientation (even if that tendency is probably governed, at least to some degree, by cultural norms).

January 5, 2016 | Unregistered CommenterJoshua

Dan -

Interesting, and somewhat related:

The deficit-model of science communication assumes increased communication about science issues will move public consensus toward scientific consensus. However, in the case of climate change, public polarization about the issue has increased in recent years, not diminished. In this study, we draw from theories of motivated reasoning, social identity, and persuasion to examine how science-based messages may increase public polarization on controversial science issues such as climate change. Exposing 240 adults to simulated news stories about possible climate change health impacts on different groups, we found the influence of identification with potential victims was contingent on participants' political partisanship. This partisanship increased the degree of political polarization on support for climate mitigation policies and resulted in a boomerang effect among Republican participants. Implications for understanding the role of motivated reasoning within the context of science communication are discussed.


In the experiment, research subjects from upstate New York read news articles about how climate change might increase the spread of West Nile Virus, which were accompanied by pictures of the faces of farmers who might be affected. But in one case, the people were said to be farmers in upstate New York (in other words, victims who were quite socially similar to the research subjects); in the other, they were described as farmers from either Georgia or from France (much more distant victims). The intent of the article was to raise concern about the health consequences of climate change, but when Republicans read the article about the more distant farmers, their support for action on climate change decreased, a pattern that was stronger as their Republican partisanship increased. (When Republicans read about the proximate, New York farmers, there was no boomerang effect, but they did not become more supportive of climate action either.)

January 10, 2016 | Unregistered CommenterJoshua

@Joshua-- huh. thanks. Embarrassed to say I'm not familiar w/ it, but that's compensated by happiness that you've pulled me free of the snapping jaws of entropy.
So what's your view -- does it satisfy the conditions for valid proof of politically motivated reasoning as set forth in "The Politically Motivated Reasoning Paradigm"?

January 10, 2016 | Registered CommenterDan Kahan

Dan -

My gut feeling is no (I'm dubious about results that find asymmetry), but I haven't really looked at it very closely. I'll try to find the time, as I'm curious about the answer to your question...

January 11, 2016 | Unregistered CommenterJoshua

And this is cute:

3. Obama is a Muslim! And if that's still not enough, yet another Nyhan and Reifler study examined the persistence of the "President Obama is a Muslim" myth. In this case, respondents watched a video of President Obama denying that he is a Muslim or even stating affirmatively, "I am a Christian." Once again, the correction—uttered in this case by the president himself—often backfired in the study, making belief in the falsehood that Obama is a Muslim worse among certain study participants. What's more, the backfire effect was particularly notable when the researchers administering the study were white. When they were nonwhite, subjects were more willing to change their minds, an effect the researchers explained by noting that "social desirability concerns may affect how respondents behave when asked about sensitive topics." In other words, in the company of someone from a different race than their own, people tend to shift their responses based upon what they think that person's worldview might be.

January 11, 2016 | Unregistered CommenterJoshua
