Friday, January 20, 2012

Is cultural cognition a bummer? Part 1

Now & again I encounter the claim (often in lecture Q&A, but sometimes in print) that cultural cognition is wrong because it is too pessimistic. Basically, the argument goes like this:

Cultural cognition holds that individuals fit their risk perceptions to their group identities. That implies it is impossible to persuade anybody to change their minds on climate change and other issues—that even trying to reason with people is futile. I refuse to accept such a bleak picture. Instead, I think the real problem is [fill in the blank—usually things like “science illiteracy,” “failure of scientists to admit uncertainty,” “bad science journalism,” “special interests distorting the truth”].

What’s wrong here?

Well, to start, there’s the self-imploding logical fallacy. It is a non sequitur to argue that because one doesn’t like the consequences of some empirical finding it must be wrong. And if what someone doesn’t like—and therefore insists “can’t be right”— is empirical research demonstrating the impact of a species of motivated reasoning, that just helps to prove the truth of exactly what such a person is denying.

Less amusingly and more disappointingly, the “too pessimistic, must be wrong” fallacy suggests that the person responding this way is missing the bigger picture. In fact, he or she is missing two bigger pictures:

  • First, the “too pessimistic, so wrong” fallacy is looking only at half the empirical evidence: studies of cultural cognition show not only which communication strategies fail and why but also which ones avoid the identified mistake and thus work better.
     
  • Second, the “too pessimistic, so wrong” fallacy doesn’t recognize where cultural cognition fits into a larger debate about risk, rationality, and self-government. In fact, cultural cognition is an alternative—arguably the only psychologically realistic one—to an influential theory of risk perception that explicitly does assert the impossibility of reasoned democratic deliberation about the dangers we face and how to mitigate them.

I’m going to develop these points over the course of two posts.

  1. Cultural cognition theory doesn’t deny the possibility of reasoned engagement with evidence; it identifies how to remove a major impediment to it.

People have a stake in protecting the social status of their cultural groups and their own standing in them. As a result, they defensively resist—close their minds to consideration of—evidence of risk that is presented in a way that threatens their groups’ defining commitments.

But this process can be reversed. When information is presented in a way that affirms rather than threatens their group identities, people will engage open-mindedly with evidence that challenges their existing beliefs on issues associated with their cultural groups.

Not only have I and other cultural cognition researchers made this point (over & over; every time, in fact, that we turn to the normative implications of our work), we’ve also presented empirical evidence to back it up.

Consider:

Identity-affirmative & narrative framing. The basic idea here is that if you want someone to consider the evidence that there's a problem, show the person that there are solutions that resonate with his or her cultural values.

E.g., individualists value markets, commerce, and private orderings. They are thus motivated to resist information about climate change because they perceive (unconsciously) that such information, if credited, will warrant restrictions on commerce and industry.

But individualists love technology. For example, they are among the tiny fraction of the US population that knows what nanotechnology is, and when they learn about it they instantly think its benefits are high & its risks low. (When egalitarian communitarians—who readily credit climate change science—learn about nanotechnology, in contrast, they instantly think its risks outweigh its benefits; they adopt the same posture toward it that they adopt toward nuclear power. An aside, but only someone looking at half the picture could conclude that any position on climate change correlates with being either “pro-” or “anti-science” generally.)

So one way to make individualists react more open-mindedly to climate change science is to make it clear to them that more technology—and not just restrictions on it—is among the potential responses to climate change risks. In one study, e.g., we found that individualists are more likely to credit information of the sort that appeared in the first IPCC report when they are told that greater use of nuclear power is one way to reduce reliance on greenhouse-gas-emitting carbon fuel sources.

More recently, in a study we conducted on both US & UK samples, we found that making people aware of geoengineering as a possible solution to climate change reduced cultural polarization over the validity of scientific evidence on the consequences of climate change. The individuals whose values disposed them to dismiss a study showing that CO2 emissions dissipate much more slowly than previously thought became more willing to credit it when they had been given information about geoengineering & not just emission controls as a solution.

These are identity-affirmation framing experiments. But the idea of narrative is at work here too. Michael Jones has done research on the use of "narrative framing"—basically, embedding information in culturally congenial narratives—as a way to ease culturally motivated defensive resistance to climate change science. Great stuff.

Well, one compelling individualist narrative features the use of human ingenuity to help offset environmental limits on growth, wealth production, markets & the like. Only dumb species crash when they hit the top of Malthus's curve; smart humans, history shows, shift the curve.

That's the cultural meaning of both nuclear power and geoengineering. The contribution they might make to mitigating climate change risks makes it possible to embed evidence that climate change is happening and is dangerous in a story that affirms rather than threatens individualists’ values. Hey—if you really want to get them to perk their ears up, how about some really cool nanotechnology-based geoengineering?

Identity vouching. If you want to get people to give open-minded consideration to evidence that threatens their values, it also helps to find a communicator who they recognize shares their outlook on life.

For evidence, consider a study we did on HPV-vaccine risk perceptions. In it we found that individuals with competing values have opposing cultural predispositions on this issue. When such people are shown scientific information on HPV-vaccine risks and benefits, moreover, they tend to become even more polarized as a result of their biased assessments of it.

But we also found that when the information is attributed to debating experts, the position people take depends heavily on the fit between their own values and the ones they perceive the experts to have.

This dynamic can aggravate polarization when people are bombarded with images that reinforce the view that the position they are predisposed to accept is espoused by experts who share their identities and denied by ones who hold opposing ones (consider climate change).

But it can also mitigate polarization: when individuals see evidence they are predisposed to reject being presented by someone whose values they perceive they share, they listen attentively to that evidence and are more likely to form views that are in accord with it.

Look: people aren’t stupid. They know they can’t resolve difficult empirical issues (on climate change, on HPV-vaccine risks, on nuclear power, on gun control, etc.) on their own, so they do the smart thing: they seek out the views of experts whom they trust to help them figure out what the evidence is. But the experts they are most likely to trust, not surprisingly, are the ones who share their values.

What makes me feel bleak about the prospects of reason isn’t anything we find in our studies; it is how often risk communicators fail to recruit culturally diverse messengers when they are trying to communicate sound science.

I refuse to accept that they can’t do better!

Part 2 here.

References:

Jones, M. D., & McBeth, M. K. (2010). A Narrative Policy Framework: Clear Enough to Be Wrong? Policy Studies Journal, 38, 329-353.

Kahan, D. M. (2010). Fixing the Communications Failure. Nature, 463, 296-297.

Kahan, D. M., Braman, D., Cohen, G. L., Gastil, J., & Slovic, P. (2010). Who Fears the HPV Vaccine, Who Doesn't, and Why? An Experimental Study of the Mechanisms of Cultural Cognition. Law & Human Behavior, 34, 501-516.

Kahan, D. M., Braman, D., Slovic, P., Gastil, J., & Cohen, G. (2009). Cultural Cognition of the Risks and Benefits of Nanotechnology. Nature Nanotechnology, 4, 87-91.

Kahan, D. M., Slovic, P., Braman, D., & Gastil, J. (2006). Fear of Democracy: A Cultural Critique of Sunstein on Risk. Harvard Law Review, 119, 1071-1109.

Kahan, D. M. (2012). Cultural Cognition as a Conception of the Cultural Theory of Risk. In R. Hillerbrand, P. Sandin, S. Roeser, & M. Peterson (Eds.), Handbook of Risk Theory: Epistemology, Decision Theory, Ethics and Social Implications of Risk. Springer London.

Kahan, D. M., Jenkins-Smith, H., Tarantola, T., Silva, C., & Braman, D. (2012). Geoengineering and the Science Communication Environment: A Cross-Cultural Study. CCP Working Paper No. 92, Jan. 9, 2012.

Sherman, D. K., & Cohen, G. L. (2002). Accepting Threatening Information: Self-Affirmation and the Reduction of Defensive Biases. Current Directions in Psychological Science, 11, 119-123.

Sherman, D. K., & Cohen, G. L. (2006). The Psychology of Self-Defense: Self-Affirmation Theory. In M. P. Zanna (Ed.), Advances in Experimental Social Psychology (Vol. 38, pp. 183-242).

 


Reader Comments (1)

Great findings, Dr. Kahan! It's been a pleasure reading this.

July 22, 2013 | Unregistered CommenterSteven Granger