Is cultural cognition a bummer? Part 2
Wednesday, January 25, 2012 at 1:16AM
Dan Kahan

This is the second of two posts addressing “too pessimistic, so wrong”: the proposition that findings relating to cultural cognition should be resisted because they imply that it’s “futile” to reason with people.

In part one, I showed that “too pessimistic, so wrong”—in addition to being simultaneously fallacious and self-refuting (that’s actually pretty cool, if you think about it)—reflects a truncated familiarity with cultural cognition research. Studies of cultural cognition examine not only how it can interfere with open-minded consideration of scientific information but also what can be done to counteract this effect and generate open-minded evaluation of evidence that is critical of one’s existing beliefs.

Now I’ll identify another thing that “too pessimistic, so wrong” doesn't get: the contours of the contemporary normative and political debate over risk regulation and democracy.

2.  "Too pessimistic, so wrong" is innocent of the real debate about reason and risk regulation.

Those who make the “too pessimistic, so wrong” argument are partisans of reason (nothing wrong with that). But ironically, by “refusing to accept” cultural cognition, these commentators are actually throwing away one of the few psychologically realistic programs for harmonizing self-government with scientifically enlightened regulation of risk.

The dominant view of risk regulation in social psychology, behavioral economics, and legal scholarship asserts that members of the public are too irrational to figure out what dangers society faces and how effectively to abate them. They don't know enough science; they have to use emotional heuristic substitutes for technical reasoning. They are dumb, dumb, dumb.

Well, if that is right, democracy is sunk. We can't make the median citizen into a climate scientist or a nuclear physicist. So either we govern ourselves and die from our stupidity; or, as many influential commentators in the academy (one day) and government (the next) argue, we hand over power to super smart politically insulated experts to protect us from myriad dangers.

Cultural cognition is an alternative to this position. It suggests a different diagnosis of the science communication crisis, and also a feasible cure that makes enlightened self-government a psychologically realistic prospect.

Cultural cognition implies that political conflicts over policy-relevant science occur when the questions of fact to which that evidence speaks become infused with antagonistic cultural meanings.

This is a pathological state—both in the sense that it is inimical to societal well-being and in the sense that it is unusual, not the norm, rare.  

The problem, according to the cultural cognition diagnosis, is not that people lack reason. It is that the reasoning capacity that normally helps them to converge on the best available information at society’s disposal is being disabled by a distinctive pathology in science communication.

The number of scientific insights that make our lives better and that don’t culturally polarize us is orders of magnitude greater than the ones that do. There’s not a “culture war” over going to doctors when we are sick and following their advice to take antibiotics when they figure out we have infections. Individualists aren’t throttling egalitarians over whether it makes sense to pasteurize milk or whether high-voltage power lines are causing children to die of leukemia.

People (the vast majority of them) form the right beliefs on these and countless other issues, moreover, not because they “understand the science” involved but because they are enmeshed in networks of trust and authority that certify whom to believe about what.

For sure, people with different cultural identities don’t rely on the same certification networks. But in the vast run of cases, those distinct cultural certifiers do converge on the best available information. Cultural communities that didn’t possess mechanisms for enabling their members to recognize the best information—ones that consistently made them distrust those who do know something about how the world works and trust those who don’t—just wouldn’t last very long: their adherents would end up dead.

Rational democratic deliberation about policy-relevant science, then, doesn’t require that people become experts on risk. It requires only that our society take the steps necessary to protect its science communication environment from a distinctive pathology that prevents ordinary citizens from using their (ordinarily) reliable ability to discern what it is that experts know.

“Only” that? But how?

Well, that’s something cultural cognition addresses too — in the studies that “too pessimistic, so wrong” ignores and that I described in part one.

Don’t get me wrong: the program to devise strategies for protecting the science communication environment has a long way to go.

But we won’t take even one step toward perfecting the science of science communication if we resolve to “resist” evidence because we find its implications to be a bummer.

Reference: 

Kahan, D.M., Slovic, P., Braman, D. & Gastil, J. Fear of Democracy: A Cultural Critique of Sunstein on Risk. Harvard Law Review 119, 1071-1109 (2006).
