0. What is this “science of science communication”? The science of science communication (SSC) can be understood as a remedy for two fallacies.
The first is res ipsa loquitur (“the thing speaks for itself”): the validity of valid science is manifest, making scientific study of it neither interesting nor necessary.
The second is ab uno disce omnes (“from one, learn all”): the scientific knowledge necessary to enable a doctor to meaningfully advise a patient on a complicated treatment decision is the same as the knowledge necessary to enable a science journalist to edify a curious member of the public, an empirical researcher to advise a policymaker, an educator to teach a high school student the theory of evolution, etc.
My remarks are mainly directed at the ab uno fallacy. I want to describe the distinctive species of SSC that is most likely to evade comprehension if one makes the mistake of thinking it’s only one thing. It is also the one that is arguably most important for the well-being of democratic society.
The aim of this species of SSC is to protect the science communication environment.
1. The puzzle of cultural polarization over risk
Members of the public in the U.S. are highly divided on all manner of fact relating to climate change. So are members of the public in many other nations, including the UK.
There are other risks—from GM foods to nuclear power to gun ownership to vaccination against infection by HPV or other contagious diseases—that fracture the members of some of these societies but not others.
Not to be struck by the puzzling nature of this phenomenon is to admit a deficit in curiosity. It’s not surprising at all that people with different values would disagree about what to do about a societal risk like climate change or gun possession. But there’s nothing in how much one weights equality relative to wealth, or security relative to liberty, that determines whether the earth is heating up as a result of human activity or whether permitting citizens to carry concealed handguns in public deters violent assaults.
It’s not surprising either that ordinary members of the public would disagree with one another on facts the nature of which turns on evidence as technically complex as that surrounding climate change, nuclear power, or gun control.
But if complexity were the source of the problem, we’d expect disagreement to be randomly distributed with respect to cultural and political values, and to abate as individuals become progressively more comprehending of science.
Not so: on the contrary, the most science-comprehending members of the public are the most culturally polarized! (At least in the U.S.; I’m not aware of research of this sort with non-U.S. samples and would be grateful to anyone who fills in this gap in my knowledge, if it is one.)
What’s the explanation for such a peculiar distribution of beliefs—and on facts that not only admit of investigation by empirical means but that have in fact been investigated by expert empirical methods?
2. The cultural cognition thesis
The answer (or certainly a very large part of it) is cultural cognition.
Cultural cognition is a species of motivated reasoning, which refers to the tendency of people to conform their assessment of all manner of information (empirical data, logical arguments, brute sense impressions) to some goal or interest independent of forming a correct judgment.
The cultural cognition thesis holds that people can be expected to conform their perceptions of risk and like facts to the stake they have in maintaining their connection to and status within important affinity groups.
The nature of these commitments can be measured by various means, including right-left political outlooks, but in our research we ordinarily do so with scales patterned on the “worldview” dimensions associated with Mary Douglas’s “group-grid” framework.
3. Some evidence
Studies conducted by my collaborators and me have generated various forms of evidence in support of the cultural cognition thesis—and against rival theories that are often used to explain political conflict over societal risks.
a. Cultural cognition of scientific consensus. In one study, we performed an experiment that showed how cultural cognition influenced formation of public perceptions of what expert scientists believe. The results showed that how readily individuals of diverse cultural outlooks identified a scientist as an “expert” on climate change, nuclear power, or gun control depended on whether that scientist was depicted as espousing a position consistent with the one that prevails in the individuals’ cultural groups.
If individuals selectively credit and dismiss evidence of “expert” opinion in this fashion, they will become culturally polarized over what scientific consensus is in disputed issues. And, indeed, the study found that in all cases the vast majority of subjects perceived that “scientific consensus” on the relevant issue—climate change, nuclear power, and gun possession—was consistent with the one that prevailed in their cultural group.
The study findings were not only consistent with the cultural cognition thesis, but also inconsistent with two alternatives. One of these attributes political conflict over societal risks to one or another group’s hostility to science. In fact, no group subscribed to a position that it perceived to be contrary to prevailing scientific opinion.
The second alternative explanation sees one or another group as more attuned to scientific consensus than its rivals. But in fact, all groups were equally likely to view as the “consensus” among expert scientists the position contrary to the one endorsed as the “consensus” position by the U.S. National Academy of Sciences.
b. “Feeling” the heat—and the hurricanes, floods, tornados, etc. A common theme—indeed, the dominant one among commentators who derive their explanations from syntheses of general literature rather than from original empirical research—attributes popular conflict over climate change to the public’s overreliance on heuristic, “system 1” as opposed to more reflective, dispassionate “system 2” information processing.
Those who advance this thesis typically predict that individuals will begin to revise upward their perception of the seriousness of climate change risks as they experience climate-change impacts first hand. “Feeling” climate change, it is argued, will create the emotionally vivid impression that those who form their risk perceptions heuristically will require to start taking climate change seriously.
This prediction is also contrary to the evidence.
It’s true that individuals’ perceptions of climate-change risk correlate with their perception that temperatures in their area have been increasing in recent years. But their perceptions of recent local temperatures are not predicted by what those temperatures have actually been.
Rather, they are predicted by their cultural outlooks, suggesting that individuals selectively attend to or recall weather extremes in patterns that reflect their groups’ position on climate change.
Nor do individuals appear to uniformly revise their perception of climate-change risks as they experience significant extreme-weather hardships. A CCP study of residents of southeast Florida found that the number of times a person had been forced to evacuate his or her residence, had been deprived of access to drinking water, had suffered property damage, etc. as a result of extreme weather or flooding had a very modest positive impact on the perceived risk of climate change for egalitarian communitarians—the individuals most culturally predisposed to credit evidence of climate change—but none on hierarchical individualists—those most culturally predisposed to dismiss such evidence.
In other words, people don’t “believe” in climate change when they “see” it; they see it only when they already believe it.
Cultural cognition predicts this—although so does elementary logic, since individuals who experience such events can’t “see” or “feel” the cause of them. What they see extreme weather as evidence of (climate change, tolerance of gay marriage, nothing in particular, etc.) necessarily depends on their assent to some account of how the world works that they are not themselves in a position to verify. And that’s where cultural cognition comes in.
c. Motivated system 2 reasoning. The popular “thinking fast, thinking slow” account of climate-change controversy also implies that the members of the public most disposed to use reflective “system 2” reasoning can be expected to form perceptions of climate risk more in line with scientific consensus.
Again, the evidence does not bear this claim out. In fact, such individuals are the most polarized.
That’s what the cultural cognition thesis tells us to expect. Those who possess the skills and habits of mind necessary to critically evaluate complex arguments and data have more tools at their disposal to fit their assessments of evidence to the beliefs that are predominant in their identity-defining groups.
4. A polluted science communication environment
The spectacle of intense, persistent political conflict can easily distract us from the state of public opinion on the vast run of facts addressed by decision-relevant science. The number of risk issues that divide members of the public along cultural lines is infinitesimal in relation to the number that don’t but could. There’s no meaningful level of political contestation over the health risks of unpasteurized milk, medical x-rays, high-power transmission lines, fluoridated water, etc. On these issues, moreover, culturally diverse individuals do tend to converge on the best-available evidence as their capacity for science comprehension increases.
The reason that these issues do not provoke controversy, moreover, is not that individuals understand the scientific evidence on the relevant risks more completely than they understand the evidence on climate change or nuclear power or the HPV vaccine or gun control.
Individuals (including scientists) align themselves appropriately with a body of decision-relevant science much vaster than they could be expected to comprehend or verify for themselves. They achieve this feat by the exercise of a reliable faculty for recognizing insights that originate in the methods that science uses to discern the truth.
Their everyday interactions with others who share their cultural worldviews are the natural domain for the use of this faculty. Individuals spend most of their time with others who share their values; they can exchange information with them readily, without the friction that might attend interactions with individuals whose fundamental outlooks on life differ from their own; and they are better able to read those with whom they share defining commitments, and thus to distinguish those of their number who know what they are talking about from those who don’t.
All the various affinity groups within which individuals exercise their knowledge-recognition faculties are amply stocked with people high in science comprehension, and all are fully equipped with high-functioning processes for transmitting what’s become collectively known through science. So while admittedly (even regrettably) insular, the ordinary interaction of ordinary individuals with those who share their cultural worldviews generally succeeds in aligning individuals’ beliefs with the best available evidence relevant to the decisions they must make in their personal and collective lives.
This process breaks down only in the rare situation when positions on particular issues become entangled in antagonistic cultural meanings, effectively transforming them into badges of membership in and loyalty to one or another competing group. At that point, the stake ordinary individuals have in forming and persisting in beliefs consistent with others in their group will dominate the stake they have in forming beliefs that reflect what’s known to science: what any one of them personally believes—right or wrong—about climate change, nuclear power, and other societal risks won’t have any impact on the level of risk she or anyone else faces; forming a belief at odds with the one that predominates in her group, however, threatens to estrange her from those on whom her welfare—material and psychic—depends.
These antagonistic cultural meanings are a form of pollution in the science communication environment. They literally disable the normally reliable faculty that ordinary individuals use to discern what’s known by science.
Engaging information in a manner that reflects their individual interest in forming and persisting in group-convergent beliefs, diverse citizens are less likely to converge on the best available evidence relevant to the health and well-being of them all.
The factual presuppositions of policy choices having become symbols of opposing visions of the best life, debates over risk regulation become the occasion for illiberal forms of status competition between competing cultural groups.
This polluted science communication environment is toxic for liberal democracy.
5. The science of #scicomm environment protection
The entanglement of positions on societal risk in culturally antagonistic meanings is not a consequence of immutable natural laws or historical processes. Specific, identifiable events—ones originating in accident and misadventure as often as strategic behavior—steer putative risk sources down this toxic path.
By empirically investigating why a putative risk source (e.g., mad cow disease or GM foods) took this route in one nation but not another, or why two comparable risk sources (the HPV vaccine and the HBV vaccine) travelled different paths in a single nation (the U.S.), the science of science communication enables us to understand the influences that transform policy-relevant facts into divisive markers of group identity.
The same methods, moreover, can be used to control such influences. They can be used to forecast the likely development of such influences in time for actors in government and civil society alike to act to avoid their occurrence. They can also be used to formulate and test strategies for disentangling positions from antagonistic meanings where such preventive measures fail.
The vulnerability of risk regulation to cultural contestation is not a consequence of one or another group’s hostility to science, of citizens’ “bounded rationality,” or of some inherent drive or appetite on the part of competing groups to impose a sectarian orthodoxy on society.
It is the predictable but manageable outgrowth of the same conditions of political liberty and social pluralism that make liberal democracy distinctively congenial to the advance of scientific knowledge.
By using the hallmark methods of science to protect the science communication environment, we can assure our enjoyment of the unprecedented knowledge and freedom that are the hallmarks of liberal democracy.