This is the second part of a two-part series that recaps a talk I gave at a meeting of the National Academy of Sciences' really cool Public Interfaces of the Life Sciences Initiative.
The subject of the talk (slides here) was the public's understanding of what I called "decision relevant science" (DRS)--meaning science that's relevant to the decisions that ordinary members of the public make in the course of their everyday lives as consumers, as parents, as citizens, and the like.
Part 1 recounted a portion of the talk that I invited the audience to imagine came from a reality tv show called "Public comprehension of science--believe it or not!," a program, I said, dedicated to exploring oddities surrounding what the public knows about what's known to science. The concluding portion of the talk, which I'll reconstruct now, presented five serious points --or points that I at least intend to be serious and be taken seriously--about DRS, each of which in fact could be supported by one of the three "strange but true" stories featured in the just-concluded episode of "Public comprehension of science--believe it or not!"
I. Individuals must accept as known more DRS than they can ever possibly understand.
In the first story featured in the show, we learned that individuals belonging to that half of the US population that purports to "believe" in evolution are no more likely to be able to give a cogent account of the "modern synthesis" (natural selection, genetic variance, and random mutation) than those belonging to the half that asserts "disbelief." In fact, very small proportions of either group can give such an account.
Thus, most of the people who quite properly accept evolution as "scientific fact" (including, I'm confident, the vast majority who view those who disbelieve in it as pitifully ignorant) believe in something they don't understand.
That's actually not a problem, though. Indeed, it's a necessity!
The number of things known to science that it makes sense for a practical person to accept as true (that a GPS system, exquisitely calibrated in line with Einstein's theories of special and general relativity, will reliably guide him to where he wants to go, for example) far exceeds what such an individual could ever hope to comprehend in any meaningful way on his own. Life is too short.
Indeed, it will be a good deal shorter if, before accepting that it makes sense not to smoke, such a person insists on verifying for himself that smoking causes cancer--or, before taking antibiotics, that they do in fact kill disease-causing bacteria but do not--as 50% of the U.S. population thinks, "believe it or not!"--kill viruses.
II. Individuals acquire the insights of DRS by reliably recognizing who has it.
Yet it's okay, really, for a practical, intelligent person not to acquire the knowledge that antibiotics kill only bacteria and not viruses. He doesn't have to have an MD to get the benefits of what's known to medical science. He only has to know that if he gets sick, the person he should consult and whose advice he should follow is the doctor. She's the one who knows what science knows there.
That's how, in general, individuals get the benefit of DRS--not by understanding it themselves but by reliably recognizing who knows what about what because they know it in the way that science counts as knowing.
Why not go to a faith healer or a shaman when one has a sore throat--or a cancerous lesion or persistent hacking cough? Actually, some very tiny fraction of the population does. But that underscores only that there really are in fact people out there whose "knowledge" on matters of consequence to ordinary people's lives is not the kind science would recognize, and that precious few people (in a modern liberal market society) treat them as reliable sources of knowledge.
Ordinary people reliably make use of all manner of DRS -- medical science is only one of many kinds -- not because they are experts on all the matters to which DRS speaks but because they are themselves experts at discerning who knows what's known to science.
III. Public conflict over DRS is a recognition problem, not a comprehension problem.
Yet ordinary members of the public do disagree--often quite spectacularly--about certain elements of DRS. These conflicts are not a consequence of defects in public comprehension of science, however. They are a product of the failure of ordinary members of the public to converge in the exercise of their normal and normally reliable expert ability to recognize who knows what about what.
Believe it or not, one can work out this conclusion logically on the basis of information related in the "Public Comprehension of Science--Believe it or Not!" show.
Members of the public, we learned, are (1) divided on climate science and (2) don't understand it (indeed, the ones who "believe" in it, like the ones who believe in evolution, generally don't have a meaningful understanding of what they believe).
But (2) doesn't cause (1). If it did, we'd expect members of the public to be divided on zillions of additional forms of DRS on which they in fact are not. Like the efficacy of antibiotics, which half the population believes (mistakenly) kill viruses.
Or pasteurized milk. No genuine cultural conflict over that, at least in the US. And the reason isn't that people have a better grasp of biology than they do of climate science. Rather it's that there, as with the health benefits of antibiotics, they are reaching the same conclusion when they exercise their rational capacity to recognize who knows what science knows on this matter.
Indeed, those of you who are leaping out of your seats with excitement to point out the freaky outlier enclaves in which there is a dispute about pasteurization of milk in the US, save yourselves the effort! What makes the spectacle of such conflicts newsworthy is precisely that the advocates of the health benefits of "raw milk" are people whom the media knows the vast run of ordinary people (the news media consumers) will regard as fascinatingly weird.
Because people acquire the insights of DRS by reliably recognizing who knows what science knows, conflicts over DRS must be ones in which they disagree about what those who know what science knows know.
This conclusion has been empirically verified time and again.
On matters like the risks of climate change, the safety of nuclear waste disposal, the effects of gun control on crime, and the efficacy and side effects of the HPV vaccine, no one (or no one of consequence, if we are trying to understand public conflict rather than circus sideshows) is saying "screw the scientists--who cares what they think!"
Rather, everyone is arguing about what "expert scientists" really believe. Using their normal and normally reliable rational powers of recognition, those on both sides are concluding that the view that their side accepts is the one consistent with "scientific consensus."
What distinguishes the small number of issues on which we see cultural polarization over DRS from the vast number of ones on which we don't has nothing to do with how much science the public comprehends. Rather, it has everything to do with the peculiar tendency of the former to evade the common capacity enjoyed by culturally diverse citizens to recognize who knows what is known to science.
IV. The recognition problem reflects a polluted science communication environment.
A feature that these peculiar, recognition-defying issues share is their entanglement in antagonistic cultural meanings.
For the most part, ordinary people exercise their capacity to recognize who knows what about what by consulting other people "like them." They are better able to "read" people who share their particular outlooks on life; they enjoy interacting with them more than interacting with people who subscribe to significantly different understandings of the best way to live, and are less likely to get into squabbles with them as they exchange information. "Cultural communities"--networks of people connected by intense emotional ties and shared affinities--are the natural environment, then, for the exercise of ordinary citizens' rational recognition capacity.
Ordinarily, too, these communities, while plural and diverse, point their respective members in the same direction. Any such community that consistently misled its members about DRS wouldn't last long, given how critical DRS is to the flourishing--indeed, simple survival--of its members.
But every now and again, for reasons that are not a complete mystery but that are still far from adequately understood, some fact -- like whether the earth is heating up -- comes to be understood as a kind of marker of cultural identity.
The position one holds on a fact like that will then be experienced by people -- and seen by others (the two are related, of course) -- as a badge of membership in, and loyalty to, one or another cultural group.
At that point, reasonable people become unreasonably resistant to changing their minds--and for reasons that, in a sad and tragic sense, are perfectly rational.
The stake they have in maintaining group-convergent beliefs will usually be much bigger than any they might have in being "right." Making a "mistake" on the science of climate change, e.g., doesn't affect the risk that any ordinary member of the public, or anyone or anything she cares about, faces: she just doesn't matter enough as a consumer, a voter, a public deliberator, etc., to make a difference. But if she forms a view on it that is out of line from the point of view of those who share her cultural allegiances, then she is likely to suffer tremendous costs--psychic, emotional, and material--given the function that positions on climate change perform in identifying to members of such groups who belongs and can be trusted.
These antagonistic meanings, then, can be viewed as a form of pollution in the science communication environment. They enfeeble the normally reliable faculties of recognition that ordinary members of the public use to discern DRS.
People overwhelmingly accept that doctors and public health officials are the authorities to consult for access to the health benefits of what's known to science, and ordinarily have little difficulty in discerning what those experts believe and are counseling them to do. But when facts relating to medical treatments become suffused with culturally antagonistic meanings, ordinary members of the public are not able to figure out what such experts actually know.
The US public isn't divided over the risks and benefits of mandatory vaccination of children for Hepatitis B, a sexually transmitted disease that causes a deadly form of cancer. Consistent with the recommendation of the CDC and pediatricians, well over 90% of children get the HBV vaccination every year.
Americans are culturally divided, however, over whether children should get the HPV vaccine, which likewise confers immunity to a sexually transmitted virus (the human papillomavirus) that causes a deadly form of cancer. For reasons having to do with the ill-advised process by which it was introduced into the US, the HPV vaccine became suffused with antagonistic cultural meanings--ones relating to gender norms, sexuality, religion, and parental sovereignty.
Parents who want to follow the advice of public health experts can't discern what their position is on the HPV vaccine, even though it is exactly the same as it is on the HBV vaccine. Experimental studies have confirmed that exposure to the antagonistic meanings surrounding the former makes them unable to form confident judgments about what experts believe about the risks and benefits of the HPV vaccine, even though the CDC and pediatricians support it to the same extent as they do the HBV vaccine and for the same reasons.
The antagonistic cultural meanings that suffuse issues like climate change and the HPV vaccine confront ordinary people with an extraordinary conflict between knowing what's known to science and being who they are. This toxic environment poses a singular threat to their capacity to make use of DRS to live happy and healthy lives.
V. Protecting the science communication environment from contamination is a critical aim of the science of science communication.
Repelling that threat demands the development of a systematic societal capacity to protect the science communication environment from the pollution of antagonistic cultural meanings.
Technologies for abating the dangers human beings face are not born with antagonistic cultural meanings. They acquire them through historical contingencies of myriad forms. Strategic behavior plays a role; but sheer accident and misadventure also contribute.
Understanding the dynamics that govern this pathology is a central aim of the science of science communication. We can learn how to anticipate and avoid them in connection with emerging forms of practical science, such as nanotechnology and synthetic biology. And we can perfect techniques for removing antagonistic meanings in the remaining instances in which intelligent, self-conscious protective action fails to prevent their release into the science communication environment.
The capacity to reliably recognize what is collectively known is not a substitute for the attainment of scientific knowledge. It is in fact a condition of it, both within the practice of science and outside of it.
In discerning DRS, the public is in fact exercising the most elemental form of human rationality.
Securing the political and social conditions in which that faculty can reliably function is the most important aim of the science of science communication.