Monday, September 9, 2013

The quality of the science communication environment and the vitality of reason

The Motivated Numeracy and Enlightened Self-Government working paper has apparently landed in the middle of an odd, ill-formed debate over the "knowledge deficit theory" and its relevance to climate-science communication. I'm not sure, actually, what that debate is about or who is involved. But I do know that any discussion framed around the question "Is the knowledge-deficit theory valid?" is too simple to generate insight. There are indeed serious, formidable contending accounts of the nature of the "science communication problem"--the failure of citizens to converge on the best available evidence on the dangers they face and the efficacy of measures to abate them. The antagonists in any "knowledge-deficit debate" will at best be stick-figure representations of these positions.

Below is an excerpt from the concluding sections of the MNESG paper. It reflects how I see the study findings as contributing to the position I find most compelling in the scholarly discussion most meaningfully engaged with the science communication problem. The excerpt can't by itself supply a full account of the contending positions and the evidence on which they rest (none is wholly without support). But for those motivated to engage the genuine and genuinely difficult questions involved, it might help identify paths of investigation that lead to locations much more edifying than the ones in which "whether the knowledge-deficit theory is valid" is thought to be a matter worthy of discussion.

5.2. Ideologically motivated cognition and dual process reasoning generally

The ICT hypothesis corroborated by the experiment in this paper conceptualizes Numeracy as a disposition to engage in deliberate, effortful System 2 reasoning as applied to quantitative information. The results of the experiment thus deepen insight into how ideologically motivated reasoning interacts with System 2 information processing generally.

As suggested, dual process reasoning theories typically posit two forms of information processing: a “fast, associative” one “based on low-effort heuristics”, and a “slow, rule based” one that relies on “high-effort systematic reasoning” (Chaiken & Trope 1999, p. ix). Some researchers have assumed (not unreasonably) that ideologically motivated cognition—the tendency selectively to credit or discredit information in patterns that gratify one’s political or cultural predispositions—reflects over-reliance on the heuristic-driven, System 1 style of information processing (e.g., Lodge & Taber 2013; Marx et al. 2007; Westen, Blagov, Harenski, Kilts & Hamann 2006; Weber & Stern 2011; Sunstein 2006).

There is mounting evidence that this assumption is incorrect. It includes observational studies that demonstrate that science literacy, numeracy, and education (Kahan, Peters, Wittlin, Slovic, Ouellette, Braman & Mandel 2012; Hamilton 2012; Hamilton 2011)—all of which it is plausible to see as elements or outgrowths of the critical reasoning capacities associated with System 2 information processing—are associated with more, not less, political division of the kind one would expect if individuals were engaged in motivated reasoning.

Experimental evidence points in the same direction. Individuals who score higher on the Cognitive Reflection Test, for example, have shown an even stronger tendency than ones who score lower to credit evidence selectively in patterns that affirm their political outlooks (Kahan 2013). The evidence being assessed in that study was nonquantitative but involved a degree of complexity that was likely to obscure its ideological implications from subjects inclined to engage the information in a casual or heuristic fashion. The greater polarization of subjects who scored highest on the CRT was consistent with the inference that individuals more disposed to engage systematically with information would be more likely to discern the political significance of it and would use their critical reasoning capacities selectively to affirm or reject it conditional on its congeniality to their political outlooks.
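The logic of that test is worth making concrete. What separates the motivated-reasoning account from a simple "bounded rationality" account is an interaction effect: the ideological gap in beliefs should widen, not shrink, as reasoning proficiency increases. Here is a minimal sketch of that test in Python; the data are simulated and the effect sizes are purely hypothetical assumptions, so the point is the form of the inference, not any actual result.

```python
# Minimal sketch of the interaction test (simulated data, hypothetical
# effect sizes): does the ideology gap in belief widen with numeracy?
import numpy as np

rng = np.random.default_rng(0)
n = 5000
conserv = rng.choice([-0.5, 0.5], size=n)   # political outlook, centered
numeracy = rng.standard_normal(n)           # reasoning proficiency, z-scored

# Hypothetical data-generating process in which polarization grows with
# numeracy (the 0.25 interaction term is the quantity of interest).
belief = 0.4 * conserv + 0.25 * conserv * numeracy + rng.standard_normal(n)

# OLS with an interaction term: belief ~ conserv + numeracy + conserv:numeracy
X = np.column_stack([np.ones(n), conserv, numeracy, conserv * numeracy])
coef, *_ = np.linalg.lstsq(X, belief, rcond=None)
print(f"interaction coefficient: {coef[3]:.2f}")  # ~0.25: gap widens with numeracy
```

A positive interaction coefficient is the pattern the ICT hypothesis predicts; the Science Comprehension Thesis predicts the opposite sign, with division shrinking as proficiency rises.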

The experimental results we report in this paper display the same interaction between motivated cognition and System 2 information processing. Numeracy predicts how likely individuals are to resort to more systematic as opposed to heuristic engagement with quantitative information essential to valid causal inference. The results in the gun-ban conditions suggest that high Numeracy subjects made use of this System 2 reasoning capacity selectively, in a pattern consistent with their motivation to form a politically congenial interpretation of the results of the gun-ban experiment. This outcome is consistent with the view of scholars who see both systematic (System 2) and heuristic (System 1) reasoning as vulnerable to motivated cognition (Cohen 2003; Giner-Sorolla & Chaiken 1997; Chen, Duckworth & Chaiken 1999).
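The covariance-detection task at the heart of the experiment makes the System 1/System 2 contrast easy to state in code. Getting the right answer requires comparing outcome proportions within each row of a 2x2 table; the tempting heuristic is to compare raw cell counts. A minimal sketch follows, using cell counts that match the marginal totals discussed in the comment thread below (298 treated, 128 untreated, in one version of the skin-cream problem):

```python
# The covariance-detection task: did the treatment work?
# System 2 route: compare conditional proportions across rows.
# System 1 shortcut: compare raw cell counts (the biggest number wins).

def improvement_rate(better: int, worse: int) -> float:
    """Proportion of a group whose condition improved."""
    return better / (better + worse)

# One version of the skin-cream table (treated n = 298, untreated n = 128).
treated_better, treated_worse = 223, 75
untreated_better, untreated_worse = 107, 21

treated = improvement_rate(treated_better, treated_worse)        # ~0.75
untreated = improvement_rate(untreated_better, untreated_worse)  # ~0.84

# Correct inference: the untreated group improved at a higher rate, so the
# cream is associated with the rash getting worse, even though 223 is by far
# the largest cell, which is exactly what the heuristic latches onto.
print(f"treated: {treated:.2f}, untreated: {untreated:.2f}")
print("cream increases rash" if treated < untreated else "cream decreases rash")
```

In the gun-ban conditions the same table is relabeled as crime data for cities that did or did not ban concealed handguns, which is what allows ideological motivation to interact with the computation.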

These findings also bear on whether ideologically motivated cognition is usefully described as a manifestation of “bounded rationality.” Cognitive biases associated with System 1 reasoning are typically characterized that way on the ground that they result from over-reliance on heuristic patterns of information processing that reflect generally adaptive but still demonstrably inferior substitutes for the more effortful and more reliable type of information processing associated with System 2 reasoning (e.g., Kahneman 2003; Jolls, Sunstein & Thaler 1998).

We submit that a form of information processing cannot reliably be identified as “irrational,” “subrational,” “boundedly rational” or the like independent of what an individual’s aims are in making use of information. It is perfectly rational, from an individual-welfare perspective, for individuals to engage decision-relevant science in a manner that promotes culturally or politically congenial beliefs. Making a mistake about the best available evidence on an issue like climate change, nuclear waste disposal, or gun control will not increase the risk an ordinary member of the public faces, while forming a belief at odds with the one that predominates within important affinity groups of which he or she is a member could expose him or her to an array of highly unpleasant consequences (Kahan 2012). Forms of information processing that reliably promote the stake individuals have in conveying their commitment to identity-defining groups can thus be viewed as manifesting what Anderson (1993) and others (Cohen 2003; Akerlof and Kranton 2000; Hillman 2010; Lessig 1995) have described as expressive rationality.
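The individual-welfare calculus can be put as a back-of-the-envelope payoff comparison. Every quantity in the sketch below is an illustrative assumption (no study estimates these numbers); the point is only the asymmetry of the stakes facing an ordinary citizen.

```python
# Expressive rationality as a payoff comparison.  All numbers here are
# illustrative assumptions, not estimates from any study.

p_belief_changes_policy = 1e-9   # chance one citizen's belief alters the outcome
cost_of_mistaken_policy = 1e6    # collective stakes of getting the science wrong
cost_of_group_dissent = 100.0    # personal cost of a belief at odds with one's group

expected_gain_from_accuracy = p_belief_changes_policy * cost_of_mistaken_policy
expected_gain_from_conformity = cost_of_group_dissent

# The identity-protective belief dominates at the individual level, even
# though it is collectively costly if everyone reasons this way.
print(f"expected gain from accuracy:   {expected_gain_from_accuracy:.4f}")
print(f"expected gain from conformity: {expected_gain_from_conformity:.1f}")
```

This collective-action structure, in which individually rational belief formation produces a collectively irrational outcome, is what the Olson (1965) citation in the conclusion below points at.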

If ideologically motivated reasoning is expressively rational, then we should expect those individuals who display the highest reasoning capacities to be the ones most powerfully impelled to engage in it (Kahan et al. 2012). This study now joins the ranks of a growing list of others that fit this expectation and thus support the interpretation that ideologically motivated reasoning is not a form of bounded rationality but instead a sign of how it becomes rational for otherwise intelligent people to use their critical faculties when they find themselves in the unenviable situation of having to choose between crediting the best available evidence and simply being who they are.

6. Conclusion: Protecting the “science-communication environment”

To conclude that ideologically motivated reasoning is expressively rational obviously does not imply that it is socially or morally desirable (Lessig 1995). Indeed, the implicit conflation of individual rationality and collective wellbeing has long been recognized to be a recipe for confusion, one that not only distorts inquiry into the mechanisms of individual decisionmaking but also impedes the identification of social institutions that remove any conflict between those mechanisms and attainment of the public good (Olson 1965). Accounts that misunderstand the expressive rationality of ideologically motivated cognition are unlikely to generate reliable insights into strategies for counteracting the particular threat that persistent political conflict over decision-relevant science poses to enlightened democratic policymaking.

Commentators who subscribe to what we have called the Science Comprehension Thesis typically propose one of two courses of action. The first is to strengthen science education and the teaching of critical reasoning skills, in order better to equip the public for the cognitive demands of democratic citizenship in a society where technological risk is becoming an increasingly important focus of public policymaking (Miller & Pardo 2000). The second is to dramatically shrink the scope of the public’s role in government by transferring responsibility for risk regulation and other forms of science-informed policymaking to politically insulated expert regulators (Breyer 1993). This is the program advocated by commentators who believe that the public’s overreliance on heuristic-driven forms of reasoning is too elemental to human psychology to be corrected by any form of education (Sunstein 2005).

Because it rejects the empirical premise of the Science Comprehension Thesis, the Identity-protective Cognition Thesis takes issue with both of these prescriptions. The reason that citizens remain divided over risks in the face of compelling and widely accessible scientific evidence, this account suggests, is not that they are insufficiently rational; it is that they are too rational in extracting from information on these issues the evidence that matters most for them in their everyday lives. In an environment in which positions on particular policy-relevant facts become widely understood as symbols of individuals’ membership in and loyalty to opposing cultural groups, it will promote people’s individual interests to attend to evidence about those facts in a manner that reliably conforms their beliefs to the ones that predominate in the groups they are members of. Indeed, the tendency to process information in this fashion will be strongest among individuals who display the reasoning capacities most strongly associated with science comprehension.

Thus, improving public understanding of science and propagating critical reasoning skills—while immensely important, both intrinsically and practically (Dewey 1910)—cannot be expected to dissipate persistent public conflict over decision-relevant science. Only removing the source of the motivation to process scientific evidence in an identity-protective fashion can. The conditions that generate symbolic associations between positions on risk and like facts, on the one hand, and cultural identities, on the other, must be neutralized in order to assure that citizens make use of their capacity for science comprehension.[1]

In a deliberative environment protected from the entanglement of cultural meanings and policy-relevant facts, moreover, there is little reason to assume that ordinary citizens will be unable to make an intelligent contribution to public policymaking. The amount of decision-relevant science that individuals reliably make use of in their everyday lives far exceeds what any of them (even scientists, particularly when acting outside of the domain of their particular specialty) are capable of understanding on an expert level. They are able to accomplish this feat because they are experts at something else: identifying who knows what about what (Keil 2010), a form of rational processing of information that features consulting others whose basic outlooks individuals share and whose knowledge and insights they can therefore reliably gauge (Kahan, Braman, Cohen, Gastil & Slovic 2010).

These normal and normally reliable processes of knowledge transmission break down when risk or like facts are transformed (whether through strategic calculation or misadventure and accident) into divisive symbols of cultural identity. The solution to this problem is not—or certainly not necessarily!—to divest citizens of the power to contribute to the formation of public policy. It is to adopt measures that effectively shield decision-relevant science from the influences that generate this reason-disabling state (Kahan et al. 2006).

Just as individual well-being depends on the quality of the natural environment, so the collective welfare of democracy depends on the quality of a science communication environment hospitable to the exercise of the ordinarily reliable reasoning faculties that ordinary citizens use to discern what is collectively known. Identifying strategies for protecting the science communication environment from antagonistic cultural meanings—and for decontaminating it when such protective measures fail—is the most critical contribution that decision science can make to the practice of democratic government.


[1] We would add, however, that we do not believe that the results of this or any other study we know of rule out the existence of cognitive dispositions that do effectively mitigate the tendency to display ideologically motivated reasoning. Research on the existence of such dispositions is ongoing and important (Baron 1995; Lavine, Johnston & Steenbergen 2012). Existing research, however, suggests that the incidence of any such disposition in the general population is small and is distinct from the forms of critical reasoning disposition—ones associated with constructs such as science literacy, cognitive reflection, and numeracy—that are otherwise indispensable to science comprehension. In addition, we submit that the best current understanding in the science of science communication indicates that the low incidence of this capacity, if it exists, is not the source of persistent conflict over decision-relevant science. Individuals endowed with perfectly ordinary capacities for comprehending science can be expected reliably to use them to identify the best available scientific evidence so long as risks and like policy-relevant facts are shielded from antagonistic cultural meanings.


Reader Comments (11)

Dan - it's time for one of my off-topic/random thoughts comments:

First, I thought this to be an interesting context for examining the influence of motivated reasoning:

http://www.nbcnews.com/health/new-salvo-mammogram-wars-says-young-women-should-be-screened-8C11098530

It is fascinating to me how scientists can be diametrically opposed, on multiple levels, on how to interpret the data on mammograms.

Second, I was thinking this morning about how I would feel about identifying as a climate change "skeptic." What I thought about is how I wouldn't be particularly concerned about the reactions of my friends or associates. They might think I'm kind of crazy on the issue of climate change, but they wouldn't disassociate from me, and I highly doubt that their opinions about me generally would be significantly altered: I wouldn't be any more (or less) of a kook than I am already in their eyes.

Instead, I think what would be more salient would be the difficulty I'd have in reconciling the views of my friends and family - as I would have a hard time accepting the opinion that they were foolish, or worse, willfully promoting fraudulent science to achieve a statist fantasy.

I point this out because I often read speculation related to "motivated reasoning" that suggests a causal attribution stemming from a fear of rejection from one's group. I think that just as often, if not more often, the causation is likely to run in the other direction.

September 9, 2013 | Unregistered CommenterJoshua

The first is to strengthen science education and the teaching of critical reasoning skills, in order better to equip the public for the cognitive demands of democratic citizenship in a society where technological risk is becoming an increasingly important focus of public policymaking (Miller & Pardo 2000).

OMG. This reminds me! When someone fails to converge with my opinions on climate change, I like to ask him/her some fairly easy questions just to get a rough idea of his/her scientific literacy. In every case—without exception—it turns out that he or she has forgotten (or never been taught in the first place) the principle that in science, consensus is not a form of evidence. (I believe other skeptics have observed the same correlation too.) It's appalling how many adults don't remember such an important rule! My working hypothesis is that ignorance of that particular law of scientific reasoning is a major risk factor—or actually, a necessary precondition—for being taken in by climate misinformers and propagandists. If only our schools did a better job of drumming this into our kids, we wouldn't be seeing such tragic rates of climate-alarmism among adults.

September 9, 2013 | Unregistered CommenterBrad Keyes

Looking at the skin cream experiment, I'm curious about your claim that the correct answers are not the same. Frankly, if you showed me those two tables I would have to answer that the cream causes the rash to increase for both tables (A, B).

The information available is not only the numbers in the tables, but also other information such as the starting groups are of equal size. That is implicitly stated when telling the subjects "Because patients do not always complete studies, the total number of patients in each of the two groups is not exactly the same...".

The subjects are also told one group of patients did not receive the cream. No mention is made of fooling them with a placebo in hopes they might come back even if cured. Just no treatment at all. No incentive to go back if it cleared up by itself.

Of those who returned in two weeks there are 298 who received the cream and only 128 who did not. A discrepancy of 170 patients. Before answering whether the cream works I am going to have to consider why so many of the untreated didn't return. In table (A) where 21 improved and 107 got worse I would expect the majority of the other 170 probably got better with no treatment and simply didn't go back. Why go back if you no longer need treatment? This would seriously increase the percentage who got better with no cream, making it a higher percentage than those who received the cream. So my answer would be cream increases rash. Similar thinking would apply to table (B) and would affect the percentage, but would still result in cream increases rash.
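To put rough numbers on that intuition (the recovery fractions below are pure assumptions, which is exactly the problem):

```python
# Sensitivity check: suppose the groups started out roughly equal, so about
# 170 untreated patients never returned.  The untreated improvement rate then
# turns entirely on an unobservable: how many of those 170 got better.
better, worse, missing = 21, 107, 170   # table (A) returners, plus dropouts

print(f"returners only: {better / (better + worse):.2f}")   # 0.16
for recovered in (0.0, 0.5, 1.0):                            # assumed values
    rate = (better + recovered * missing) / (better + worse + missing)
    print(f"assuming {recovered:.0%} of dropouts recovered: {rate:.2f}")
# Output spans roughly 0.07 to 0.64, wide enough that the comparison with
# the treated group can flip on an assumption the data cannot check.
```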

I really don't see how having subjects score a study of patients with free will, who may avoid providing their data, can be related to subjects scoring a study of cities which always provide their data.

September 10, 2013 | Unregistered CommenterBob Koss

I think that mammography would make an interesting study because it would be a way of analyzing the influence of external environments on science communication, and how the evolving state of science knowledge affects scientists, practitioners, medical and pharmaceutical corporations, and the general public.

Underlying factors:

For years, some malpractice lawyers have made careers out of suing medical doctors for failure to recognize breast cancers which then rapidly metastasized, as if it were all the doctor's fault that the cancer spread. The best defense for doctors was demonstration that their procedures held to some strict, professionally recognized protocol. Which meant, necessarily, that these doctors needed to hold all of their patients to these protocols, regardless of what sense that protocol may have made to or for any individual patient.

Corporations did not seem to be very motivated to try to isolate and restrict usage of any products that might be implicated in cancer causation. In particular, pharmaceutical companies seemed to have seen (in the past) hormone replacement therapies for older women as a grand marketing opportunity.

Non-profit groups, which have done much of the public educational outreach, were probably quite motivated by statistics showing that early diagnosis and treatment of cancer seemed to reduce the future number of serious cases. They also got funding that likely increased their propensity to focus on mammograms, medical treatments, and research for cures. Komen "For the Cure", for example, seemed to do quite well, funding-wise. And, I believe, they were quite influential.

Thinking of breast cancer as a disease needing a Big Med fight may help patients rationalize away personal responsibilities towards health, such as diet, exercise and a willingness to forgo the hormones and age normally.

Over time, of course, more has become known about genetic propensities, such as BRCA1 and BRCA2 gene mutations, and also the idea that some teeny tiny cancers may not be on the verge of metastasizing at all and could be handled by watchful waiting similar to what is now advocated for testicular cancer. It became recognized that these not-doing-anything-anyway "cancers" distorted the surgical cure statistics. Thus, treatments seem to be trending more and more towards decisions made on a case by case basis. Mammography, of course, has the added twist that the diagnostic method itself may be a cause of future cancers.

I believe that the underlying issue here is one of how science presents itself. Too authoritarian an approach would, I believe, likely lead to rejection, or partial rejection, of such authorities on the part of a public that can see counterexamples. On the other hand, the best available medical science is a powerful tool that does provide us with much better outcomes than were available in the past. A certain amount of humility in the presentation by scientists and medical professionals would seem best to me for keeping communication lines open.

Thus, I think that there is some sense of reason in Brad Keyes' rejection of advocating for anthropogenic climate change based on scientific "consensus". Consensus is a politicized term. I believe that a more powerful argument can be made based on the very large body of evidence pointing towards anthropogenic climate change, collected from scientists of differing backgrounds and research focuses working independently, and on how that evidence points towards the importance of taking action sooner rather than later.

I don't believe that we can provide the public with a "deliberative environment protected from the entanglement of cultural meanings and policy-relevant facts". I do believe that we can work towards one in which battles are not fought regarding policy related scientific knowledge based on sharp lines between science and the forces of anti-science. But rather that environments can be created in which current evidence is carefully weighed and considered and decisions can be made at the present time with full recognition that more knowledge will be available in the future. And part of that evidence is an open acknowledgement that different forces within society have differing motivations. What a democratic society selects as its direction isn't going to be done in an environment free of such biases, but does need to happen in a society that openly recognizes such biases, and is given the necessary background data to evaluate them.

September 10, 2013 | Unregistered CommenterGaythia Weis

@Gaythia,

As usual, you have presented a battery of novel, interesting, probably-insoluble riddles!

Your refreshing, witty juxtaposition of the two "sciences", medical and climate, leads me to pose a koan of my own.

If climate science is a science, then it must do what every other field of science does: it must add to human knowledge about nature over time. This is the operative definition of science, after all.

Now, as you know, there is a lot of climate science being "done", "carried out," by a lot of climate scientists all around the world, and they're generally called the world's "top" scientists. In recent years billions of research dollars have been allocated to this previously-unknown field, which we might otherwise have spent on, say, cancer research—but I digress.

And you're clearly an educated and high-information spectator of all this.

So you should have no difficulty at all answering the following.

Could you please name one (1) thing we know about the climate that we didn't know 5 years ago?

(If that's a bit too specific, let's say 10 years.)

I'm sure hundreds of discoveries spring to mind, but I'm only asking for 1 example.

September 10, 2013 | Unregistered CommenterBrad Keyes

Dan, I have changed my opinion about leadership and the roles that it played in the poisonous atmosphere of the climate change debate. Although I think that the UN's mandating by definition that climate change was anthropogenic after accounting for natural variation, further compounded by the IPCC and the particular inability of the IPCC's head, would help cause the poisonous atmosphere, I don't think it explains it.

Rather than consider it as one leadership failure, I am leaning towards it is more the fact of opposing successful leadership. The IPCC has been successful with politics. The sceptics have as well. The majority of people are in the "so what as long as it doesn't cost too much" as far as I can tell.

If you accept that people do know who to listen to, this would apply to leadership and the fact that the IPCC's work actually supports positions ranging from encouraging CO2 production to severely limiting it.

What is your opinion of this success by both leadership functions?

September 13, 2013 | Unregistered CommenterJohn F. Pittman

John,

I humbly suggest that you might be overestimating the existence and influence of "leaders" on the non-believing side of the debate. In my case and according to most of the other non-believers I speak to, our position is a spontaneous personal response to the abysmal arguments put forward by the affirmative. It's now common knowledge that no shortage of scientists and other scholars agree with us, but that's beside the point—most of us (in my observation) rejected climate alarmism without any assistance from those or any other "leaders"—indeed, long before we knew that said "leaders" existed. To put it another way, I've read several believalist climate-science papers, but I don't think I've read any of the skeptical scientists' work (with the exception of the M&M refutations of Mann and his henchmenn, which I like to check just to make sure they're as devastating as they sound). So the various believalist counterattacks on the credibility of skeptical scientists elicit nothing but bemusement and pity from me. They spectacularly miss the point, and betray ignorance of how the burden of evidence works in science. Even if every single skeptical scientist and mathematician and economist and Monckton on Earth were incompetent and/or corrupt it wouldn't change a thing.

September 14, 2013 | Unregistered CommenterBrad Keyes

Incidentally, Bob Koss' critique seems spot-on to me from a medical point of view. Indeed, the numbers in the skin-cream example look unrealistic. In real life, an intervention trial of that size would be rendered uninterpretable, non-credible and effectively unpublishable by the dropout, without explanation, of so many patients ("lost to followup").

September 14, 2013 | Unregistered CommenterBrad Keyes

Even if every single skeptical scientist and mathematician and economist and Monckton on Earth were incompetent and/or corrupt it wouldn't change a thing.

Sorry, this demands some qualification.

If every single skeptical scientist and mathematician and economist and Monckton on Earth really were incompetent and/or corrupt, which is not the case as far as I can tell, then at least it would mean that Oreskes and all the other character assassins weren't lying to you about that. But since it's not the case, as far as I can tell, one has to say that Oreskes and all such character assassins are just engaged in despicable, defamatory demagoguery.

September 14, 2013 | Unregistered CommenterBrad Keyes

Brad, I understand your point. However, I think that the leadership of both groups reflects what works best for the culture that falls within these camps. For individuals who look at and read the science, a consensus of scepticism would not mean much. Those who appear most influential reflect an individualist outlook in their approach and presentation. The ones you call believalist are inclined to agree with the presentation and methodology of the IPCC.

Dan, I have concluded that the science of science communication belongs under a category after a risk perception determination has occurred. This risk determination would need to occur along the lines that we know split camps into their respective positions. Whether gun control or climate change, correctly identifying the risk aversions and risk acceptance with respect to an issue, IMO, should be the first step in the science of science communication, and must pay particular attention to the kinds of inferences the public will tend to make wrt individualist, egalitarian, etc. The other part is determination of how uncertainty affects these inferences. Examples are personal freedom versus top-down regs, concretely measured values versus counterfactuals, "Ask" versus "Tell", etc.

Dan, your comment?

September 16, 2013 | Unregistered CommenterJohn F. Pittman

From your study:
The study presents both correlational and experimental evidence confirming that cultural cognition shapes individuals’ beliefs about the existence of scientific consensus, and the process by which they form such beliefs, relating to climate change, the disposal of nuclear wastes, and the effect of permitting concealed possession of handguns.

What I am thinking is that one's personal perception of risk in particular is part of the process which leads to the formation of such beliefs. I think about what these positions have in common. When I converse with someone about a range of problems where they are like the stick figure in my mind: the person who on one hand condemns government surveillance, yet expects government to mandate health care at (their) affordable cost, I get a sense of the personal risk that they are comfortable with.

September 17, 2013 | Unregistered CommenterJohn F. Pittman
