Friday, January 23, 2015

Should we care about the public's *climate science* literacy? What is "ordinary climate science intelligence" *for*?

A friend of mine posed this very appropriate, very basic,  very important question/challenge to me after reading the Measurement Problem:

One thing i was left wondering after reading the paper was "should we care about people's OCSI ["Ordinary Climate Science Intelligence"] scores"? does "climate science comprehension" matter for anything meaningful? should we invest (any) resources in trying to increase people's science comprehension in this arena? and if so, WHY? does it, for example, correlate with people's ability to meaningfully/productively engage in the sorts of collective decision-making and planning processes happening in southeast florida? or do you simply see it as a valid end, not a means to anything on the "confronting climate change" front? i'm not sure what your take is on this, exactly, as it seems at some points that you do advocate communicators learn how to do exactly this (improve comprehension), perhaps by looking to science teachers who have figured out effective strategies in other contexts (evolution); but at other points (i think) you say that climate sci comprehension has nothing/little to do with the cultural polarization that seems to inhibit any large-scale collective response to the issue (which, as you say, should come as little surprise). or perhaps both of these statements are true to your thinking, but then it's less clear why (for what purpose) you advocate the former (coming back to the question of, "what is climate science comprehension FOR in a culturally polarized world?"). 

My answer:

[Figure: OCSI items & response curves]

On Ordinary Climate Science Intelligence (OCSI) assessment scores: I think the scores are useful for exploring questions and testing hypotheses about why there is public conflict over climate change.

E.g., is the source of public conflict on this issue attributable to differences in comprehension of the rudimentary mechanisms of climate change?

Or perhaps to their “unfamiliarity” with the evidence that climate scientists have compiled on the causes and consequences of it?

OCSI can help to answer those questions: because it shows that members of groups polarized over the existence of climate change and the contribution humans are making to it have comparable understandings (and misunderstandings) about those matters, it gives us less reason to credit those explanations than we’d have otherwise.

Indeed, OCSI helps us to see, too, that survey items that assess public “acceptance” of human-caused climate change simply aren’t measuring anything having to do with knowledge of climate science at all.  They have no correlation whatsoever with scores on a rudimentary climate literacy test and instead cohere with—behave exactly like other observable indicators of—respondents’ cultural identities.
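
To make that point concrete, here is a minimal sketch (in Python, using simulated data I made up purely for illustration; nothing below comes from the actual study) of the kind of check being described: if a "belief in human-caused climate change" item is driven by a cultural-identity index rather than by climate-science knowledge, it will show essentially zero correlation with a literacy score and a substantial correlation with the identity measure.

```python
# Illustrative sketch only -- simulated data, not the study's dataset.
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical variables: a cultural-identity index, an OCSI-style
# literacy score, and a belief item driven by identity, not knowledge.
identity = rng.normal(size=n)
ocsi = rng.normal(size=n)
belief = (0.9 * identity + rng.normal(scale=0.5, size=n) > 0).astype(float)

def corr(x, y):
    return np.corrcoef(x, y)[0, 1]

print("belief vs. OCSI:    ", round(corr(belief, ocsi), 2))      # near zero
print("belief vs. identity:", round(corr(belief, identity), 2))  # substantial
```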

As I explain in The Measurement Problem, OCSI was meant essentially to be a model of an assessment test, one designed to examine whether it is possible, with appropriately worded items, to disentangle (unconfound) cultural identity & knowledge when measuring how much people know about climate change.

But if we want to measure people's understandings of climate science, then I am sure it would be possible to do better than OCSI!

Your question, I take it, is actually more basic: given what we can see from OCSI, why should we even want to measure how much people understand about climate science? Why should we care?

There's no answer, of course, that doesn’t presuppose some sort of normative goal.

The goal of most people who collect data on the issues we are interested in is simply to "move the needle" of public opinion on "belief in" human-caused climate change.  For them, I guess the results of the paper suggest that they should "not care" whether anyone comprehends anything meaningful about climate change. Because what they “believe” about human-caused climate change turns out not to have anything to do with what they know.

Indeed, the paper shows that, if the goal is simply the instrumental one of generating public engagement with the issue of climate change, advocates should stop obsessively measuring and minutely analyzing the percentage of the public who say they believe in (accept) human-caused climate change.

Again, what people say about that is a measure only of their cultural identities—who they are.  Their responses to “do you believe in human-caused climate change” questions not only don’t measure what they know.  They don’t even measure whether they are worried and concerned about climate change!

The consistently wrong answers most believers & skeptics give about the extent and nature of the dangers posed by climate change (e.g., that it will cause increases in skin cancer, or prevent plant photosynthesis) strongly suggest that believers and skeptics alike (in the general public at least) are very alarmed, as an emotional or affective matter, about the risks human-caused climate change poses.

What those who are trying to mobilize public opinion in this way should be trying to figure out is why their style of advocacy doesn’t tap into this reservoir of concern but instead reliably, predictably, inevitably triggers the identity-protective response that is reflected in the “No, I don’t, you asshole!” answer that 1/2 the US public gives when asked (over & over & over in polls that aren't advancing anyone's understanding of anything at this point) “do you believe in human-caused climate change?” 

But there are a bunch of other goals one could have—I’d say, should have—besides the navel-gazing one of “needle moving.” All of them support developing an even better instrument for assessing what ordinary people know about the science of climate change.

One is to help ordinary members of the public recognize information important to the decisions that they will make as citizens in self-governing communities the welfare of which will be affected by actions they take in relation to a changing climate.

Another would be to create communication materials that make it possible for the relatively small portion of the population who is genuinely curious about what we know about a changing climate to satisfy that interest.

Another would be to educate young people who might, if they are taught well and made excited by what they learn, become either climate scientists or adults who are genuinely curious about what we know and who, in any event, will become people who need to make decisions informed by the best evidence in their own private lives (as, say, property owners, business people, or farmers) or as citizens whose communities will be affected by climate change.

For all of those goals and related ones, there will be value in having not one "OCSI" but a variety of them, each suited to the goal at hand.

As I said in the paper, e.g., I think it is silly to measure whether citizens know that the North Pole ice cap melting won't cause flooding; it's enough for them to know that melting ice sheets are creating a risk for people as a result of climate change.

They know that!

But if one's goal is to educate young people, "North Pole" might be a pretty good item -- which is to say, it might actually contribute to measurement of the latent comprehension capacity that educators should be trying to instill.  So there is value in having OCSIs for these purposes that are actually tied to the sorts of knowledge and comprehension capacities that it makes sense for those transmitting scientific information to focus on in the context in which they are operating.
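
For readers who want a concrete picture of what "contributing to measurement of the latent comprehension capacity" can mean in practice, here is a minimal, hypothetical sketch with simulated item responses (none of these numbers come from OCSI itself): an item-rest correlation is one conventional way to gauge whether an item like "North Pole" hangs together with the rest of an assessment.

```python
# Illustrative sketch only -- simulated responses, not OCSI data.
import numpy as np

rng = np.random.default_rng(1)
n_people, n_items = 1000, 10

# Simulated latent comprehension and binary item responses; item 0
# stands in for a hypothetical "North Pole" item.
theta = rng.normal(size=n_people)
difficulty = np.linspace(-1.5, 1.5, n_items)
p_correct = 1 / (1 + np.exp(-(theta[:, None] - difficulty[None, :])))
responses = (rng.uniform(size=(n_people, n_items)) < p_correct).astype(int)

total = responses.sum(axis=1)
rest = total - responses[:, 0]          # score on all items except item 0
item_rest_corr = np.corrcoef(responses[:, 0], rest)[0, 1]
print("item-rest correlation for item 0:", round(item_rest_corr, 2))
```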

The reason it would be nice to have an OCSI for the "curious consumer of science," too, is so that those who are part of the (truly amazing!) profession that is committed to serving him or her can figure out whether their efforts are working as well as they want.

For all of these actors, there will be domain-specific “OCSIs” better than the experimental one featured in The Measurement Problem.  But I hope that the experimental OCSI can help those developing these practical, real-world OCSIs to see that they can and must “disentangle” identity and knowledge in constructing them. 

Well, those are my thoughts.  What do you think?

Reader Comments (16)

Dan -

==> "What those who are trying to mobilize public opinion in this way should be trying to figure out is why their style of advocacy doesn’t tap into this reservoir of concern but instead reliably, predictably, inevitably triggers the identity-protective response that is reflected in the “I don’t” response that 1/2 the US public gives to the “do you believe in human-caused climate change” item that those advocates obsesses over."

There are folks on both sides of the climate wars who are trying to mobilize public opinion. There are also folks on both sides who are trying to mobilize public opinion by providing a particular subset of the evidence, or a particular conclusion w/r/t the balance of the evidence, or a particular view w/r/t the validity of different pieces of evidence. Both sides, of course, claim that all they're trying to do is present evidence in an unbiased fashion, to present "the truth," and to expose the "hoaxes" and "frauds" and "religious fanatics" and "vested interests" on the other side, respectively.

But it seems to me that in your description, you're describing the situation as if these phenomena only exist on one side of the climate wars.

==> "Well, those are my thoughts. What do you think?

I'm not sure that you really answered the question.

==> "One is to help ordinary members of the public recognize information important to the decisions that they will make as citizens in self-governing communities the welfare of which will be affected by actions they take in relation to a changing climate."

I don't understand this. Unless you address the mechanisms of polarization, then how will ordinary members of the public recognize important information in an unbiased fashion? IMO, the mechanism isn't that unpolarized people take biased information and then react in polarized ways. The context is already polarized - so what happens is polarized people take whatever information pretty much however it is communicated, and use it to reinforce their identity-related views of climate change.

For example, sure, the identity orientation of "skeptics" is stimulated when someone calls them a "denier." But their identity orientation is also stimulated when a climate scientist says that there is uncertainty in what kinds of trends to expect in the short term.

http://wattsupwiththat.com/2010/01/11/ipcc-scientist-global-cooling-headed-our-way-for-the-next-30-years/

I also don't really understand what you hope to achieve by improving the method of communicating the science, because that doesn't address the root of the problem.

==> "Another would be to create communication materials that make it possible for the relatively small portion of the population who is genuinely curious about what we know about a changing climate to satisfy that interest."

Well, sure. But I would say that people who are genuinely curious about what we know about a changing climate already have the information they need, because people who fit that description are necessarily able to apply strategies that will help them control for "motivations" in how they interpret evidence. They can look at evidence that might stimulate identity-related reactions in others and not get distracted. There is no information that could be free from a projection of identity-related reactions. Information in the climate wars, of any type, becomes an ink blot test. Richard Muller produces the BEST analysis, and folks on both sides reverse engineer it to come to opposite conclusions about whether he's a "skeptic." In other words, people take an ill-defined concept (what is a "skeptic"?) and overlay it on to an ambiguous context (Muller's views are somewhat ambiguous and not entirely consistent) in order to reinforce the "us" vs. "them" conceptualization that drives their engagement.

I think that you may be limiting the direction of causality here - it doesn't just flow in one direction.

January 23, 2015 | Unregistered CommenterJoshua

I think you shouldn't give up on processes of continuing education. Older people can benefit from education too.

January 23, 2015 | Unregistered CommenterGaythia Weis

@Joshua:

There are people deliberately trying to generate public conflict & confusion.

I wasn't addressing them; I think they are doing an excellent job, sadly.

I am talking to/about folks who I take at their word are trying to improve the quality of public engagement with climate science. They need to measure what counts.

January 23, 2015 | Registered CommenterDan Kahan

Dan -

I guess I don't see how to make your distinction.

Everyone is saying that their goal is to improve the quality of public engagement with climate science. I don't see any objective measures to assess the validity of their claims (although subjective measures abound).

Still... there's that issue of causality. IMO, the way to address the problem is to raise meta-cognition about the processes of motivated reasoning. Reactions to communication of science are the symptoms of the underlying disease.

January 23, 2015 | Unregistered CommenterJoshua

@Joshua:

I think some actors clearly intend to promote conflict. You suggested I should be criticizing them. Okay: "Dickheads: Please stop it." But why should they listen? They have ends different from mine. Those who want to dissipate conflict are being misled by people giving them very bad advice about the sources of such conflict.

I'm not sure what you mean when you say there aren't "any objective measures to assess ... validity." People collect evidence, the validity and weight of which is to be assessed in exactly the same way the validity & weight of anything else that is subject to empirical testing is to be evaluated. Evidence doesn't evaluate itself -- not in any area of inquiry or on any topic. It is evaluated by people & on the basis of theories & reasoning strictures that are themselves independent of that evidence.

And an integral part of that entire process is critical, candid appraisal of the evidence & theories & reasoning that people are using.

What more can I say? What more could you possibly be asking for?

January 23, 2015 | Registered CommenterDan Kahan

"Evidence doesn't evaluate itself -- not in any are of inquiry or on any topic. It is evaluated by people & on the basis of theories & reasoning strictures that are themselves independent of that evidence. And an integral part of that entire process is critical, candid appraisal of the evidence & theories & reasoning that people are using."

If there are such methods, then wouldn't they be applicable to the controversial science as well? Couldn't you apply them to, say, climate change evidence and get an independent evaluation of its quality?

The science of science communication applies just as much to communicating the science of science communication itself. Once a particular science communication result has implications for people's identities, all the same problems arise, and you get people dismissing or ignoring your evidence in just the same way. They know the methods for critically validating evidence, but choose not to apply them because the conclusions match their prior expectations or desires.

I agree that there's no point in talking to or about the people intent on causing confusion. I don't think there's much point in talking to those who want to "move the needle", except to say your evidence doesn't support their theories. Talking to and about those who want to better understand what's going on is a valid aim, although like Joshua I don't think you really answered the question as to what people need an understanding of climate science (or most other topics within science) for. Unless you were saying that, with regard to "the public" as a whole, they don't, and your interest is only in the smaller subset who want to know. In fact, I think there's rather more of an argument for them to know about the science of science communication than climate science. Knowing that you're biased is a basic requirement for being able to take measures to deal with it.

We don't care whether the general public is literate in climate science - it's just an example of a controversial topic for study. We do care that those few who happen to be interested know how to assess the evidence scientists present them with, and avoid the biases. Am I interpreting your intent correctly?

January 24, 2015 | Unregistered CommenterNiV

In an earlier thread I noted how, while watching The Science Network's Beyond Belief series, I became aware of just how remote many scientists are from the opinions, attitudes, beliefs and values of the general public. Even though they may be highly intelligent when it comes to their narrow specialties of study, when it comes to social-moral intelligence, on a scale of 0 to 10, I would give most of them a 0. Those like Einstein who score high in both scientific intelligence and moral-social intelligence are indeed a rare breed.

The veteran pollster Daniel Yankelovich speaks of the problem in Coming to Public Judgment: Making Democracy Work in a Complex World:

In recent years my work has made me conscious of the enormity of the gap that separates the public from the experts. As an interpreter of public opinion, I serve as a go-between for the two worlds of public opinion and expert policy making. Each year the distance between the two worlds grows greater....

Often without realizing it, [the experts] impose their personal values on the country because they fail to distinguish their own value judgments from their technical expertise....

Yankelovich goes on to explain that most experts subscribe to the information-driven model. As he observes:

The information-driven model leads to a concept of public education as a one-way process: the expert speaks; the citizen listens. Questions may arise about the best technique for grabbing the public's attention and conveying the relevant information. But conceptually, the model is simple and unidirectional: the expert's role is to impart information to the public skillfully and effectively; the citizen's role is to absorb the information and form an opinion based on it.

Layered on top of this is the arrogance, pompousness, self-righteous piety and overblown egos of entirely too many who inhabit the ivory tower. At The Science Network "Beyond Belief" conferences, and especially the first one, this attitude was so thick in the room one couldn't cut it with a knife. (For someone like myself who hails from the working class -- from the wrong side of the tracks one might put it -- and is very sensitive to these sorts of power plays and mind fucks since I have been subjected to them for much of my life, I can spot these self-absorbed, holier-than-thou, imperious dipshits like Richard Dawkins, Sam Harris, Daniel Dennett, Sir Harold Kroto and Lawrence Krauss from a mile away.) As Yankelovich continues:

Their knowledge and interests are specialized. Their day-to-day contact with the general public is meager. They belong to distinct subcultures, each with its own outlook. Often they are graduates of elite colleges and universities, which indoctrinates them with a noneradicable feeling of superiority to the general public. Without questioning the depth of their attachment to democracy, in their personal lives many have adopted the outlook of a ruling social class, and though their attitudes may be benign, their life-styles create a vast social distance between themselves and average Americans.

Many of them are aware of how remote their contact is with middle America and are eager to learn how they can better communicate their views to the larger public. They assume they have much of value to communicate to the public, without imagining that the public has much of value to impart to them. One of the most severe drawbacks of the conventional model of quality-as-information is that it always assumes a one-way flow of wisdom from those with more information to those with less.

And with a few notable exceptions, most attendees at the "Beyond Belief" conference allowed these domineering, high-handed social-moral ignoramuses to go unchallenged.

This is why when Yankelovich formulates his "ten rules for resolution," rule #2 is "DO NOT DEPEND ON EXPERTS TO PRESENT ISSUES." This comes right after rule #1: "On any given issue, it is usually safe to assume that the public and the experts will be out of phase. To bridge the gap leaders must learn what the public's starting point is and how to address it." Of course when society is burdened with an expert class which "assumes they have much of value to communicate to the public, without imagining that the public has much of value to impart to them," learning what the public's concerns are becomes an insurmountable obstacle.

January 24, 2015 | Unregistered CommenterGlenn Stehle

Dan -

==> "I think some actors clearly intend to promote conflict."

I would agree that some do - purely on a theoretical framework. In theory, among the entire group, it is likely that some intend to promote conflict.

But again theoretically (my own theory), I would guess that group to be small. I think that the "intention" of most folks is to be right, to prove themselves smart, to find truth, and to affirm their sense of self and the idealized notion of the group they associate with.

I don't know how to evaluate the evidence to ascertain the relative sizes of the first group and the second. Further complicating it, there's no reason to assume that the groups are mutually exclusive: likely some folks promote conflict as a tactic to affirm identity, find truth, etc.

==> "You suggested I should be criticizing them. Okay: "Dickheads: Please stop it." "

:-) Yeah, that will work!

==> "I'm not sure what you mean when you say there isn't "any objective meaures to assess ... validity." People collective evidence, the validity and weight of which is to be assessed in exactly the same way the validity & weight of anything else that is subject to empirical testing is to be evaluated."

How do we empirically test/evaluate who is trying to promote conflict and who is trying to dissipate it? Everyone, basically, says that dissipating conflict (by virtue of uncovering the "truth" about the science) is their goal. They all say that their goal is to mitigate the impact of the "disinformers" and those who, unlike themselves, want to starve children in Africa.

==> "Evidence doesn't evaluate itself -- not in any are of inquiry or on any topic. It is evaluated by people & on the basis of theories & reasoning strictures that are themselves independent of that evidence."

So what is the theory for how to identify those who are intending to create conflict?

==> "What more could you possibly be asking for?"

Stated above.

January 24, 2015 | Unregistered CommenterJoshua

Me:

==> " IMO, the way to address the problem is to raise meta-cognition about the processes of motivated reasoning."

NiV:

==> "In fact, I think there's rather more of an argument for them to know about the science of science communication, than climate science. Knowing that you're biased is a basic requirement for being able to take measures to deal with it."

Armageddon?

January 24, 2015 | Unregistered CommenterJoshua

Stopped clocks...? ;-)

I think I've always agreed with you on that. But that just means I have to find somebody else who doesn't agree, to test my arguments out on.

January 25, 2015 | Unregistered CommenterNiV

Odds of two clocks both stopping at the same precise moment?

Armageddon more likely, IMO.

January 25, 2015 | Unregistered CommenterJoshua

http://www.telegraph.co.uk/news/earth/environment/globalwarming/11395516/The-fiddling-with-temperature-data-is-the-biggest-science-scandal-ever.html

http://www.climatedepot.com/2013/01/10/meteorologist-anthony-watts-on-adjusted-us-temperature-data-in-the-business-and-trading-world-people-go-to-jail-for-such-manipulations-of-data/

Your questions about WHY people did not believe propaganda and lies do not make room for the "It was a hoax"-oh well.

Next subject vaccines!

February 9, 2015 | Unregistered Commenteritwasalie

@Itwasalie

If you can't see why, assuming you are right, you have just as much of a puzzle to explain about who did or does believe, you should think harder about what the data show on public opinion, climate change risk perceptions, & science comprehension.

February 9, 2015 | Registered CommenterDan Kahan

Hi Dan,

I have admired your work for some time now, and enjoy your blog.

I understand your fundamental argument about the 'knowledge deficit' problem, but there is one aspect of this issue that has been nagging at me -and it is definitely a matter of scientific 'meta-literacy', as John Nielsen-Gammon puts it. Please allow one paragraph to set a context:

About ten years ago, while working as a forex trader, I began to read up on climate change during the many dull hours watching charts wiggle about. I began reading as a sceptic, and read right across the spectrum, from WUWT to Real Climate, and of course IPCC reports themselves. The difference in tone and the quality of argument was apparent -although not hugely asymmetrical- but there was one really striking asymmetry:

The anti-AGW sites constantly tried to falsify climate science through misrepresentations of complexity, accumulation and statistics. There was a constant reference back to Popper and generally positivist models of science, and when statistics was used, it seemed to be used in ways that were not appropriate for atmospheric science.

This all got me thinking about the problem of using scientific knowledge (or at least seemingly authentic rhetoric) from one discipline to try and disprove another discipline. I am reminded of Myanna Lahsen's observations of the first generation of anti-AGW scientists -all guys from the Reagan era who were 'back of the envelope' physicists, and simply dismissed climate science due to its inherent uncertainties and use of global climate models.

No doubt we can all think of many ways in which people misunderstand the statistical nature of the world around us; it is even more common that people misunderstand the nature of accumulation in complex, open systems.

The studies you refer to show there is not much difference between the scientific literacy of those on either side of the AGW debate -but what of the meta-scientific issue of understanding probabilistic science -and indeed, the (also probabilistic) process by which consensus forms?

Perhaps the more important knowledge deficit is not about key points of knowledge, but about the underlying statistical ontology that orients us to what kinds of things climate science can reasonably know -and therefore what kinds of criticisms are not realistic. Maybe "public understanding of statistics" is a worthwhile knowledge deficit to tackle?

February 17, 2015 | Unregistered CommenterMark Ryan

Dan,

What is your take on the Fox News article about your work? I don't think it says what they think it says. But I haven't seen the latest.

http://www.foxnews.com/science/2015/02/12/study-global-warming-skeptics-know-more-about-climate-science/

Thanks.

February 18, 2015 | Unregistered CommenterMidwest student

@Midwest:

This.

&

This.

The study doesn't show that those who score the highest on the battery of items used in the study are more likely to be skeptics.

It shows that they are the ones who are most politically polarized on AGW.

(Other writeups on the study have emphasized that, e.g., here & here & here.)

February 18, 2015 | Registered CommenterDan Kahan
