Friday, January 30, 2015

Grading the 2015 version of Pew study of public attitudes toward science

So everybody knows that the Pew Research Center released a cool study yesterday on public attitudes toward science & on differences between public & scientists (or at least AAAS members; it’s worth noting that AAAS membership isn't limited to scientists per se).

It was a follow-up to Pew's classic 2009 study of the same -- & it makes just as huge and valuable a contribution to scholarly understanding as that one did, in my view.

Lots of people have said lots of things already & will say even more. But here are a few thoughts:

1. Pew does great work in measuring US public attitudes toward science & scientists.  They ask questions that it is sensible to believe measure general public regard for the enterprise of science, and keep track over time.

When one adds their findings to those collected by the National Science Foundation and the National Opinion Research Center at the University of Chicago, which conducts the General Social Survey (the source of most of the NSF's annual "Science Indicators" measures), one can really form a good view of how the US public feels about science.

People should ignore all the bogus studies that administer strange questions to Mechanical Turk workers -- there are tons of those & they always report really weird, sensational findings.

2. This report, like the 2009 one, shows that Americans basically love science. By overwhelming margins, they report admiration for scientists and positive appraisals of what scientists do. This is consistent with what the NSF Science Indicators, which are released every year, show too.

3.  Still, there is almost this weird reluctance in the Center's press release and commentary to accept or clearly articulate this conclusion!

It's common wisdom that public disputes over science stem from a “creeping anti-science” sensibility in American society.

Scholars who actually study public attitudes toward science, however, know that that view is unsupported by any convincing, valid data.  Indeed, the Pew and NSF Indicator reports show that there is overwhelming trust-- across all demographic, political, and other types of cultural groups (religious & nonreligious, e.g.).  

The 2009 Report helped to try to correct the “common wisdom” in this regard.

But the 2015 Report seems committed to avoiding any confrontation with this view. Instead, by employing a strategy of silence, inapt juxtaposition, and emphasis of irrelevant data, the Center commentary seems committed to consoling those who hold this fundamentally mistaken understanding of the sources of public conflict over science.

It's almost as if Pew feels disappointed to pop the balloon of self-reinforcing popular misunderstanding on this issue with the needle of its own data.

4. Consider the "gap" between scientists & public on evolution.  

Yes, it's there.

But it is well established that public opinion responses to the question “do you believe in human evolution” have zero connection to what people know about either evolutionary science or science in general. 

It's also perfectly clear that this "gap" in public and scientific understandings has nothing to do with public respect for scientists.  

The 2009 Pew Report made that clear, actually, reporting data showing that those who said they "disbelieved in" evolution as well as those who said they "did" both had highly positive views of science's contribution to society.

The Report and Alan Leshner’s commentary for the 2009 Report both emphasized that there were no meaningful differences in that regard between people who said science sometimes conflicts w/ their religious views & those who said it doesn't.

Nothing at all has changed--nothing. But is there anything comparable in this yr's report? Nope!

Leshner himself did write a very thoughtful commentary in Science. 

He's still championing respect for and respectful dialogue with diverse members of the public: good for him; he's a real science-of-science-communication honey badger!

But even he seemed to think that getting his message across required indulging the "creeping anti-science" meme, warning that "the public's perceptions of scientific expertise and trustworthiness" risk being "compromised whenever information confronts people's personal, political, or religious views" -- conclusions that actually seem completely contrary to the data presented in both 2009 and 2015.

5. Same w/ the “gap” on climate change.  

It's clear that climate change opinions don't measure either science comprehension, knowledge of climate science in particular, or respect for or attitudes toward science.  

In 2009, Pew wanted people to see, too, that public conflict over climate change did not originate in any disagreement about the value of science or trustworthiness of scientists. It emphasized that both climate-change "believing" & "disbelieving" members of the public had the same positive views in this regard.

Not so in the 2015 report, where the "gap" on climate change is repeatedly used to qualify the finding that the public has high regard for science. (Interesting that only 87% of AAAS members indicated they "believe in" AGW; I'm sure they understand the evidence for AGW and even use the evidence "at work.")

What makes this all the more strange is that the 2015 Report recognizes that the public's disagreement over AGW mirrors a public disagreement over what scientific consensus is on this issue (a phenomenon that can be attributed to ideologically biased assessments of evidence on both issues).  

In other words, accepters and nonaccepters alike believe "science is on their side" -- much the way that nations at war believe that God is....

For sure, the debate is alarming and contrary to enlightened democratic decisionmaking.  

But if anyone thinks the source of the debate is lack of science comprehension on the part of the public or lack of public confidence and trust in science, they are themselves ignoring all the best evidence we have on these issues!

Pew's job is to help remedy this widespread form of science-of-science-communication illiteracy.

6. The data reported on public attitudes toward GM foods are super disappointing.

Social scientists know that surveying the public on issues that it has never heard of generates absolutely meaningless results.

GM food risks are in that category. 

Consider:

American consumers’ knowledge and awareness of GM foods are low. More than half (54%) say they know very little or nothing at all about genetically modified foods, and one in four (25%) say they have never heard of them.

Before introducing the idea of GM foods, the survey participants were asked simply ”What information would you like to see on food labels that is not already on there?” In response, most said that no additional information was needed on food labels. Only 7% of respondents raised GM food labeling on their own. . . .

Only about a quarter (26%) of Americans realize that current regulations do not require GM products to be labeled.

Hallman, W., Cuite, C. & Morin, X. Public Perceptions of Labeling Genetically Modified Foods. Rutgers School of Environ. Sci. Working Paper 2013-2001. 

Americans don't fear GM foods; they eat them.

No meaningful inferences whatsoever can be drawn from the "gap" in attitudes between members of public & scientists on this issue.  

Very un-Pew-like to play to common misunderstandings about this by treating the “gap” between public and scientists on GM as supporting any meaningful inferences about anything.

7. Also very out of character is Pew's calling attention to minute changes in the overwhelming levels of support for science reflected in particular items.

I'm sure they were just trying to throw a bone to all those who "just know" -- b/c it's so obvious-- that we are living in the "age of denial." But if the latter seize on these changes as meaningful, they'll only be making fools of themselves. 

For perspective, here are comparable data, collected over time, from the NSF Indicators.


That anyone can see in these sort of data evidence for a "creeping anti-science" sensibility in the general US population or any segment of it is astonishing -- something that itself merits investigation by public opinion researchers, like the excellent ones who work for Pew!

But the bottom line is, the job of those researchers isn't to feed these sorts of persistent public misimpressions; it's to correct them!

* * *

How would I grade the Pew Study, then?

“A” for scholarly content.

“C-” for contribution to informed public understanding.

Overall: “B.”


Reader Comments (7)

As I said on Twitter, your points are well-taken but I am left slightly confused about A) Whether you think there is a problem, in general, with public acceptance of scientific evidence on various fronts and B) If there is a problem on some fronts, and the solution is NOT education, what is it? Integration of social groups? Educating key figures in a group ?

January 30, 2015 | Unregistered CommenterAmy Harmon

@Amy:

thanks for comment! Actually, thanks for the one on Twitter that started things too, b/c it made me reflect more on what exactly it is that disappoints me about the Pew Report (especially since I do mean it sincerely that it deserves an "A" for useful collection of data). I'll try to get to that, though, in answering your specific questions (or taking a first pass at doing so).

A. I think there is a big problem, not necessarily with "acceptance/nonacceptance" but with a state of public interaction & deliberation that predictably interferes w/ exercise of the reasoning capacities that people use to figure out what is known to science.

People, in order to live well, need to recognize and give effect to much more scientific knowledge than they could possibly be expected to comprehend and verify; so they become experts at figuring out what is known by science. That is a genuinely awe-inspiring capability & for sure it involves & evinces reasoning.

My view, based mainly on assessments of evidence that those studying the science of science communication, myself included, have assembled, is that this form of reasoning is disrupted when risks and other facts that admit of scientific investigation become entangled in antagonistic cultural meanings: those who know the most science and are best able to make sense of it end up the most polarized.

That is obviously a problem. Or in any case, that *is* the problem as I see it.

B. For this reason, likely you can see why it strikes me as obviously the wrong conclusion to say that the problem is to be solved with "more science education."

That gets things exactly upside down. The problem, which is a form of cultural conflict that *disables* the reasoning essential to recognizing & using scientific knowledge, has to be solved in order for individuals & society as a whole to get the benefit of their science literacy, which in this country is actually pretty high.

The "solution," at a high level of generality, is a science communication environment protection program. We need to build the institutions & adopt the procedures & practices that would enable us to use our best scientific understanding of science communication to preempt formation of the influences that generate the sort of conflict subversive of our reason; we need to perfect the techniques, too, that would allow us to decontaminate the environment when preventive measures fail.

Probably you will say that's too general. It is. But I & many others have said more specific things; we have studied things that are more specific-- in the lab & in the field.

When people say to me, "well, you don't offer any solution, so what choice do we have but to keep working on education," I understand that to be a reflection of whatever the cultural influences are that make our society resist understanding that the problems we have in this regard are *not* a consequence of a lack of science education. Not knowing that there are other things, not knowing that people are studying & trying them and even getting somewhere with them -- that is part & parcel of the persistence of the sort of fog that clouds our collective thinking about our science communication problem.

Which brings me to Pew. I expect it not to add to the fog; I expect it to use its data as a beacon, to illuminate the mistakes and also the path forward.

It fills me with disappointment when, as I feel is happening here, they characterize their findings in a way that they know will fit rather than challenge preconceptions, particularly of those who display a mysterious willful resistance to learning that their assumptions about science communication & their related "social marketing" campaign strategies are completely contrary to the evidence on what the problem is & what will fix it.

The first duty of science communication is to tell people what they must know, not what they want to hear.

Pew is flinching.

January 30, 2015 | Registered CommenterDan Kahan

I'm curious what your take is on Q.37: "When you are food shopping, how often, if ever, do you LOOK TO SEE if the products are genetically modified?" Fifty percent of respondents answered either "Always" or "Sometimes." (48% said "Not too often" or "Never.")

The question immediately preceded the "safe or unsafe" question (Q.38), but followed the prompt: "Scientists can change the genes in some food crops and farm animals to make them grow faster or bigger and be more resistant to bugs, weeds, and disease."

Do you have a sense of why they answered this way? I don't believe them, but I'm trying to figure out what exactly went wrong, since Pew asks about past behavior, not general opinion. Are these people thinking, "well, that sounds kind of bad, so any time I saw a 'Here Be GMOs' label, I'd surely not buy that food"? Or, "Hey, I can think of a time I looked for GMOs"? Or what?

January 30, 2015 | Unregistered CommenterMW

MW -

FWIW, i am dubious as well...i think the chronology/proximity of the questions could well be a factor.

Don't think that speculating is very useful, however... except as a factor for future research.

January 30, 2015 | Unregistered CommenterJoshua

@MW

Wow-- nice catch!

You shouldn't believe the rspts, that's for sure, b/c in fact there isn't any information on food labels relating to GMO use.

The answers are consistent with the report that I quoted that shows that people assume GMO labeling already exists.

The very fact that 50% of the rspts would make a patently absurd claim to be inspecting labels for information that doesn't exist helps to show that the risk item wasn't measuring any view that actually exists in respondents' minds -- they were being asked a question they didn't understand.

It's really sad that Pew not only doesn't use these data to help people understand this point but actually itself exploits public misunderstanding about the invalidity of public opinion data on matters that people in the real world don't have a clue about.

January 30, 2015 | Registered CommenterDan Kahan

DK --

To be fair to the people I still don't believe, more and more companies are voluntarily putting "No GMOs" labels on their products, per this article in today's NYT. (I noticed one on my Amy's mac & cheese the other day. I'm pretty sure Amy adds a mg of sodium for each natural base pair.) So people could, theoretically, be looking for that, although I really, really doubt half of them are, and the report you cited strongly suggests that's not what's going on. I can't wait for the "No GMOs" label to appear on Diet Coke.

January 30, 2015 | Unregistered CommenterMW

@MW:

The people who are being treated unfairly are not the survey rspts; it's the people who are reading the survey & not being told by Pew, which really does understand the difference between a valid & an invalid survey item, that survey questions about GM food risks don't genuinely measure what people in the real world are thinking. Pew's job is to correct misimpressions, not feed them.

January 30, 2015 | Registered CommenterDan Kahan
