Monday, January 2, 2017

Cultural cognition? Oy! What's a science journalist supposed to do?...

I received this piece of correspondence from a science journalist, who puts the eminently reasonable question: what do I, as a science journalist, do to combat or avoid the forms of toxic polarization associated with cultural cognition? I offer a few leads in my response, but it occurred to me that the most likely way for Dieter to get a fully satisfying answer would be to invite the 14 billion (with Dieter, make that 14 billion & one) readers of this blog to weigh in.

So read this earnest science journalist's note & give him your 2 cents' worth (it's not much, but it can really add up if anything close to all 14 billion of you reply).

The question:

Dear Mr. Kahan,

I'm a Belgian science journalist working on a presentation about communicating scientific topics that tend to polarize society (nuclear power, GMOs, vaccines, ...). The audience will consist mainly of scientists and science communicators.

While looking for information about this I came across your name and some of your research on cultural cognition, and I must say it has been a real eye-opener. I'm one of those people who thought it is mainly about spreading the facts. And your research seems to imply this is all wrong. A question that has so far remained unanswered, however, is what this means for my work as a science journalist. What can I do to get it right? What should the scientists themselves pay attention to? Could you be so kind as to direct me to your papers that are most relevant for answering these questions?

Thanks in advance.

Kind regards, ...

 My response:

Oh sure, ask me an easy question, why don't you?!


Reader Comments (5)

Ain't that just the question?

I don't think we have any good answers about what kind of science journalism has the power to persuade people who believe things that aren't supported by the preponderance of the evidence. So I start with making sure *I* don't believe anything that isn't supported by the preponderance of the evidence.

Science journalists, and scientists, are just as subject to fact-resistance as the rest of humanity, and I think the first thing we can do is try our best to check our own biases. So, I talk to the smartest people who disagree with me. I find sources who freely acknowledge both the pluses and minuses when we're talking about complex topics that inevitably have at least some reasonable arguments on both sides. I try, over time, to give my readers a reason to trust my take on these issues.

It's not a silver bullet, but it's the best I can do.

January 2, 2017 | Unregistered CommenterTamar Haspel

"I'm one of those people who thought it is mainly about spreading the facts. And your research seems to imply this is all wrong."

First the general advice...

Scientific journalism should be an extension of the scientific debate into a wider circle of the public, and the scientific debate is about checking claims from many different perspectives to enable errors and misunderstandings to be eliminated, and to find deeper insights and better explanations. The role of a scientific expert is to provide a clear and concise summary of the evidence for a claim, and the role of a science journalist is to collate the information from all the experts - to do the legwork of finding out what the best competing positions and arguments are, and to gather in one place all the material readers need to understand the issues and come to a judgement for themselves.

It is NOT to make a judgement about which position is correct, and present the conclusion with no explanation of the evidence, issues, uncertainties, or opposing positions - which is the mistake a lot of science journalists make when they talk about "spreading the facts". Whose "facts" do you mean? Why should anyone believe them? Or you?

The aim in science is to construct a chain of argument with no holes or invalid steps in it, each simple enough for your audience to understand. When people raise objections, you should regard that as helpful, and ask their assistance in filling in the gaps. Either something wasn't clearly explained so they misunderstood it, in which case getting to the root of the misunderstanding to produce a better explanation will be helpful, or there is a logical gap in the explanation, which you need to fill in. Be open to the possibility - most arguments have holes, even in the best-developed scientific theories, let alone new developments hot off the press (which is what science journalists mostly deal with).

And if the science is simply too complicated to explain, then be open about that. The scientific position in the face of evidence you don't have access to or understand should always be "I don't know." And as one scientist put it: "No scientist who wishes to maintain respect in the community should ever endorse any statement unless they have examined the issue fully themselves." The same goes for laymen who want to be scientific about it. You can believe what you like, for any reason you like, but you should only regard such beliefs as "scientific" if you understand the best evidence for and against them yourself. It's acceptable to talk about the many people who have checked and challenged complicated scientific theories in lieu of doing it yourself, and to present the evidence that those checks would probably have found any flaws had there been any, but you should always be open not only about the assumptions, uncertainties, and holes in the science, but also the assumptions, uncertainties and holes in your presentation of it.

Now considering issues from cultural cognition specifically...

The most important thing you can do is to listen to your 'opponents' and understand what your scientific conclusions mean to them culturally. For example, environmental issues might trigger responses due to one side's philosophical belief in industrial man poisoning the natural world, and the other side's philosophical belief in environmentalists always opposing technological progress on scientifically invalid grounds. They're both worried about wrong decisions being made on the basis of invalid or insufficient evidence.

Once you understand the cultural implications, it's usually fairly easy to demonstrate your respect for their concerns by explicitly addressing both sides of the argument, and to head off any unnecessary cultural objections by pointing out why they don't have the adverse consequences they're worried about before they occur.

So if you're a chemical company talking to environmentalists, you would start off describing a list of processes and chemicals you decided not to use on environmental grounds, and the sort of tests and evidence you used to come to that decision. When you're mutually agreed on what sort of standards of evidence to use while discussing a culturally 'safe' example in which the environment was protected, you can then go on to present the evidence for the environmental safety of your new process.

Conversely, if you're an environmentalist talking to a chemical company, you would start off by talking about all the industrial processes that you have previously endorsed as sufficiently safe, and what sort of evidence you accepted for their safety, and then discuss whether such evidence can be provided for their proposals.

In both cases, the presenter demonstrates that they understand the cultural concerns of their audience, and gives evidence that the presenter will respect them. In both cases, the question is what sort of evidence is acceptable to both sides, it's not about "who is right". In both cases the presenter listens to their audience, and treats it as a cooperative exercise rather than a confrontation.

Before you make a large-scale presentation, try the idea out on a few people from the ideological position you're expecting to object. What do they say? What are their concerns? Ask for their suggestions on how to present it, and what issues you need to address. And be sure to find out not only what their concerns are about the scientific argument itself, but what their concerns are for how it will be used in the political and cultural debate.

Then for each concern, consider whether it's not a problem (it doesn't actually have that worrying implication), it's fixable (there's an easy alternative that avoids the problem), or it *is* an actual problem because it directly contradicts their beliefs, in which case you have to go deeper and find out why they hold those beliefs and on the basis of what evidence. The cultural effects mean that you will also have to address this other evidence/argument in your presentation as well as the thing you wanted to talk about. Make sure people have a way to accept your scientific results without having to abandon their cultural beliefs and values, if at all possible.

And when you're done, do remember that it's not your aim to persuade people to a particular belief about the science - it's to make sure they understand the evidence for the arguments for and against, and are able to use it in making their judgement. Too many scientists are advocates for the conclusion they themselves believe in (whether for scientific or cultural reasons - scientists are by no means immune) and get upset if they make their best presentation and their audience is not persuaded. But that's not the aim of science journalism. The aim is to provide people with the information to make their own judgement, in the context of their own beliefs and knowledge. If they judge differently, that's their right.

The scientific method needs a diversity of viewpoints, from people who disagree with the 'obvious' interpretation, to provide a proper sceptical test. It's therefore highly unscientific to be annoyed by the simple fact of people disagreeing. The only thing that should be annoying is when people believe in any position without understanding the evidence for it.

January 2, 2017 | Unregistered CommenterNiV

I think that our society sometimes gets too bogged down in the idea that there are "two sides". This is different from taking a broad-based approach in which there are a number of possible outcomes using the best available science. I think that it is important to recognize that the pursuit of scientific knowledge and the collection of socio-economic activities that any society engages in based on "Scientific Progress" are separate things. And that societies can make deliberated decisions as to what direction to proceed in. Or not to proceed at all.

This is linked to the fact that scientists usually can't devote resources to investigating just anything. Individual scientists have to be open to creating the sorts of research they are interested in by piecing together travel, employment opportunities, and funding sources such as grants. Scientists necessarily have to be able to point out the importance of planned work; that is how funding is obtained. Albert Einstein played a great role in advocating for what became the Manhattan Project. Monies for supercolliders or cancer research don't just fall out of the sky. There can also be differences between who does great science and who is good at, or has access to the means for, publicity. We remember Charles Darwin for the theory of evolution, not Alfred Wallace. Much current research comes out of industrial laboratories, and that work is specifically driven towards corporate goals.

Furthermore, there are powerful forces actively attempting to dominate conversations about science to serve their own goals. I think that this is why it is very important for science communicators to understand their audience, how to actually address the concerns of that audience, and to be very aware of how others might want to turn your communication venue or event into one that promotes their ideas instead. This is why it is very important to avoid jousting. That allows the conversation to be dominated by extremes, and can force the science communicator to frame their responses in a manner that serves those extremes. It helps to do a little background research to see what the audience is likely to have heard elsewhere and what techniques opponents of that science may use to derail a science-based conversation. Public audiences are usually very amenable to the idea that no one person should hog the questioning time. A few queries for local information from the audience will often prompt others in the audience to step forward.

I think that the public ought to be given a great deal more credit for attempting to act in a science-y manner even when doing so with tremendous ignorance.

For example, I think that the current "gluten free" craze is a good case study. A very few people actually have celiac disease and really do need to avoid gluten. These people and their families first used the Internet to exchange recipe ideas. Some realized that they were good enough at baking to make a cottage industry out of it. Usually, these people used less processed ingredients, whole grains, fruits, and vegetables. Other friends and family members, and soon a growing circle of customers, liked the products and felt that eating them made them feel better - which might actually have been true, if in so doing they were eating less processed, lower-sugar products. But then Big Food caught wind of a potential marketing concept and subverted it. Chemists were set to work in food labs taking things like tapioca starch and using them to fashion heavily processed foods that were exactly like the stuff people shouldn't have been eating in the first place. As a chemist, I shouldn't be against jobs programs for chemists, but really this is a bad idea. Now, labeling things "gluten free" is a ploy that allows people to think that something unhealthy is healthy. Sort of like putting filters on cigarettes. In my opinion, it is wrong (as well as ineffectual) to then castigate these people for being stupid. They are, in fact, being deliberately led astray.

Other arguments are substitutes for issues that are actually more complex. Matters of policy are frequently best addressed with "hooks" that link to existing legal frameworks. I think it is also important to recognize that when dealing with government policies and regulation, it is much easier to have an effect on things yet to be implemented than on those already in place. For example, regulation and food ingredient approval in the US started with new ingredients, with those already in general use recognized as "GRAS", generally recognized as safe. Governmental agencies are not funded at a level that would allow a massive ramp-up to test "everything". There has to be a starting point. This does actually mean that there may be new proposed food ingredients banned that are no more unsafe than older ones not yet recognized as such. The food industry often acts as if this means that regulation is wrong. In my opinion, it just means that good science would dictate a need to circle back and test long-used products for the sorts of harms more subtle than outright death by poisoning that might have made them seem "edible" in days of yore.

In my opinion, this sort of ramp-up is also perfectly reasonable for food origin labeling efforts, including GMO labeling. In this modern day of Big Data, Big Retailers, Big Food, and Big Ag all have substantial databases that track foodstuffs and commodity pricing worldwide. The potential of such information can be seen every time there is a food safety incident. Private control of this information gives these corporations a big advantage over consumers, laborers, and farmers. Information is power. The role of science ought to be in the direction of increasing transparency. Labels that simply say "No GMOs" are of limited value. But not wrong. Big Food also puts up quite a fight about food ingredients the disclosure of which they claim to be irrelevant. Thus, in my opinion, rather than opposing any labeling and appearing as if there is something to hide, scientists ought to be pushing for a sequence of staged disclosure requirements that lead to much more transparency regarding food origins overall.

Just recently, while at the supermarket, a woman was contemplating various lunchbox-sized fruit juice drinks (do I have "Ask a Chemist" tattooed on my forehead?). She wanted to know if she should get the ones labeled "No Sugar", "No Sugar Added", "Organic", "Non GMO", or "Gluten Free". I suggested that we look at the labels and first see which ones contained actual fruit juice. The issue here, as I see it, is not that the woman was stupid; she's just trying to feed her kids healthy stuff, and she ought to be recognized and encouraged for such attempts, with some educational push towards the idea that her kids would be better off with a real apple or orange and a glass of real water. The issue is the deliberate intention to deceive as a marketing ploy on the part of Big Food.

People in places that still have small-scale agriculture often oppose GMOs because they are trying to prevent the very real problems that current GMOs have caused in fostering a large-scale, corporate, commodity-based agricultural system. And I think that we need to recognize that blocking GMOs in general, in the face of the fact that the actual GMOs that might be introduced are a real threat to small-scale agriculture, is actually an effective political position. It is a tactical position that doesn't really say much about the science of genomics. If one were in, say, India, with a long-term view towards improving the lot of people there, and observed the way in which cotton growing in the Texas panhandle was depleting the Ogallala aquifer, keeping out any large-scale expansion of cotton growing might seem reasonable. In my opinion, the original introduction of Golden Rice by Syngenta came simultaneously with efforts towards large-scale corporate farming that displaces agricultural workers and smaller, more poorly performing farms in favor of modern equipment and scale. Adding vitamin A to rice is like enriching white flour: effective to an extent. But in my opinion, we should not be shocked if, in telling someone, effectively, "here, eat this, at least your kids won't go blind," the answer is !!!**##!! Vitamin A alone will not save the children for a life as anything other than serfs. This is not about the science of genomics; it is more about the socio-economic displacement of the worker with a hoe by the tractor, and perhaps sending families to go live in slums in the city. The strains of modernization.

Science journalists have to make a living too. So personal decisions have to be made as to how to proceed in the modern economy. Jousting does actually work, and can make one well known. Many media outlets specifically tailor their content to a specific political alignment of readers. Some science journalists may be able to serve as pure "explainers" and keep away from policy conflicts. Others may work in investigative journalism. Still others are in positions more like publicity agents. In the case of GMOs for example, I think that public acceptance might have been quite different if the new methods had been clearly explained as extensions of previous knowledge, if people had been more aware of other less specific methods, such as radiation induced mutation that were already in use, and if greater effort had been made by scientists to achieve public funding of research into new traits, and stringent as well as transparent regulation of new GMOs. Going forward, new technologies like CRISPR offer both great promise and serious perils. Science journalism has an essential role to play.


Good luck navigating your own way!

January 3, 2017 | Unregistered CommenterGaythia Weis

"and using them to fashion heavily processed foods that were exactly like the stuff people shouldn't have been eating in the first place"

As a chemist, can you explain to me what "heavily processed" actually means? What's the chemical definition?

"This does actually mean that there may be new proposed food ingredients banned that are no more unsafe than older ones not yet recognized as such. The food industry often acts as if this means that regulation is wrong. In my opinion, it just means that good science would dictate a need to circle back and test long used products for the sorts of more subtle than outright death by poisoning that might have made them seem "edible" in days of yore."

Indeed!

"Just recently, while at the supermarket, a woman contemplating various lunchbox sized fruit juice drinks (Do I have "Ask a Chemist" tattooed on my forehead?). She wanted to know if she should get the ones labeled "No Sugar", "No Sugar Added", "Organic", Non GMO, or "Gluten Free""

Tell her she wants the one labelled: "Contains the carcinogenic pesticide and paint-stripper 1-Methyl-4-(1-methylethenyl)-cyclohexene".

:-)

"Big Food also puts up quite a fight about food ingredients the disclosure of which they claim to be irrelevant. Thus, in my opinion, rather than opposing any labeling and appearing to look as if there is something to hide..."

I'd have said they need to start a conversation about *why* they think they're irrelevant. What's their argument/evidence? That serves the dual purpose of educating the public about science and helping people to make the right decision (for them) for the right reasons.

"People in places that still have small scale agriculture often oppose GMOs because they are trying to prevent the very real problems that current GMOs have caused in fostering a large scale corporate commodity based agricultural system."

Yes, it's ironic really, since it's the heavy regulation that drives out the competition and keeps GMOs the preserve of big business. Nobody else has deep enough pockets to get any product to market past the testers and protestors. If you want GMOs to help the small-scale traders and the poor, you have to make them cheaper to develop and sell. But that requires an understanding of economics, which is as badly taught as science is. Understanding that regulation makes things more expensive, and therefore only accessible to the rich, is as basic to economics as knowing the Earth goes around the sun is to science. But you can't blame people for not knowing if they've never been told why.

"In the case of GMOs for example, I think that public acceptance might have been quite different if the new methods had been clearly explained as extensions of previous knowledge, if people had been more aware of other less specific methods, such as radiation induced mutation that were already in use"

Quite so. The scientists took the attitude of saying "Take our word for it, it's perfectly safe!" without explaining how they knew, or what "safe" meant, compared to what else. *No* foods are perfectly 100% "safe", as the annual toll from food poisoning demonstrates. They used a fudged and over-simplified definition, provided no evidence, and acted as if the public were idiots, incapable of understanding for themselves, who had to be told what to do. It's patronising. People not already disposed to "trust the experts" rebelled, and asked "why should we believe you?"

The questions should be: is it safer than the alternatives? What tests do we need to do? What threshold do we set? And then, if our standard is truly justified scientifically, do we not have to apply it consistently? What would happen if we applied those same standards to all-natural foods like orange juice, peanuts, and cabbage? Is that really what we meant?

Agree a standard of evidence first. Then whether things pass or fail is a purely factual, scientific matter; one we can all agree on.

January 3, 2017 | Unregistered CommenterNiV

Dan, I think this article that you posted recently about the geneticist making his communication "expensive" is one of the more enlightening pieces on practical communications I've ever read.

It builds on a scientist's natural "just present the facts" impulse with a successful example of how one can effectively present facts inside an escalated environment.

I offer my own journalistic reaction to that article, which is that in writing any piece, it is always tempting to advocate for or against your subject matter; but what matters most is to be the advocate for your reader, or for your audience. It is, after all, for their benefit that you write.

January 8, 2017 | Unregistered Commenterdypoon
