Thursday, January 29, 2015

Science of Science Communication 2.0, Session 3.1: Science comprehension: who knows what about what—and how?

Okay, this is the "session 3" post for the virtual "Science of Science Communication 2.0." The real-space version got snowed out!

But the discussion I was hoping for (based on these readings) was mainly one about how group affinities contribute to the transmission of scientific knowledge.

"Cultural cognition" is normally associated with idea that such affinities distort such transmission.  

But my hunch is that cultural cognition is in fact integral to citizens' reliable apprehension of what is known to science; that the sorts of pathologies we see, in which people with different cultural identities use their reason to form and persist in opposing views on risks and related facts, are a consequence of a polluted science communication environment, one that disables the normally reliable reasoning strategies people use (including observing what others who know what's what about what are doing and saying) to figure out what is known...

But I admit to being uncertain about this! Indeed, I readily admit to being uncertain about everything, including the things I am most confident I understand; certainly I am committed (and, I hope, able) to revising anything I believe on the basis of any valid evidence I encounter.

But here I am not even as confident as I'd like to be about the state of the evidence on my conjecture -- that cultural cognition is not a bias, but is integral to the normal process by which diverse people usually converge on the best evidence.  And so I was & remain eager for reflections by others!

Below are the questions I posed to motivate student reading & orient discussion for this session. Next session, in which we'll be doing "double time" to make up for the lost class, will feature trust in/of science...

  1. What is the relationship between the sort of critical reasoning proficiency featured by Baron’s “actively open-minded thinking” and Dewey’s understanding of “scientific thinking”?

  2. Is critical reasoning proficiency essential for science comprehension on the part of a non-scientist, whether in her capacity as personal decisionmaker, member of civil society, or democratic citizen?

  3. Are conflicts over policy-relevant science plausibly attributable to deficits in critical reasoning proficiency?

  4. Does the effective use of scientific knowledge by non-scientists—in the various capacities in which their decisions should (by their own lights) be informed by it—depend on their being able to comprehend what it is that science knows?

  5. Is it possible for citizens to reliably recognize who knows what is known by science without being able to comprehend what it is those individuals know? If so, how? Does their ability to do that depend on their possessing the sort of reasoning proficiency emphasized by either Baron or Dewey? If not, what does it depend on?

  6. How does Popper’s understanding of the transmission of scientific knowledge relate to Miller’s, Dewey’s, and Baron’s?

  7. Do group affinities—ones founded on common outlooks and values—promote transmission of scientific knowledge or inhibit it? In either case, how?


Reader Comments (5)

Multi-level selection evolutionary theory has quite a bit of explanatory power when it comes to answering your questions.

As the evolutionary biologist David Sloan Wilson explains, there are two types of human "truths" or "realities":

1) Factual realities: These are the truths or realities which are factual and objective in nature, and which scientists should seek to discover.

2) Practical realities: These are the realities or truths which constitute the glue which holds social groups together, and are spun by political, economic and social entrepreneurs.

Practical realities do not have to be factually true to function. As Wilson explains in Darwin's Cathedral: "Even massively fictitious beliefs can be adaptive, as long as they motivate behaviors that are adaptive in the real world."

Practical realities or truths can include belief systems such as 1) traditional religions, 2) "secular stealth religions" like the religion of progress and its various sub-denominations -- its socialist, Marxist, fascist, American progressive, capitalist and other branches, and 3) national mythologies, such as the "political theology" of Thomas Jefferson, as Thomas E. Buckley called it:

Far from being a twentieth-century-style secularist or advocating a national polity indifferent to religion, Jefferson publicly expressed what became the American faith -- a complex of ideas, values, and symbols related to and dependent on a transcendent reality we call God. This civil religion interpreted the historical experience of the American people, validated their republican arrangements, and shaped the political culture that united the citizens of the new republic.

Nietzsche was perhaps the first to explore this massively fictitious world which humans inhabit. Here's how George A. Morgan put it in What Nietzsche Means:

Man, indeed, has been an incorrigible pragmatist, like other animals, ever maintaining the truth of those beliefs which seemed to help him live. Because of the age-long selective process, surviving modes of interpretation probably do stand in some favorable relation to real conditions -- just favorable enough for survival. "We are 'knowing' [erkennend] to the extent that we can justify our needs." That truth is always best for life, however, is a moral prejudice. Falsification has been shown to be essential; truth is often ruinous, and sheer illusion helpful, as experience testifies. And of course there is no certainty about even the pragmatic value of our beliefs; there is merely the fact that we have survived so far. Beliefs not immediately harmful may yet be fatal in the long run.

This last point is the cause of the existential crisis which we face now. As John Gray observes of the popular myth-maker Naomi Klein:

Throughout This Changes Everything, Klein describes the climate crisis as a confrontation between capitalism and the planet. It would be more accurate to describe the crisis as a clash between the expanding demands of humankind and a finite world, but however the conflict is framed there can be no doubt who the winner will be. The Earth is vastly older and stronger than the human animal.... Rather than denying this irreversible shift, we’d be better off trying to find ways of living with it.

January 30, 2015 | Unregistered Commenter Glenn Stehle

"Throughout This Changes Everything, Klein describes the climate crisis as a confrontation between capitalism and the planet. It would be more accurate to describe the crisis as a clash between the expanding demands of humankind and a finite world, but however the conflict is framed there can be no doubt who the winner will be."

It would be more accurate to describe it as the triumph of the expanding demands of humankind being met by humankind's expanding ability to create the new resources to supply it. Men are not jayhawks, and the supply of chickens is not finite.

The clash, if you want to call it that, has mainly been between the predictions made in the litany of doom, and the eventual outcome. Between the neo-Malthusians, who constantly declare that 'The end of the world is nigh' with tedious regularity, and the subsequent reality in which we are all still here to listen to them saying it yet again. Reality always wins, but the believers never give up believing.

Klein, of course, is more concerned about capitalism than the planet. Her solutions wouldn't save the planet even on their own accounting, but they'd do much to damage capitalism and all the other means we use to meet our expanding demands. However, crisis is an essential element of her solution - people will let you get away with more in a crisis - so creating the resource crisis she predicted would be no bad thing, from her point of view. Continual crisis is continual justification for the ever harsher measures needed to deal with it. What's most odd about it is that it seems to work - people don't seem to learn from the past.

January 30, 2015 | Unregistered Commenter NiV

My initial reaction to the concept of a polluted science communication environment was to assume that the environment must always be polluted. If we know that people understand what is communicated through the filter of their own values, and those values are defined (often, and in practice) through their group membership and in opposition to other groups, I'm tempted to say that the pollution of the environment is virtually a self-fulfilling prophecy. In practical terms, once something is widely known to be true, there's an incentive for outliers to start defining themselves in opposition to that position. See, e.g., anti-fluoridation activists who are more concerned about being told what's true by men in lab coats than with the chemistry.

Upon reflection, though, that may be either false or only trivially true. The risk of cigarettes, for example, is widely accepted (I think) even though it seems the same people would have an equally strong incentive to reject the communication of that science.

The difference may be the individual risks and benefits. I'm thinking a lot about Caplan's Rational Ignorance/Rational Irrationality work, so my thoughts are filtered through that lens, but it would explain the difference between the pollution of the communication environment for risks like fluoridation and that of the communication of the risks of smoking: one is much more certain, immediate, and personal than the other. We'd expect the communication of the more personally-significant risks to be cleaner, and I think that's what we see in practice.

I think in terms of your questions, comprehension of what scientists know only becomes relevant to most people when the perceived cost of error is significant. If understanding the science doesn't matter--if there's no perceptible personal harm to unnecessarily removing fluoride from the water--then interested parties can gin up their own version of "what scientists know." We see this in just about every contentious field, as scientists (or credentialed individuals) branch out to support fringe beliefs. See Dr. Sears, for example.

I think this proliferation of convenient scientific knowledge is significantly enhanced by group identity, since the groups recruit their enabling scientists and disseminate their perspectives. And that's not necessarily a bad thing, since those minority perspectives might actually be right.

I suspect but can't prove that as the perceived personal cost of error (the harmful consequences for someone who's wrong about a scientific belief) goes up, the schisms between groups start to close and you wind up with fewer cultural identities. Lots of people think of themselves as "pro-vaccine" or "anti-vaccine," but very few think of themselves as "pro-cigarette." I also suspect that as a consequence, the average knowledge of the science on vaccines is greater than the average knowledge of the science on cigarettes at any level beyond "cigarettes cause cancer."

I may be succumbing to availability bias, though, as I know many people who are well-informed on vaccines due to being caught up in the vaccination debate. I doubt they'd be as well-informed without the debate, or that the debate would have reached them without "group affinities."

January 31, 2015 | Unregistered Commenter Colin

Another take on the same basic thought:

You've argued very persuasively that cultural cognition is not a bias. Would it be more accurate to say that cultural cognition is not necessarily a bias? It should only contribute to our ability to determine what is true if we can accurately determine who knows what is true.

Our ability to discriminate between people who have accurate knowledge and those who don't is probably only of secondary importance to the vast majority of people. We care more about whether the proposition on the table fulfills our interests. If I desperately want to feel in control of my life and health, I'll overvalue the knowledge of Dr. Oz's latest guest and thus the efficacy of his essential oil nostrums. The benefit to me is immediate and significant, and the cost of being wrong is relatively small as long as I'm not very sick. On the other hand, if I feel a lump I'll have to reassess my evaluation of that same guest's knowledge, because the cost of error has increased.

I'd argue that cultural cognition is a reliable way of achieving our interests. When those interests include accurate scientific knowledge, which is typically only the case when the cost of error is significant, it's not a bias. In all other circumstances, it can enable irrational thinking.

January 31, 2015 | Unregistered Commenter Colin

@ Colin

I don't see it so much as "the perceived personal cost of error," but the perceived personal cost of being right.

Take the energy debate, for instance.

On the cultural "right" we hear a constant drumbeat. It seeks to minimize the future cost of global warming. It also seeks to minimize the future cost of extracting ever-depleting (and costly) carbon energy resources.

On the cultural "left" we hear a different drumbeat. It seeks to minimize the future cost of "renewable and sustainable energy" (Talk about an oxymoron!)

Both sides have their own version of Alice in Wonderland, playing down any realistic assessment of the future cost of their pet energy supply.

Denial, avoidance, procrastination, wishful thinking and mental laziness are the order of the day on both ends of the cultural divide when it comes to avoiding the realities of the energy issue.

Maybe this is because of a lack of cogency, concreteness and clarity of events, or of their perceived applicability to self. But the fact remains that both ends of the cultural divide are deep in denial, and a long, long way from working through the cognitive, emotional and moral obstacles which the energy conundrum presents them.

February 1, 2015 | Unregistered Commenter Glenn Stehle
