
Friday
Jun022017

It's out! Motivated Numeracy & Enlightened Self-government hits newsstands

Get your personal copy today!

& while you're at it, check out the rest of the cool papers in this maiden issue of Behavioural Public Policy.

 

Thursday
Jun012017

Quick, place order for Oxford Handbook of Science of Science Communication now!

Available June 6. Pre-order yours to avoid the onslaught of eager purchasers.

Great stocking stuffer! 

Tuesday
May302017

Generalized trust in science: nukes vs. climate

Here’s another helping of “trust in science” data, this time compliments of Pew Research Center.

“Yesterday”™ I posted some data from the General Social Survey showing that liberals but not conservatives become more inclined to worry about climate change as their trust in science increases. That’s a pattern that goes against one popular narrative, which attributes climate skepticism to an anti-science disposition on the part of right-leaning individuals.

But at least one commenter was understandably dissatisfied with the GSS outcome variable, which solicits, on a five-point scale, respondents’ assessments of how dangerous they “think that a rise in the world's temperature caused by the ‘greenhouse effect’ is for the environment.” As the commenter pointed out, this item presupposes that climate change is occurring, a proposition rejected by around 25% of the U.S. population.

Well, the Pew data (which, like the GSS data, one can download for free) contain a more conventional “belief in climate change” measure, in which respondents can indicate whether they believe there is “solid evidence that the average temperature on earth has been getting warmer over the past few decades,” and if so, whether that change is “mostly because of human activity such as burning fossil fuels or mostly because of natural patterns in the earth’s environment.”

When one regresses belief in human-caused climate change on respondents’ political outlooks, their score on a trust-in-science scale, and the interaction of those variables, one sees pretty much the same pattern as in the GSS data. That is, it looks like more left-leaning respondents are influenced by their level of trust in science but more right-leaning ones aren’t. (In fact, the difference in how much the trust scale affected respondents conditional on their political outlooks was “significant” at only p = 0.10; but if we are interested in what the analysis adds to the weight of the evidence, we shouldn’t get too hung up on that. Maybe “tomorrow”™ I’ll elaborate.)

That we got the same answer from two different data sets, which used different measures, should make us more inclined to reject a generalized “science trust deficit” theory of why conservatives are more climate skeptical. (Conservatives might be more “distrustful” of climate scientists, but that response would best be viewed as just an indicator, not an explanation, of the latent disposition toward skepticism on climate change (Poortinga & Pidgeon 2005).)

Another commenter wondered what would happen if we substituted nuclear power for climate change in the GSS item, and in particular how liberals would respond.

Well, the GSS dataset contains such an item. It asks respondents to indicate, on the same 5-point scale, how dangerous they regard “nuclear power stations” to be for the environment.

Here at least, we see the “science trust deficit” doing what it is often advertised as doing—namely, predicting less concern among individuals as their trust level (measured as it was “yesterday”™) increases.  It does so, moreover, for both liberals and conservatives:

 

What to make of this?  Well, you can tell me.

But I would say that, for myself, I was a bit surprised. I was expecting to discover that high levels of trust in science in general have no impact on perceptions of disputed applications of particular forms of decision-relevant science. Because more trust was associated with less concern for both liberals and conservatives, I now have less confidence than I did that a generalized trust measure is of little use.

BTW, you can find the 7 items I used to form the Pew “trust in science” scale here.  The dataset had more candidate trust items in it, but this combination of items displayed the highest reliability score (α = 0.69).
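For anyone who wants to reproduce this sort of reliability check: Cronbach’s α can be computed directly from the item-response matrix. A minimal sketch with simulated data (the numbers below are illustrative, not the actual Pew responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# simulate a 7-item battery tapping a single latent "trust" disposition
rng = np.random.default_rng(0)
latent = rng.normal(size=500)
items = np.column_stack(
    [latent + rng.normal(scale=1.0, size=500) for _ in range(7)]
)
alpha = cronbach_alpha(items)  # lands well above 0.5 for data like these
```

The same function applied to any candidate subset of items is how one would hunt for the combination with the highest reliability.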

Reference

Poortinga, W. & Pidgeon, N.F. Trust in Risk Regulation: Cause or Consequence of the Acceptability of GM Food? Risk Analysis 25, 199-209 (2005).

 

 

 

Friday
May262017

Do conservatives become more concerned with climate risks as their trust in science increases?

It is almost universally assumed that political polarization over societal risks like climate change originates in different levels of trust in scientists: left-leaning people believe in human-caused climate change, it is said, because they have a greater degree of confidence in scientists; so-called “conservative Republicans,” in contrast, are said to distrust science and scientists and thus to be predisposed to climate skepticism.

But is this right? Or are we looking at another form of the dreaded WEKS disease?

Well, here’s a simple test based on GSS data.

Using the 2010 & 2016 datasets (the only years in which the survey included the climate-risk outcome variable), I cobbled together a decent “trust in science” scale:

scibnfts5: “People have frequently noted that scientific research has produced benefits and harmful results. Would you say that, on balance, the benefits of scientific research have outweighed the harmful results, or have the harmful results of scientific research been greater than its benefits?” [5 points: strongly in favor of beneficial results . . . strongly in favor of harmful results]

consci: “As far as the people running [the science community] are concerned, would you say you have a great deal of confidence, only some confidence, or hardly any confidence at all in them?”

scientgo: “Scientific researchers are dedicated people who work for the good of humanity.” [4 points: strongly agree . . . strongly disagree]

scienthe: “Scientists are helping to solve challenging problems.” [4 points: strongly agree . . . strongly disagree]

nextgen: “Because of science and technology, there will be more opportunities for the next generation.” [4 points: strongly agree . . . strongly disagree]

advfont: “Even if it brings no immediate benefits, scientific research that advances the frontiers of knowledge is necessary and should be supported by the federal government.” [4 points: strongly agree . . . strongly disagree]

scientbe: “Most scientists want to work on things that will make life better for the average person.” [4 points: strongly agree . . . strongly disagree]

These items formed a single factor and had a Cronbach’s α of 0.72. Not bad. I also reverse-coded items as necessary so that for every item a higher score would denote more rather than less trust in science.
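Reverse-coding and scale formation can be sketched in a few lines. The response values below are toy data, not actual GSS responses; the column names follow the items listed above:

```python
import pandas as pd

# toy 4-point responses; higher raw score = LESS trust on these items
df = pd.DataFrame({
    "scientgo": [1, 2, 4, 1, 3],   # 1 = strongly agree ... 4 = strongly disagree
    "scienthe": [2, 1, 4, 1, 3],
    "nextgen":  [1, 2, 3, 2, 4],
})

# reverse-code so a higher score always means more trust:
# (max + min) - x maps 1<->4 and 2<->3 on a 4-point item
reversed_df = (df.max() + df.min()) - df

# standardize each item, then average into a single trust-scale score
z = (reversed_df - reversed_df.mean()) / reversed_df.std(ddof=0)
trust_scale = z.mean(axis=1)
```

Averaging standardized items is one common convention; summing raw reverse-coded responses or using factor scores are alternatives that usually order respondents nearly identically.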

Surprisingly, the GSS has never had a particularly good set of climate-change “belief” and risk-perception items. Nevertheless, it has sometimes fielded this question:

TEMPGEN: “In general, do you think that a rise in the world's temperature caused by the ‘greenhouse effect’ is extremely dangerous for the environment . . . not dangerous at all for the environment?” [5 points: extremely dangerous for the environment . . . not dangerous at all for the environment]

I don’t love this item but it is a cousin of the revered Industrial Strength Risk Perception Measure, so I decided I’d give it a whirl. 

I then did some regressions (after of course, eyeballing the raw data).

In the first model, I regressed a reverse-coded TEMPGEN on the science-trust scale and “left_right,” a composite political outlook scale formed by aggregating the study participants’ self-reported political outlook measures (α = 0.66). As expected, higher scores on the science-trust scale predicted responses of “very dangerous” and “extremely dangerous,” while left_right predicted responses of “not very dangerous” and “not dangerous at all.”

If one stops there, the result is an affirmation of  the common wisdom.  Both political outlooks and trust in science have the signs one would expect, and if one were to add their coefficients, one could make claims about how much more likely relatively conservative respondents would be to see greater risk if only they could be made to trust science more.

But this form of analysis is incomplete. In particular, it assumes that the contributions trust in science and left_right make to perceptions of the danger of climate change are (once their covariance is partialed out) independent and linear, and hence additive.

But why assume that trust in science has the same effect regardless of respondents’ ideologies? After all, we know that science comprehension’s impact on perceived climate-change risks varies in relation to ideology, magnifying polarization.  Shouldn’t we at least check to see if there is a comparable  interaction between political outlooks and trust?

So I created a cross-product interaction term and added it to form another regression model. And sure enough, there was an interaction, one implying that we ought to expect even more partisan polarization as right- and left-leaning individuals' scores on the trust-in-science scale increase.
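The cross-product term is just the literal product of the two predictors, entered alongside the main effects. A minimal sketch with simulated data (the coefficients below are invented for illustration, chosen so the simulated pattern resembles the one described):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
left_right = rng.normal(size=n)   # higher = more conservative
trust = rng.normal(size=n)        # trust-in-science scale score

# simulate: trust raises risk concern, but less so as left_right increases
risk = (0.4 * trust - 0.5 * left_right
        - 0.4 * trust * left_right + rng.normal(size=n))

# design matrix: intercept, main effects, and the cross-product term
X = np.column_stack([np.ones(n), trust, left_right, trust * left_right])
beta, *_ = np.linalg.lstsq(X, risk, rcond=None)
# beta[3] estimates the interaction: how trust's effect shifts with ideology
```

A negative beta[3] here means the marginal effect of trust shrinks as respondents move right, which is the shape of the pattern in the figure.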

Here’s what the interaction looks like:


Geez!  Higher trust promotes greater risk concern for left-leaning respondents but has essentially no effect whatsoever on right-leaning ones.

What to say?...

Well, one possibility that occurs to me is based on biased perceptions of scientific consensus. Experimental data suggest that ordinary persons of diverse outlooks are more likely to notice, assign significance to, and recall instances in which a scientist took the position consistent with their cultural group's than ones in which a scientist took the opposing position. As a result, people end up with mental inventories of expert opinion skewed toward the position that predominates in their group. If that's how they perceive the weight of expert opinion, why would they distrust scientists?

But I dunno. This is just post hoc speculation.

Tell me what you think the answer is – and better still, how one could design an experiment to test your favored conjecture against whatever you think the second most likely answer is.

Wednesday
May242017

New paper: Misperceptions, Misinformation & the Logic of Identity Protective Cognition

Paper in draft; comments welcome!


Tuesday
May232017

Asymmetry thesis--now we're going to need a meta-meta-analysis

Check out the dueling meta-analyses of "asymmetry thesis" studies!

I'll tell you what I think "tomorrow"™, but in the meantime, why not tell me what you think "today"™?




Monday
May082017

Are Republicans and Democrats more divided on, or each more supportive of, federal spending on science? Both, according to Pew Research Center

As the 14 billion readers of this blog are aware, I’ve been culling science-attitude data from the GSS for the last few weeks.  The gist of it is that there’s not a whole lot of difference between the views of politically diverse citizens.

Displaying impeccable timing, on May 1, Pew Research Center released some interesting  data (as their data always are) on support for “increased” federal spending on science that seems to contravene that conclusion.  Under the headline “Democrats far more supportive than Republicans of federal spending for scientific research,” they report a “wide and growing partisan gap . . . over how much government should spend for scientific research.”

The question has a counterpart in the GSS.  While the most recent GSS data is 2016, in the period in which the two surveys—GSS’s and Pew’s—overlap, the former has always suggested much less of a gap in partisan views.



This goes to show how much of a difference subtle language choices can make in responses to survey items, and cautions against relying overmuch on any single measure when trying to assess attitudes. The better approach is to explore larger groups of items that get at the same thing & see if they form a scale, at which point covariances can supply a more reliable yardstick of who feels what way and why.

There are two more interesting things (at least) about Pew’s data.

One is that both Republicans and Democrats supported spending more and opposed spending less for science in the Pew 2017  data relative to their positions in the last 8 yrs. (the only period for which Pew has reported data for both responses). If there were reason to think these kinds of sentiments have any influence on Congress (there’s not much), this would be good news.

The other is that the widened gap between Republicans and Democrats is actually attributable not to a decline in Republican support for more science funding—again, in Pew’s 2017 data, Republicans are more supportive than previously—but to a huge 14-percentage-point jump in the proportion of Democrats who support more spending.

Why Democratic support increased so dramatically merits more study in itself.

But in any case, relative to previous yrs, the Pew govt-spending data are consistent with the inference that there is more “pro-science” sentiment all  around in 2017 than previously. 

That’s pretty interesting. 

What do you think?

Thursday
May042017

Beware of reacting too fast to the "TOOFAST" item in the GSS

From something I'm working on . . .

c. Authority of science. Some GSS items are tailor-made for detecting conflict over the authority of science—our third science attitude. The one that has been asked the most consistently seeks respondents’ agreement or disagreement with the statement that “one trouble with science is that it makes our way of life change too fast” (TOOFAST). The volatile upticks and downticks in this item have been duly reported in alternately positive and negative ways in the NSF Indicators. Thus, pointing to “a substantial drop” in affirmative responses in the 2012 GSS, the 2014 Indicators (p. 7-28) reported with evident relief that “fewer Americans said they were worried about the pace of change.” Yet two years later, the NSF lamented that “Americans increasingly worry that science is making life ‘change too fast.’” “About half of Americans,” the Indicators advised, “expressed this view in 2014, up from about one-third in 2004” (2016, p. 7-4).

On closer inspection, though, there doesn’t seem to be anything about responses to TOOFAST—in whatever direction they move—that should arouse concern about the breadth of respect for the authority of science. A simple zero-order correlation, for example, confirms that at every level of this four-point agree-disagree item, study participants have positive expectations about the future benefits that science will confer on society (Figure 14). Indeed, respondents at every level of “TOOFAST” support the funding of science regardless of whether doing so confers “immediate benefits.”

“TOOFAST” might be measuring something.  But it is not measuring an attitude that reflects ambivalence toward the authority of science. 

Wednesday
May032017

"Where is everybody?" The missing "distrust of science" measures

From something I'm working on . . . .

4.1. “Where is everybody?” 

We adopted a critical stance in § 3 on existing measures of generalized science attitudes. We can think of two possible explanations for the absence of evidence more supportive of the view that general attitudes toward science are responsible for particular DRS [decision-relevant science] controversies. One is that there just isn’t any substantial variation in the sorts of attitudes we have been describing, at least in the liberal democratic societies that feature public conflict over science issues.

If a disposition is relatively uniform across the population, it won’t be possible, psychometrically, to form scales to measure it (Tinsley & Brown 2000). Items that admittedly do measure it won’t covary—because they won’t vary. Accordingly, it will be impossible even to find items that one can be confident are measuring the disposition, much less find multiple ones to combine into a scale.
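The variance point can be made concrete in a couple of lines. A toy sketch (all data invented):

```python
import numpy as np

# two items "measuring" a disposition everyone shares to the same degree:
# with no variance there is nothing to covary, so no scale can be built
item_a = np.full(500, 4.0)   # every respondent answers "agree"
item_b = np.full(500, 4.0)
cov_uniform = np.cov(item_a, item_b)[0, 1]   # exactly zero

# by contrast, items tapping a disposition that actually varies do covary
rng = np.random.default_rng(2)
latent = rng.normal(size=500)
item_c = latent + rng.normal(scale=0.5, size=500)
item_d = latent + rng.normal(scale=0.5, size=500)
cov_varying = np.cov(item_c, item_d)[0, 1]   # roughly the latent variance
```

Reliability coefficients inherit the same problem: with zero item variance, the α formula involves dividing by a zero total-score variance, so it is simply undefined.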

Is it plausible to think there is this degree of uniformity in “science attitudes” of the sort we identified in the second section? Looking around, we see very little evidence of any meaningful ambivalence toward the authority of science as a way of knowing.  Indeed, we suspect that most people in the US would be hard pressed at this point to even imagine what it would look like to live in a manner that didn’t treat science as authoritative over the kinds of matters to which it claims to speak. To be sure, there are grumblings about the performance of the institutions of science, but people—acting in their own capacity and through their democratically accountable agents—continue to support funding those responsible today for producing science. They do that because they think that the information is valuable for solving their problems: as we said, trust in science for decision making and trust of science institutions are linked.

But the question isn’t strictly how plausible it is that there is a uniformly high level of the various science attitudes we described in § 3. It is instead how much more plausible this conclusion is than the only other explanation we can think of for the absence of measures that detect meaningful levels of variance: that scholars of public attitudes toward science just haven’t realized that the “science attitude” measures they are working with are inadequate, or have been too preoccupied answering related questions to identify better ones.

We think that explanation is improbable.  There are too many smart and highly productive researchers in this field.

Enrico Fermi’s famous Bayesian “proof” against intelligent forms of extraterrestrial life (Gleiser 2016) applies at least as forcefully to the existence of meaningful forms of variance in dispositional trust in science, institutional trust of science, and acceptance of the authority of science: if sources of variance in these dispositions existed, someone would have found them by now.

Refs

Gleiser, M. The Simple Beauty of the Unexpected: A Natural Philosopher's Quest for Trout and the Meaning of Everything (University Press of New England, 2016).

Tinsley, H.E.A. & Brown, S.D. Handbook of Applied Multivariate Statistics and Mathematical Modeling (Elsevier Science, 2000).

 

Tuesday
May022017

More GSS data on "anti-science" phantom

Conservative citizens are less likely than liberal ones to believe that humans are causing global warming.

Religiously inclined citizens are less inclined to believe human beings evolved from another species of animal.

I get how one might hypothesize that these results are a consequence of an “anti-science” attitude on the part of individuals so defined.  Some more generalized ambivalence or even hostility to science and/or scientists on the part of these citizens, the argument goes, causes the more specific forms of nonacceptance of scientific evidence relating to these issues.

The problem is that when one looks in the places where one would expect to see the more generalized anti-science attitude, it ain’t there.

I’ve already described how both religious and conservative individuals have a high degree of “institutional confidence” in the “scientific community,” a standard General Social Survey item.

Well, if you look at the more specific “science attitude” items in the GSS, one sees the same thing: more religious citizens and more conservative ones both have pro-science attitudes.  

I pointed out a couple days ago that religious and conservative citizens, just like secular and liberal ones, credit science for making our lives better.

Now consider this:

One can always save the conservative & religious “anti-science” claim by simply treating skepticism about climate change and disbelief in human evolution as being anti-science.

But at that point the claim becomes a (boring) tautology.

Once one equates being anti-science with these positions, “they’re anti-science” is no longer an explanation for why religious and conservative citizens hold these positions—stances that are all the more peculiar once one sees that these citizens, like ones who do believe in climate change and evolution, have generally supportive attitudes toward scientists and scientific research.

Who believes what and why on these issues is an interesting question. But here as elsewhere “anti-science” is a mental roadblock to answering it in a scientific way.

Saturday
Apr292017

Oxford Handbook on Science of Science Communication: Preorder this now, before it sells out!

 

At $160.00, this collection is actually much cheaper than most books of its genre. Plus it contains more insight.  How could you go wrong, then, in buying it or buying multiple copies even?

Friday
Apr282017

Nature Climate Change commentary: out of the lab & into the field

Tuesday
Apr252017

Are scientists unlikely to be religious persons?! One of the weirdest survey results I've ever seen

That a majority of the public (59%) disagrees with this item shocks me. I would have bet that at most 25% would disagree with this statement; I also would have predicted that religiously inclined people would be much more likely to agree with it.

Can someone explain--and in a way that can be tested (i.e., no just-so stories that evade corroboration)?

Sunday
Apr232017

A token (or 2) of the Liberal Republic of Science

In honor of the march:

Saturday
Apr222017

Weekend update: a 10-yr reassessment of "expressive overdetermination"

From Kahan, D.M. The Cognitively Illiberal State. Stan. L. Rev. 60, 115-154 (2007). For sure, I still would define the problem this way. But I'm less sure the solution of "expressive overdetermination" makes sense. It's out of keeping, I think, with SE Fla. political climate science and with cognitive dualism. But maybe the point is that there are more solutions--or potential solutions--than just one...

Conclusion

 The nature of political conflict in our society is deeply paradoxical. Despite our unprecedented knowledge of the workings of the natural and social world, we remain bitterly divided over the dangers we face and the efficacy of policies for abating them.

The basis of our disagreement, moreover, is not differences in our material interests (that would make perfect sense) but divergences in our cultural worldviews. By virtue of the moderating effects of liberal market institutions, we no longer organize ourselves into sectarian factions for the purpose of imposing our opposing visions of the good on one another. Yet when we deliberate over how to secure our collective secular ends, we end up split along exactly those lines.

The explanation, I’ve argued, is the phenomenon of cultural cognition. Individual access to collective knowledge depends just as much today as it ever did on cultural cues. As a result, even as we become increasingly committed to confining law to attainment of goods accessible to persons of morally diverse persuasions, we remain prone to cultural polarization over the means of doing so. Indeed, the prospect of agreement on the consequences of law has diminished, not grown, with advancement in collective knowledge, precisely because we enjoy an unprecedented degree of cultural pluralism and hence an unprecedented number of competing cultural certifiers of truth.

If there’s a way to mitigate this condition of cognitive illiberalism, it is by reforming our political discourse. Liberal discourse norms enjoin us to suppress reference to partisan visions of the good when we engage in political advocacy. But this injunction does little to mitigate illiberal forms of status competition: because what we believe reflects who we are (culturally speaking), citizens readily perceive even value-denuded instrumental justifications for law as partisan affirmations of certain worldviews over others.

Rather than implausibly deny our cultural partiality, we should embrace it. The norm of expressive overdetermination would oblige political actors not just to seek affirmation of their worldviews in law, but to cooperate in forming policies that allow persons of opposing worldviews to do so at the same time. Under these circumstances, citizens of diverse cultural orientations are more likely to agree on the facts—and to get them right—because expressive overdetermination erases the status threats that make individuals resist accurate information. But even more importantly, participation in the framing of policies that bear diverse meanings can be expected to excite self-reinforcing, reciprocal motivations that make a culture of political pluralism sustainable.

Ought, it is said, implies can. Contrary to the central injunction of liberalism, we cannot, as a cognitive matter, justify laws on grounds that are genuinely free of our attachments to competing understandings of the good life. But through a more sophisticated understanding of social psychology, it remains possible to construct a form of political discourse that conveys genuine respect for our cultural diversity.

Friday
Apr212017

The challenge of doing science journalism in a polluted science communication environment

 

Boy, this is a tough one.

It's not hard to see how linking Zika to climate change risks infecting the former with the polarizing virus carried by the latter. Not hard, either, to model such an effect in the lab (Kahan, Jamieson, Landrum & Winneg 2017).

On the other hand, if this piece is conveying the truth about the health hazards being created or magnified by climate change, isn't such reporting essential?

I guess I have two reactions.

First, highlighting Gore is not a good idea.  He brands as a partisan issue anything he gets involved with.

Second, the most important thing is that science journalists engage in shared critical reflection on dilemmas of this kind. Such reflection attests to and helps inculcate a professional norm, one that assures journalists exercise their judgment in a manner sensitive to the impact of their craft on the science communication environment.

That sort of norm, and the quality of deliberation it promotes, were clearly on display in the science community's debate about the effect of their upcoming march on Washington.

The importance of having a collective discussion like that, on all the occasions that warrant it, might turn out to be the most valuable lesson of that event.

So what do you think?


 

Thursday
Apr202017

UMass SES program: a new science of science communication for a world itself quite new (lecture summary & slides)

Gave a lecture yesterday at UMass Amherst to mark the launch of the University’s School of Earth & Sustainability program. Members of the audience asked fantastic questions, leaving me once again regretful that I had not spoken for a shorter period of time in order to make room for more audience reactions.

My message was that the SES program is a model—one of many, but many are needed to build a knowledge base—of how to combine the study of decision-relevant science with the study of science communication. Doing so is essential to assure that the value of the former is recognized by the public and, in particular, not annihilated by knowledge-enervating forms of group status competition.

What causes conflict over decision-relevant science, I argued, is a polluted science communication environment. Devising means of protecting that environment and repairing it when protective measures fail should be one of the primary goals of the science of science communication. 

UMass’s  School of Earth and Sustainability is commendably modeling that understanding, and we can all learn a lot from—and be inspired by--what they are doing.

The expositional strategy I used to guide the audience into critical engagement with this thesis consisted in setting up & knocking down popular misconceptions about the source of public conflict over science, including deficits in public science comprehension;  creeping anti-science attitudes in American society; and orchestrated misinformation.  

Throughout the presentation I also took aim at the asymmetry thesis, which posits that the incidence of identity-protective cognition is disproportionately concentrated on the right in American society. I’ll have more to say about that “tomorrow,”™ when I give my reactions to a pair of newly released opposing meta-analyses on this topic, one by Jon Jost & another by Peter Ditto & collaborators.

Slides here.

Wednesday
Apr192017

Where am I? version 502

Two events this week:

 

Tuesday
Apr182017

Last session in Science of Science Communication 2017

Not usually where we end, but frolics & detours along the way were worthwhile

Monday
Apr172017

Another genuinely informative study of consensus messaging