
Monday, April 28, 2014

Science and public policy: Who distrusts whom about what?

More or less what I said at a really great NSF-sponsored "trust" workshop at the University of Nebraska this weekend. Slides here.

1.  What public distrust of science?

I want to address the relationship of trust to the science communication problem.

As I use the term, “the science communication problem” refers to the failure of valid, compelling, and widely accessible scientific evidence to dispel persistent cultural conflict over risks or other policy-relevant facts to which that evidence directly speaks. 

The climate change debate is the most spectacular current example, but it is not the only instance of the science communication problem. Historically, public controversy over the safety of nuclear power fit this description. Another contemporary example is the political dispute over the risks and benefits of the HPV vaccine.

Distrust of science is a common explanation for the science communication problem. The authority of science, it is asserted, is in decline, particularly among individuals of a relatively “conservative” political outlook.

This is an empirical claim.  What evidence is there for believing that the public trusts scientists or scientific knowledge less today than it once did? 

The NSF, which is sponsoring this very informative conference, has been compiling evidence on public attitudes toward science for quite some time as part of its annual Science Indicators series.

One measure of how the public regards science is its expressed support for federal funding of scientific research.  In 1985, the public supported federal science funding by a margin of about 80% to 20%. Today the margin is the same, as it was at every point between then and now.

Back in 1981, the proportion of the public who thought that the government was spending too little to support scientific research outnumbered the proportion who thought that the government was spending too much by a margin of 3:2. 

Today around four times as many people say the government is spending too little on scientific research than say it is spending too much.

Yes, there is mounting congressional resistance to funding science in the U.S.--but that's not because of any creeping "anti-science" sensibility in the U.S. public. 

Still aren't sure about that?

Well, how would you feel if your child told you he or she was marrying a scientist? About 70% of the public in 1983 said that would make them happy.  The proportion who said that grew to 80% by 2001, and grew another 5% or so in the last decade.

Are “scientists … helping to solve challenging problems”? Are they “dedicated people who work for the good of humanity”?

About 90% of Americans say yes.

Do you think you can squeeze the 75% of Republicans who say they "don't believe in human-caused climate change" into the remainder? Better double-check your math.
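The arithmetic behind that back-of-the-envelope check can be sketched as follows. The 75% and 90% figures come from the post; the Republican share of the public is an illustrative assumption, not a figure from the post:

```python
# Can the Republicans who reject human-caused climate change all fit
# inside the slice of the public that does NOT praise scientists?
# The republican_share value is an illustrative assumption.

republican_share = 0.40      # assumed fraction of the public identifying as Republican
climate_skeptic_rate = 0.75  # fraction of Republicans rejecting human-caused climate change (from the post)
pro_science_rate = 0.90      # fraction of the public praising scientists (from the post)

skeptics = republican_share * climate_skeptic_rate  # share of the whole public
remainder = 1 - pro_science_rate                    # share not endorsing scientists

print(f"Climate-skeptical Republicans: {skeptics:.0%} of the public")
print(f"Non-endorsers of scientists:   {remainder:.0%} of the public")
print("Fits in the remainder?", skeptics <= remainder)
```

Even on generous assumptions, roughly 30% of the public cannot be drawn from a 10% remainder, so most climate-skeptical Republicans must be among those who express high regard for scientists.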

In sum, there isn’t any evidence that creeping distrust in science explains the science communication problem, because there’s no evidence either that Americans don’t trust scientists or that fewer of them trust them now than in the past.

Of course, if you like, you can treat the science communication problem itself as proof of such distrust.  Necessarily, you might say, the public distrusts scientists if members of the public are in conflict over matters on which scientists aren’t.

But then the “public distrust in science” explanation becomes analytic rather than empirical.  It becomes, in other words, not an explanation for the science communication problem but a restatement of it.

If we want to identify the source of the science communication problem, simply defining the problem as a form of “public distrust” in science—on top of being a weird thing to do, given the abundant evidence that the American public reveres science and scientists—necessarily fails to tell us what we are interested in figuring out, and confuses a lot of people who want to make things better.

2. The impact of cultural distrust on perceptions of what scientists believe

So rather than define the science communication problem as evincing “public distrust in science,” I’m going to offer an evidence-based assessment of its cause.

A premise of this explanation, in fact, is that the public does trust science.

As reflected in the sorts of attitudinal items in the NSF indicators and other sources, members of the public in the U.S. overwhelmingly recognize the authority of science and agree that individual and collective decisionmaking should be informed by the best available scientific evidence.

But diverse members of the public, I’ll argue, distrust one another when they perceive that the status of the cultural groups they belong to is being adjudicated by the state’s adoption of a policy or law premised on a disputed risk or comparable fact.

When risks and other facts that admit of scientific investigation become the focus of cultural status competition, members of opposing groups will be unconsciously motivated to construe all manner of evidence in a manner that reinforces their commitment to the positions that predominate within their respective groups.

One source of evidence—indeed, the most important one—will be the weight of opinion among expert scientists.

As a result, culturally diverse people, all of whom trust scientists but who distrust one another’s intentions on policy issues that have come to symbolize clashing worldviews, will end up culturally polarized over what scientists believe about the factual presuppositions of each other's position.

That is the science communication problem.

I will present evidence from two (NSF-funded!) studies that support this account.

3.  Cultural cognition of scientific consensus

The first was an experiment on how cultural cognition influences perceptions of scientific consensus on climate change, nuclear waste disposal, and the effect of “concealed carry” laws.

The cultural cognition thesis holds that individuals can be expected to form perceptions of risk and like facts that reflect and reinforce their commitment to identity-defining affinity groups.

For the most part, individuals have a bigger stake in forming identity-congruent beliefs on societal risks than they have in forming best-evidence-congruent ones. If a person makes a mistake about the best evidence on climate change, for example, that won’t affect the risk that that individual or anyone he or she cares about faces: as a solitary individual, that person’s behavior (as consumer, voter, etc.) is too inconsequential to have an impact.

But if that person makes a “mistake” in relation to the view that dominates in his or her affinity group, the consequences could be quite dire indeed.  Given what climate change beliefs now signify about one’s group membership and loyalties, someone who forms a culturally nonconforming view risks estrangement from those on whose good opinion that person’s welfare—material and emotional—depends.

It is perfectly rational, in these circumstances, for individuals to engage information in a manner that more reliably connects their beliefs to their cultural identities than to the best scientific evidence. Indeed, experimental evidence suggests that the more proficient a person’s critical reasoning capacities, the more successful he or she will be in fitting all manner of evidence to the position that expresses his or her group identity.

What most scientists in a particular field believe is one such form of evidence.  So we hypothesized that culturally diverse individuals would construe evidence of what experts believe in a biased fashion supportive of the position that predominates in their respective groups.

In the experiment, we showed study subjects the pictures and resumes of three highly credentialed scientists and asked whether they were “experts” (as one could reasonably have inferred from their training and academic posts) in the domains of climate change, nuclear power, and gun control.

Half the subjects were shown a book excerpt in which the featured scientist took the “high risk” position on the relevant issue (“scientific consensus that humans are causing climate change”; “deep geologic isolation of nuclear wastes is extremely hazardous”; “permitting citizens to carry concealed guns in public increases crime”), and half a book excerpt in which the same scientist took the “low risk” position (“evidence on climate change inconclusive”; “deep geologic isolation of nuclear wastes poses no serious hazards”; “allowing citizens to carry concealed guns reduces crime”).

If the featured scientist’s view matched the one dominant in a subject’s cultural group, the subject was highly likely to deem that scientist an “expert” whose views a reasonable citizen would take into account.

But if that same scientist was depicted as taking the position contrary to the one dominant in the subject’s group, then the subject was highly likely to perceive that the scientist lacked expertise on the matter in question.

This result was consistent with our hypotheses.

If individuals in the real world selectively credit or discredit evidence on “what experts believe” in this manner, then individuals of diverse cultural outlooks will end up polarized on what scientific consensus is.

And this is exactly the case.  In an observational component of the study, we found that the vast majority of subjects perceived “scientific consensus” to be consistent with the position that was dominant among members of their respective cultural groups.

Judged in relation to National Academy of Sciences “expert consensus” reports, moreover, all of the opposing cultural groups turned out to be equally bad in discerning what the weight of scientific opinion was across these three issues.

In sum, they all agreed that policy should be informed by the weight of expert scientific opinion. 

But because the policies in question turned on disputed facts symbolically associated with membership in opposing groups, they were motivated by identity-protective cognition to assess evidence of what scientists believe in a biased fashion.

4.  The cultural credibility heuristic

The second study involved perceptions of the risks and benefits of the HPV vaccine.

The CDC’s 2006 recommendation that the vaccine be added to the schedule of immunizations required as a condition of middle school enrollment, although only for girls, provoked intense political controversy across the U.S. in the years immediately thereafter.

In our study, we found that there was very mild cultural polarization over the safety of the HPV vaccine among subjects whose views were solicited in a survey.

The degree of cultural polarization was substantially more pronounced, however, among subjects who were first supplied with balanced information on the vaccine’s potential risks and expected benefits.  Consistent with the cultural cognition thesis, the subjects were selectively crediting and discrediting the information we supplied in patterns that reflected their stake in forming identity-supportive beliefs.

But still another group of subjects assessed the risks and benefits of the HPV vaccine after being furnished the same information by debating “public health experts.” These “experts” were ones whose appearances and backgrounds, a separate pretest had shown, would induce study subjects to attribute competing cultural identities to them.

In this experimental condition, subjects’ assessments of the risks and benefits of the HPV vaccine turned decisively on the degree of affinity between the perceived cultural identities of the experts and the study subjects’ own identities.

If subjects observed the position that they were culturally predisposed to accept being advanced by the “expert” they were likely to perceive as having values akin to theirs, and the position they were predisposed to reject being advanced by the “expert” they were likely to perceive as having values alien to their own, then polarization was amplified all the more.

But where subjects saw the expert they were likely to perceive as sharing their values advancing the position they were predisposed to reject, and the expert they were likely to perceive as holding alien values advancing the position they were predisposed to accept, subjects of diverse cultural identities flipped positions entirely.

The subjects, then, trusted the scientific experts.

Indeed, polarization disappeared when experts whom culturally diverse subjects trusted told them the position they were predisposed to accept was wrong.

But the subjects remained predisposed to construe information in a manner protective of their cultural identities.

As a result, when they were furnished tacit cues that opposing positions on the HPV vaccine risks corresponded to membership in competing cultural groups, they credited the expert whose values they tacitly perceived as closest to their own—a result that intensified polarization when subjects' predispositions were reinforced by those cues.

5.  A prescription

The practical upshot of these studies is straightforward.

To translate public trust in science into convergence on science-informed policy, it is essential to protect decision-relevant science from entanglement in culturally antagonistic meanings.

No risk issue is necessarily constrained to take on such meanings.

There was nothing inevitable, for example, about the HPV vaccine becoming a focus of cultural status conflict.  It could easily, instead, have been assimilated uneventfully into public health practice in the same manner as the HBV vaccine.  Like the HPV vaccine, the HBV vaccine immunizes recipients against a sexually transmitted disease (hepatitis B), was recommended for universal adolescent vaccination by the CDC, and thereafter was added to the school-enrollment schedules of nearly every state.

The HBV vaccine had uptake rates of over 90% during the years in which the safety of the HPV vaccine was a matter of intense, and intensely polarizing, political controversy in the U.S.

The reason HPV ended up becoming suffused with antagonistic cultural meanings had to do with ill-advised decisions, pushed for by the vaccine’s manufacturer and acquiesced in without protest by the FDA, that made it certain that members of the public would learn about the vaccine for the first time not from their pediatricians, as they had with the HBV vaccine, but from news reports on the controversy occasioned by a high-profile, nationwide campaign to secure legislative enactments of a “girls’ only STD shot” as a condition of school enrollment.

The risks associated with introducing the HPV vaccine in this manner were not only foreseeable but foreseen and even empirically studied at the time.

Warnings about this danger were not so much rejected as never considered—because there is no mechanism in place in the regulatory process for assessing how science-informed policymaking interacts with cultural meanings.

The U.S. is a pro-science culture to its core.

But it lacks a commitment to evidence-based methods and procedures for assuring that what is known to science becomes known to those whose decisions, individual and collective, it can profitably inform.

The “declining trust in science” trope is itself a manifestation of our evidence-free science communication culture.

Those who want to solve the science communication problem should resist this & all the other just-so stories that are offered as explanations of it.

They should also steer clear of those drawn to the playground-quality political discourse that features competing tallies of whose “side” is “more anti-science.”

And they should instead devote their energies to the development of a new political science of science communication that reflects an appropriately evidence-based orientation toward the challenge of enabling the members of a pluralistic liberal society to reliably recognize what’s known by science.


Reader Comments (19)

"In sum, there isn’t any evidence that creeping distrust in science explains the science communication problem, because there’s no evidence either that Americans don’t trust scientists or that fewer of them trust them now than in the past."

So how do you reconcile these data?

http://www.asanet.org/images/journals/docs/pdf/asr/Apr12ASRFeature.pdf

This study explores time trends in public trust in science in the United States from 1974 to 2010. More precisely, I test Mooney’s (2005) claim that conservatives in the United States have become increasingly distrustful of science. Using data from the 1974 to 2010 General Social Survey, I examine group differences in trust in science and group-specific change in these attitudes over time. Results show that group differences in trust in science are largely stable over the period, except for respondents identifying as conservative. Conservatives began the period with the highest trust in science, relative to liberals and moderates, and ended the period with the lowest. The patterns for science are also unique when compared to public trust in other secular institutions. Results show enduring differences in trust in science by social class, ethnicity, gender, church attendance, and region. I explore the implications of these findings, specifically, the potential for political divisions to emerge over the cultural authority of science and the social role of experts in the formation of public policy.

I don't think that creeping distrust in science explains the "science communication problem" (whatever that is), but I do think that trust in science, and trends in trust in science, are correlated with political ideology and other cultural group orientations (such as religiosity).

Now what people express to pollsters about their "trust in science" is obviously subject to the biases found in all self-report data. When a conservative says that s/he has less trust in science than s/he used to, what does that mean? Does it mean that they no longer think that their GPS will work, or that they are less likely to take the medication prescribed by their doctor? Of course not. But the poll data on "trust in science," and the differential patterns associated with political orientation, are telling us something.

And I still don't get why you isolate science communication from communication of evidence related to any number of culturally divisive issues? You never answered my question from a previous thread:

..."What differentiates evidence-based science communication from any other brand of evidence-based communication - say evidence-based communication about gun control or the merits/demerits of Obamacare? Why do you single out evidence-based science communication? Merely because that is your point of interest, or because you believe it is somehow a different species than other domains of evidence-based communication?...

April 28, 2014 | Unregistered CommenterJoshua

@Joshua:

I've commented on that study before.

It is interesting but it doesn't support any clear inference about public attitudes toward science or political variance in them.

My own guess is that the minor trend reported in the item, which asks how much confidence rspts have in those "running" various "institutions" including the "scientific community," was picking up on recent partisan divisions over climate change.

But because it's hard to be confident what any particular survey item is really measuring, it's not a good idea to pick a single item from a related bunch & treat it as conveying some particular significance that evades the others.

If the trend on this item supported the view that there is "declining trust" in science among Republicans, one would certainly expect to see the same trends reflected in all these other items, most of which come from the same GSS surveys.

It's also not a good idea to try to draw an inference from a correlation w/o looking to see if that inference is consistent with what can be seen in the raw data.

Take a look at the 2014 NSF Science Indicators results for the item you are focusing on. They show what the item always has: that those "running" the "institution" of the "scientific community" enjoy more public "confidence" than those "running" any other institution except the military, whom they don't trail by much. I seriously considered putting that very graph in to illustrate my point (although I don't like NSF's yucky horizontal bar charts).

In any case, variance in the tiny fraction of the population that expresses *low* confidence in those "running" the "scientific community" is not a plausible explanation for highly polarized disagreements on scientific issues--ones in which those on both sides emphatically assert that their position is the one consistent with scientific consensus!

I like the paper you cite & I really admire the probing & thoughtful style of the author of it, Gordon Gauchat, who has done a lot of cool work on public attitudes toward science.

But as I've stated, the interesting thing Gauchat is measuring in this study doesn't evidence growing "distrust in science" among republicans or anyone else, much less an explanation for highly polarized disputes like the one over climate change, HPV, nuclear, etc .

April 28, 2014 | Registered CommenterDan Kahan

"My own guess is that the minor trend reported in the item, which asks how much confidence rspts have in those "running" various "institutions" including the "scientific community," was picking up on recent partisan divisions over climate change."


Don't get me wrong. I have been arguing quite often against a commonly found meme at "skeptical" websites, which runs something like this: "Public trust in science and scientists is in crisis, and the reason for that crisis is that activist scientists have been stoking 'scare stories' about climate change which, over time, are obviously over-hyped. Oh, and climategate also has eroded trust in science. Oh, and of course, my own personal views have nothing to do with my argument about the cause-and-effect behind public trust in science, and the fact that I'm a 'skeptic' is purely coincidental to my conclusions. It's not like I'm projecting my own views that activist scientists are over-hyping the dangers of climate change."

In particular, in an entirely un-skeptical fashion, they often point to this poll,

http://wattsupwiththat.com/2011/08/03/rasmussen-poll-69-say-it%E2%80%99s-likely-scientists-have-falsified-global-warming-research/

with the mistaken belief that somehow cross-sectional data can paint a longitudinal picture, and that citing polls about attitudes towards scientists tells you much if you haven't compared them to attitudes towards plumbers or priests or engineers or climate "skeptics."

I can't tell you how often I run into that argument made by "skeptics." If I'm not mistaken, our very own NiV is an adherent.

Now in those arguments I've had, I have often read it said that climate change is what caused a drop in "trust in science/scientists," but I don't buy it as reflective of a larger trend because of how I interpret the mechanisms of motivated reasoning - which is that group identification drives views on a variety of issues (and people filter information based on their pre-existing orientation).

In contrast to you, I see "climate change is what caused a change in attitudes towards science" as the least plausible explanation of the three that you speculated about in your other post.

In contrast, you wrote:

"The final possible explanation for the linked trends (or the final one I can think of right now) is that the GSS item measures a genuine and growing distrust of scientists among conservatives by conservatives and that growing distrust is itself what caused conservatives to become distrustful of climate change science in the mid to late 1990s. ...That strikes me as the least plausible explanation, actually. Why did conservatives just happen to get distrustful of scientists at that very moment? "

It's not an "at that very moment" trend - and it coincides with considerable trends across society more generally - specifically a deliberate initiative from conservatives to increase the influence of the "religious right" (and associated views about the science related to evolution, stem cell research, etc.) and with the mainstreaming of "drowning government in a bathtub" and "those damn liberal colleges are brainwashing our kids" ideology.


You say this:

"Indeed, Gauchat's study would have lent more support to the hypothesis that some dispositional distrust of science is the cause of conservative resistance to climate-change science if he had found that conservatives distrusted scientists well before evidence of climate change started to accumulate."

The trend does become clear before politicized polarization about climate change became as prominent as it is today, and as I said above, it does coincide with other socio-political trends.


My intuition tells me that there has been a drop over time in self-reported "trust in science" among conservatives in the sense that it reflects a trend of increased distrust in the educational institutions that train scientists, the governmental institutions that fund science research, the academic institutions that hire many scientists, and the governmental institutions that develop policy on the basis of evidence produced by scientists.

"In any case, variance in the tiny fraction of the population that expresses *low* confidence in those "running" the "scientific community" is not a plausible explanation for highly polarized disagreements on scientific issues--ones in which those on both sides emphatically assert that their position is the one consistent with scientific consensus!"

Seems to me that you are conflating issues. It is possible to see that there is a downward trend in self-reported "trust in science" among conservatives without offering that as an explanation for belief among conservatives about climate change. They could both be manifestations of the same causal mechanisms; religious and political identifications are the causal mechanism for both outcomes. Again, I am assuming here that "trust in science" is not exactly "trust in science," but a disassociation with the institutions commonly associated with science and with mainstream science on issues that overlap with religious views.

April 28, 2014 | Unregistered CommenterJoshua

@Joshua:

Best to test intuitions w/ evidence rather than construe ambiguous evidence (like the very modest shift in proportion of R's who shifted from "a great deal" to "some" confidence on this one awkward item) to fit intuition.

Don't you think?

I show you lots of trend data that one wouldn't expect to see if you were right about a "trend" in R views toward "science" generally.

I show you experiments that support an account of how there could be polarization over science-informed policy issues in the midst of universal "trust" in scientists that all sorts of surveys & things in our daily life show us exist.

I remind you that R's say that they believe their position on climate change is consistent w/ scientific consensus (& dems that theirs on guns & nuclear is).

You just ignore all that & dig in behind this 1 silly, ambiguous item, variation in which is too laughably tiny to explain anything (& which shows in fact "those who run" science are massively more trusted than those "who run" religion!) & say it must mean what your intuition tells you: Republicans are hostile to science....

(BTW, you realize that < 7% of rspts over the entire period covered in Gauchat's study selected the negative "hardly any" response? Also that "moderates" have consistently been the group with "lowest" confidence on this measure? That doesn't make sense if the item means what you think either.)

Yes, R's reject evidence on climate change. But if you treat that as "evidence" in support of the "growing distrust" thesis, you are, as I said in the post, turning what was a hypothesis into a definition.

If you want to propose "distrust in institutions" -- fine. No recognizable group distrusts the "institution" of science -- just look at the % of "no confidence" respondents in the very item you are relying on.

But various recognizable groups of citizens distrust various other "institutions" -- like industry, govt, etc. So in some context in which "confidence in scientists" equates to accepting the stance of one of those institutions on an issue that is freighted with culturally antagonistic meanings, you see lots of pushback against a particular scientific position.

If you want to see how "distrusting" of university scientists dems can get when the "institution" they distrust is in alignment with scientific evidence, talk to dems about fracking. Of course, b/c dems, like everyone else, revere "scientists," they'll insist that the ones they disagree w/ are not representing the "consensus view." So calling them "anti-science" b/c they see things as they do on fracking is just mindless.

Don't feed the ravenous appetite of those who (whatever their identity) are motivated to think that everyone who disagrees with them is stupid & "anti-science."

That view is wrong. And it both reflects & reinforces the problems we are facing in our polluted science communication environment.

April 28, 2014 | Unregistered Commenterdmk38

"I can't tell you how often I run into that argument made by "skeptics." If I'm not mistaken, our very own NiV is an adherent."

Hmm. I've certainly argued that scientists not being careful to check what they're saying *risks* the credibility of science. I'm not sure if I've argued that it has actually reduced it. Although as Dan says, I expect it depends on the context in which you ask the question. I think that if you say "scientist" without any particular context, people tend to think of science generally - physics, chemistry, astronomy, biology, and so on. Most science is not as pathological as climate science, and scientists generally do get and deserve a lot of trust.

But if you ask the question in the context of those scientist-activists who generate scare stories, push quack treatments, and support political causes, I think you'll find a recognition that not all scientists are the same, and that one shouldn't necessarily trust a claim just because it's a scientist saying it. The excesses of climate alarmism certainly don't help with that.

Actually, I don't think the public distrusting scientists is any bad thing. What would concern me is people distrusting science.

Incidentally, looking at Dan's postscript bar chart, I read that as saying only slightly over 40% have "a great deal" of confidence in the people running science, which means slightly under 60% don't. "Some" could mean 'quite a lot' or 'barely any'. It's hardly a ringing endorsement, is it? I mean, the fact that they come out ahead of supreme court judges (who I keep reading are selected for their political biases), professional priests (uh huh...), company executives (is there any group of people more vilified?), and, good lord, the bankers (Oh. Yeah. So there is.) ... is it really anything to write home about? You're pleased that you're more trusted than the bankers?!

I think, probably, if you asked how many people had a great deal of confidence in science, as a body of knowledge and as a way of finding things out, you'd probably get a higher figure. Every sceptic I know has a deep respect for science - it's one of the reasons they get so angry about its corruption. It's very easy for establishment figures to confuse the philosophy and practice of science with the institutions and authorities of the profession of scientists, and to mistake opposition to the latter for opposition to the former.

It would be interesting to know how the general public really did perceive the relationship between scientific method and scientific authority. Suppose you compare the effect of a trusted or untrusted expert simply asserting the result, and a trusted/untrusted expert presenting a logical argument to demonstrate the result? For those people who understood the argument (which you would need to test) does trust/distrust in an expert override directly-perceived logic and evidence?

If people would rather trust a same-tribe expert making unsupported assertions than a counter-tribe expert providing easily understood arguments and evidence, that would be dramatic evidence for the strength of the effect. I honestly don't know if they would. I'd like to think they wouldn't, but I recall Orwell and his concept of doublethink. When it comes to understanding how people behave where ideologies are involved, Orwell very often knew exactly what he was talking about.

April 28, 2014 | Unregistered CommenterNiV

Dan -

" But if you treat that as "evidence" in support of the "growing distrust" thesis, you are, as I said in the post, turning what was a hypothesis into a definition."

We seem to be talking past each other somehow, because I thought that I had made it very clear that is not what I'm saying. No, that is not remotely a part of my "thesis."

"I show you lots of trend data that one wouldn't expect to see if you were right about a "trend" in R views toward "science" generally. "

This, also, seems to be not taking in what I have said. I have been clear that I'm not talking about a trend in R's views towards science, generally. I am talking about how they respond to polls that ask them about their level of trust in science, and saying that their answers are not really a reflection of a trend in attitude about science per se, but towards the cultural overlays that cohere to a question about their level of trust in science. As I said, I don't think that they are going to change their GPS habits or the confidence they place in their doctors.

In addition, you are simply not addressing the other points that Gauchat also talks about in his analysis, such as the growth of the religious right, trends in attitudes of mainstream Republicans towards societal (and particularly federal) institutions, and the simple fact that the trend in his data starts well before climate change became a significantly polarizing issue.

"BTW, you realize that < 7% of rspts over the entire period covered in Gauchat's study selected the negative "hardly any" response?"

What I think is important is not the change in the opinions expressed by the respondents overall - as yes, with most respondents there was no change - but the change in the responses from the 1/3 or so who identify as conservatives. The point is that there was a significant change within that subset of the population. So why would there be a significant change in that subset of the population and no change in the others? You say because of climate change. I say that seems implausible because it doesn't take into account other prominent societal factors, and because the trend moved well before climate change became so virulently associated with political identities.

"& say it must mean what your intution tells you: Republicans are hostile to science......."

No, actually, I didn't say that, and I find it really strange that you characterize what I'm saying in such a fashion, as I have attempted to make it clear that is not what I'm saying. It appears that you are responding to some caricature (who argues that Republicans are "anti-science" - something I have never argued) more than you are responding to what I've written.

"Don't feed the ravenous appetite of those who (whatever their identity) are motivated to think that everyone who disagrees with them is stupid & "anti-science.""

Once again, this is a mis-reading of what I have said.

" So calling them "anti-science" b/c they see things as they do on fracking is just mindless."

Nothing that I've said supports you attributing such a view to me.

"If you want to see how "distrusting" of university scientists dems can get when the "institution" they distrust is in alignment with scientific evidence, talk to dems about fracking. "

This does not reflect overall attitudes. Are you seriously arguing that there is not a significant difference in attitudes between liberals and conservatives, generally and relatively, when discussing the trustworthiness of our universities that train scientists? Seriously? Conflating that difference with being "anti-" or "pro-" science is not what I am doing.

I guess if you are going to continue to mis-attribute arguments and perspectives to me, and ignore the points that I'm making, then I should just move on, but I would prefer if you would stop responding to my comments with what appears to be something of a boilerplate response that might be more suitably directed at someone else.

April 28, 2014 | Unregistered CommenterJoshua

@NiV

It's just a weird item. Forget what it means to select "some"; what does it mean about the person who came up with the survey item that he or she used that as a category between "a great deal" & "hardly at all"?... In any case, there's only a tiny difference in likelihood of picking "great deal" or "some" for D's & R's -- so it is even more strained to yank this one item out & try to build a "Republicans are anti-science" edifice on top of it.

This is the difference between the "opinion pollster" pseudoscience approach & the psychometric measurement approach.

The former latches onto a solitary ambiguous item & makes grand claims about what the "public" believes based on it.

The latter tries to figure out what survey items are really measuring. Often the answer is nothing in particular, esp for items that appear in issue-du-jour polls. But if it is anything, it's some not-directly-observable attitude that can be interpreted and precisely measured only w/ multiple items.

The *set* of items I cited are all plausibly viewed as indicating whether people have a positive or negative attitude toward scientists. They reflect an overwhelmingly positive view when looked at as a whole.

The "confidence" in "those who run institutions" item shows science at the top, just a tiny bit behind the military & ahead of all manner of other institutions. It's bizarre to treat it as more informative -- and bizarre to think that very small correlations w/ partisanship in the response categories that both reflect positive attitudes (moderates, as I said, historically are least "confident"; what the hell does that mean?!) are plausible explanations for big differences in attitude in the world (like those between Ds & Rs over climate change), particularly in the face of many more convincing pieces of evidence that are inconsistent with that inference.

Figuring out what causes what is not a game where players "win" the right to persist in their preconceptions so long as they can find a single piece of evidence that ambiguously fits it when construed in light of the preconception.

It's an exercise in evaluating the weight of all evidence & the strength of inferences that can be drawn from them.

April 28, 2014 | Registered CommenterDan Kahan

NiV -

"I'm not sure if I've argued that it has actually reduced it."

Then I stand corrected. I have run into the argument countless times at "skeptic" blogs, and it is one of the single most un-skeptical arguments I have run across at those blogs, and a big reason why I put skeptics in quotation marks. What I find particularly amusing is when people who self-identify as "skeptics" try to draw longitudinal conclusions from cross-sectional data, as we see when "skeptics" point to the Rasmussen poll as if it somehow verifies that climate scientists have undermined the public's faith in science.

"But if you ask the question in the context of those scientist-activists who generate scare stories, push quack treatments, and support political causes, I think you'll find a recognition that not all scientists are the same, and that one shouldn't necessarily trust a claim just because it's a scientist saying it. The excesses of climate alarmism certainly don't help with that."

Heh. You seem to be dancing with ambiguity there.

Tell me, do you see any solid evidence that "activism" among climate scientists has diminished the public's view about climate scientists? I mean the public in general. I'm looking for evidence controlled for the phenomenon whereby conservatives and libertarians are looking to confirm their biases à la motivated reasoning.

I'm thinking of the studies that putatively measure the impact of "climategate." They show that a relatively small % of the public had their views changed significantly by "climategate." Among those that did, some had their concern about climate change diminished, and some had it increased (a smaller number, relatively speaking) - but for those whose opinions were changed to be less concerned, there was a clear correlation with conservative and libertarian ideology. So what was going on? Was it motivated reasoning? Or was it "activists" undermining faith in climate science? And with those who said it was the "activists" undermining their faith in climate science, how do we know that they weren't just spinning "climategate" as a rhetorical device? There were no pre- and post- analyses to verify that, in fact, they had some greater degree of trust in climate scientists before "climategate."

So I think you've done a good job of basically saying anything's possible (which I agree with) and then implying a cause and effect relationship w/r/t climate science specifically as singled out from other sciences, but stuck in a bit of plausible deniability - because there is no actual evidence of what you've described taking place. There is only "common sense" speculation that "activism" has undermined trust in climate science. Well, I understand the common sense aspect of the speculation, but I also think that it's probably wrong, and that there are deeper forces at play that lie beneath a surface reaction to "activist" scientists. The whole notion of "activism" is, IMO, just another subjective determination that climate warriors run through their cultural cognition filters to confirm their biases. I run into this often at Judith's - where her entire definition of "activism" is subjective, and IMO, defined in such a way as to confirm biases. "Activism" is bad when "they" do it, and just fine when "we" do it.

April 28, 2014 | Unregistered CommenterJoshua

@Joshua:

If you think this item is measuring climate change attitudes, then it necessarily isn't measuring something that *caused* those attitudes. The "decline in trust" claim is a causal one.

What other items on science & partisanship are you referring to?

Sure, Republicans hate "liberal universities." That's a cultural-id sort of posturing too.

But how exactly does that show "loss of trust" in science, much less that that *causes* conflicts over issues on which Republicans, like everyone else, insist that the positions they support are the ones scientists endorse?

And why ignore experiments that actually try to test hypotheses directly rather than make strained inferences from ambiguous survey items?

Why ignore tons of related evidence about why there is conflict over science-informed policies -- unless one is motivated to conclude that the only reason people disagree with you is that they are closed-minded & anti-science?

The religious right point in Gauchat? As I pointed out in the post on his article, his own data show that religion makes "liberals" as well as "conservatives" a tiny bit more likely to pick a lower level of "confidence" response in relation to "those who run" science.

I'm sure religious right is responsible for all sorts of things, including attitudes toward science-informed policies of various sorts.

But my views on that aren't affected in any way by the variance in the item Gauchat reports measuring -- b/c he reports that religion operates independently of partisanship here.

As I said, test preconceptions w/ evidence; don't construe evidence in light of preconceptions.

April 28, 2014 | Registered CommenterDan Kahan

Dan -

"If you think this item is measuring climate change attitudes,"

This continues to be a strange exchange. I have no idea what "this item" you're referring to. I have been discussing, all along, the Gauchat article I linked, and I have not been referring to anything related to climate change except that in your earlier post on the Gauchat article, you attributed his findings, basically, to the polarization related to climate change, and I find that to be implausible.

"Sure, Republicans hate "liberal universities." That's a cultural-id sort of posturing too....But how exactly does that show "loss of trust" in science, much less that that *causes* conflicts over issues on which Republicans, like everyone else, insist that the positions they support are the ones scientists endorse? "

It doesn't. I never said that it does, although you repeatedly said that's what I was arguing. I am saying that the Republican hatred of "liberal" universities is relevant to Gauchat's findings about how conservatives respond on a poll when they are asked about their "trust in science." I am saying (how many times do I have to repeat it?) that I don't think it really is a measure of their "trust in science" (they aren't going to flush their GPSs down the toilet), but it is a measure of their attitudes towards the cultural associations with scientific institutions.

"And why ignore eperiments that actually try to test hypotheses directly rather than make strained inferences from ambiguous survey items?"

I can only assume that you're asking me that question because you still aren't getting what I'm saying. The hypothesis that you're testing is not one that I am postulating. It's rather ironic that I am in this position - because I have, countless times, gotten into arguments at "skeptic" blogs where my position is that the arguments about a public decrease in "trust in science" (which attribute that reduction in trust to climate scientists and "activist scientists") are specious.

"As I pointed out in the post on his article, his own data show that religion makes "liberals" as well as "conservatives" a tiny bit more likely to pick a lower level of "confidence" response in relation to "those who run" science. "

Which is not mutually exclusive with the possibility that the growth of the religious right (and libertarian-tinged distrust of public and governmental institutions more generally) have created a differential trend in how Republicans respond to a poll asking them about "trust in science" when compared to liberals and moderates.

"I'm sure religious right is responsible for all sorts of things, including attitudes toward science-informed policies of various sorts."

Well, that actually is what I'm saying.

Gauchat's findings may also reflect clashes between supporters of conservative ideology and particular areas of science. The rise of the Christian right, which rejects evolutionary theory and opposes research on embryonic stem cells, may have been particularly important – Gauchet found a similar decline in trust in science among frequent churchgoers. More recently, conservatives have come to fear that global warming will provide an excuse for "big government" to restrict their personal freedom through environmental regulation.

http://www.newscientist.com/article/dn21640-us-scepticism--its-been-a-long-time-coming.html#.U17sHFcWPDE

and

One possible interpretation, supported by a growing number of studies, is that social factors such as race/ethnicity, income, religiosity, social capital, and political identifications are at least as important as knowledge and education in predicting trust in science (Gauchat 2008, 2010; Sturgis and Allum 2004; Yearly 2005).

Now I don't agree with Gauchat's conclusions about trends in "trust in science," but I do think that the factors he discusses are relevant to differences in how conservatives, liberals, and moderates respond to poll questions about their level of trust in science.

April 28, 2014 | Unregistered CommenterJoshua

"It is one of the single-most un-skeptical arguments I have run across at those blogs, and is a big reason why I put skeptics in quotation marks."

I think perhaps you are taking casual observations and impressions in conversation too seriously. I'm not sure what sort of evidence you're expecting - are you expecting people to wander round all their friends and neighbours with a clipboard and do a survey before making any assertion about 'what people think'? Do you think the typical sceptic is even logistically *capable* of doing a nationally representative opinion survey with a statistically valid design, or knows the difference between cross-sectional and longitudinal studies? Or cares?

Climate sceptics mostly regard themselves as ordinary members of the public. They've perhaps talked or joked about it with people they know. They'll likely generalise from their own experience, like most people do.

And I don't know for sure. I do know that five years ago I sometimes used to hear 'ordinary' people in everyday life talk about it - expressing their belief, proposing actions to supposedly address it - but that nowadays I never do. Nobody talks about it, nobody does anything about it, if you bring it up in conversation people just look bored, or irritated. Nobody's interested. I hear the occasional cautious joke about it now, where once such a thing was taboo. It's a cliché, a pop-culture reference people know but don't care about any more. Maybe if you ask them, they'd still say they believe, but they don't act as if they do. For the vast majority of people I know, I don't even know what their opinion on the subject is.

It's not even as if people really know what scientists do think about it. Considering the number of times the 'scientific consensus' has been cited, I still find it quite remarkable how little is known about scientists' real opinions about global warming. So I'm not even sure that people's views on global warming should reflect on their views about scientists. And when people talk about what scientists 'say' about some scientific topic, without so much as a survey to back it up, are you equally ready with your quotation marks?

April 28, 2014 | Unregistered CommenterNiV

@Joshua:

The Gauchat study is an analysis of a single item in the GSS survey. That's his only "finding"; that's why I have been referring to it.

You haven't actually read the article? I know you read the blog post in which I discuss this.

April 28, 2014 | Registered CommenterDan Kahan

NiV-

"I think perhaps you are taking casual observations and impressions in conversation too seriously"

No. I'm talking about arguments that "skeptics" make, and defend against critiques that point out, as an example, that you can't draw longitudinal conclusions from cross-sectional data. I had this discussion with Willis when he insisted that the Rasmussen poll explained how over-hyped claims from "activist" scientists had created a crisis in confidence among the public w/r/t climate science specifically and scientists more generally. When someone as smart as Willis makes an argument so lacking in fundamental principles of skepticism that someone like me can see the flaws as obvious, then what we've got is motivated reasoning at play.

"I'm not sure what sort of evidence you're expecting - are you expecting people to wander round all their friends and neighbours with a clipboard and do a survey before making any assertion about 'what people think'?"

I'm expecting that if someone self-identifies as a skeptic, then they should perform fundamental skeptical due diligence when evaluating arguments. I'm expecting that when they look at the data on opinions about "climategate," or about whether the public thinks that scientists falsify evidence, they think skeptically.

"Do you think the typical sceptic is even logistically *capable* of doing a nationally representative opinion survey with a statistically valid design, or knows the difference between cross-sectional and longitudinal studies? "

They don't have to conduct a nationally representative poll to analyze, with a skeptical eye, the data from nationally representative polls that have been conducted.

"They'll likely generalise from their own experience, like most people do."

So here's the problem. Almost invariably, when I query "skeptics" about the possibility that their arguments are more projections of their own opinions than clear-eyed considerations of the evidence, they reject that possibility (along with insulting me, ridiculing the concept of motivated reasoning as psycho-babble, etc.). Sure, most people generalize from their own experiences, but some are willing to consider the data when asked to reflect on whether their generalizations are valid. Should "skeptics" be any different than anyone else? No, I don't see why they should be - but then the label of "skeptic" is misapplied.

"And when people talk about what scientists 'say' about some scientific topic, without so much as a survey to back it up, are you equally ready with your quotation marks?"

Absolutely.

April 29, 2014 | Unregistered CommenterJoshua

" I had this discussion with Willis when he insisted that the Rasmussen poll explained how over-hyped claims from "activist" scientists had created a crisis in confidence among the public w/r/t climate science specifically and scientists more generally."

I'm not in a position to make a judgement, since I don't know what discussion you're talking about. Do you mean that time at Climate Etc. when Willis was laughing at Keith Kloor's statement "Climate science will survive this latest viewing of its dirty laundry, because it is a highly reputable field with a proven track record"? I think your argument then was that the word "reputable" referred to public opinion, rather than the interpretation Willis (and I think Keith) put on it of *deserving* that reputation. I think somewhere along the line you said their reputability was strong among the public and Willis used a Rasmussen poll to say it wasn't. Maybe it wasn't that discussion, since Willis there doesn't seem to be doing what you said he did.

"Should "skeptics" be any different than anyone else?"

No - in particular, they shouldn't be expected to be infallible, either. Everyone has cognitive biases. There are a range of useful techniques for mitigating them, like systematic scepticism, but the most valuable technique is to find somebody with *different* cognitive biases and discuss the matter with them. They can often point out the other viewpoint, or point you to missing information. But people vary in their skill at it, and while some do try to be systematically sceptical about their own side as well as the opposition (Willis is well noted for it) there are plenty of others who are not so much. They're 'climate sceptical', but not 'general sceptical'.

And they're frequently not patient with people who don't see the world as they do - especially when they appear to be making extremely strained attempts to avoid what seems to them like an obvious conclusion. While your calls for them to see both sides of the story are no doubt laudable and good advice, it isn't received well when it comes from somebody who doesn't seem to be following it. (Whether or not you are, that's how it appears.)

Nevertheless, debate for the sake of mitigating one's own biases isn't about winning, or persuading the other person you're right, or even getting the other person to be more open minded. (Although those are all nice to have.) It's really about your own intellectual voyage.

So what I learned from reading your comments was that there is an ambiguity in what Keith said that needs clarifying. He might not have been saying that climate scientists were all fine upstanding citizens who would never make up data. He might have been saying that, while they did clearly make up data, the general public did not and would not believe it, or hear about it, and they could therefore survive the scandal. I personally think such an attitude (when used as a defence) is even worse than honestly believing them to be innocent, but it's a point, and one I hadn't thought of.

In the same sort of way, I find a lot of your comments interesting, useful, even enlightening sometimes. But I do still think that you take casual comments too seriously, and that you demand unrealistic impartiality standards of sceptics that you don't apply consistently to all parties. Blog comments aren't written, or generally held, to the same standard as scientific results being used to direct the multi-trillion-dollar global economy and (ostensibly) to save the world.

April 29, 2014 | Unregistered CommenterNiV

NiV -

"Maybe it wasn't that discussion, since Willis there doesn't seem to be doing what you said he did."

Here is one of the instances I was referring to - and it was a follow-up from an earlier occasion (which I haven't been able to put my finger on).

http://judithcurry.com/2013/07/17/certainly-not/#comment-348181

I only mentioned that particular instance because it was a stark example of someone who considers himself a highly skeptical thinker, with statistical chops, making a fundamental statistical error rooted in an extremely unskeptical generalization from his own opinion. But I've encountered the same misappropriation of that Rasmussen poll a number of times, and it seems to me to be the kind of non-skepticism I see frequently - not only from "skeptics," of course, and I would argue no more frequently among "skeptics" than among "realists" (as is predicted, IMO, by the shared cognitive and psychological attributes of "skeptics" and "realists" alike). Google will easily demonstrate how often "skeptics" refer to that poll in ways that are entirely un-skeptical.

"He might have been saying that while they did clearly make up data, that the general public did not and would not believe it, or hear about it, and they could therefore survive the scandal. "

Well, one thing that the poll does indicate (if you consider the methodology valid) is that a fairly large % of the public do think that (at least some) climate scientists skew data (at least some of the time). But perhaps the reason why that wouldn't be particularly meaningful (that they would "survive it") is that people might think climate scientists skew data but at the same time think that they are less likely to skew evidence out of self-interest than priests, or plumbers, or doctors, or "climate skeptics." That has been another way that I have encountered a number of "skeptics" who are willing to use that poll to generalize from their own opinions - because they use it to judge climate scientists, relatively, w/o having any comparative data.

"But I do still think that you take casual comments too seriously, and that you demand unrealistic impartiality standards of sceptics that you don't apply consistently to all parties. "

Well, I do often get the same response from "realists."

April 29, 2014 | Unregistered CommenterJoshua

As a scientist, I actually trust scientists less. The problem with this discussion is that, even though people trust scientists, they are against research that scientists do. For instance, liberals and young people are against animal research. If liberals get their way, all biomedical research will end.

April 29, 2014 | Unregistered CommenterMatt

"As I use the term, “the science communication problem” refers to the failure of valid, compelling, and widely accessible scientific evidence to dispel persistent cultural conflict over risks or other policy-relevant facts to which that evidence directly speaks. "

I agree: the complete lack of correlation between increasing CO2 concentrations and the flat world temps of the last decade or so shows the "failure of valid, compelling, and widely accessible scientific evidence" to support an alarmist position on CAGW. Note that the "C" is what is being argued about in the climate wars. Note also that the IPCC itself says that all world temp rise prior to about 1950 is natural, so only the change in rate after that date counts when calculating trends.

Correlation can not imply causation, but lack of correlation does imply lack of causation: a correlation between two variables does not necessarily mean that one causes the other. My favorite example from my stats classes was the very high correlation between ice cream sales and the crime rate. Both are affected by temp, and neither has anything to do with the other.
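The ice cream/crime point can be sketched in a few lines of Python (all numbers here are made up for illustration; temperature is the lurking variable driving both series). Regressing the shared driver out of each variable and correlating the residuals shows the apparent relationship collapse:

```python
# Toy demonstration of a confounder: temperature drives both ice cream
# sales and the crime rate, so the two correlate strongly even though
# neither causes the other. Controlling for temperature (regressing it
# out of each series) makes the correlation all but vanish.
import random
import statistics

random.seed(0)

n = 1000
temp = [random.gauss(20, 8) for _ in range(n)]            # daily temperature
ice_cream = [2.0 * t + random.gauss(0, 3) for t in temp]  # sales driven by temp
crime = [1.5 * t + random.gauss(0, 3) for t in temp]      # crime driven by temp

def corr(x, y):
    """Pearson correlation coefficient."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def residuals(y, x):
    """Residuals of y after regressing out x (simple least squares)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    beta = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return [b - (my + beta * (a - mx)) for a, b in zip(x, y)]

raw = corr(ice_cream, crime)  # spuriously high
partial = corr(residuals(ice_cream, temp), residuals(crime, temp))  # near zero
print(f"raw r = {raw:.2f}, r after controlling for temp = {partial:.2f}")
```

The same logic is why a raw correlation (or its absence) between two time series, on its own, says little about causation until the shared drivers are accounted for.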

April 29, 2014 | Unregistered CommenterEd Forbes

NiV -

I did find that other link that you were referring to:

"Maybe it wasn't that discussion, since Willis there doesn't seem to be doing what you said he did."

Technically, you are correct - so I stand corrected. On the other hand, I have engaged with Willis where he has argued that the public (in general) has lost trust in climate scientists because they are viewed as frauds (paraphrasing) - even if the link he makes to the Rasmussen poll is less direct than how I described it.

April 29, 2014 | Unregistered CommenterJoshua

I'm not sure that we can operate directly by the prescription given in #4 above. I think that it is inevitable that special interests try to create messages that resonate with cultural biases as a mechanism for gaining traction for their own views. This is true for groups as diverse as ALEC and the Obama campaign. The basic politics of starting with energizing (or inflaming) your "base" and then going after "swing" or unmotivated voters is actually quite old. The same goes for product marketing. But only recently have the parties been able to read their Dan Kahan (and others) and better exploit these processes.
I believe that the best we can do is to fight to keep media channels as open as possible, and to give the public, via education, the tools necessary to navigate through these information minefields.

May 17, 2014 | Unregistered CommenterGaythia Weis
