
Friday, May 6, 2016

Raw data: the best safeguard against empirical bullshit!

The last two posts were so shockingly well received that it seemed appropriate to follow them up w/ one that combined their best features: (a) a super smart guest blogger; (b) a bruising, smash-mouthed attack against those who are driving civility from our political discourse by casting their partisan adversaries as morons; and (c) some kick-ass graphics that are for sure even better than "meh" on the Gelman scale!  

The post helps drive home one of the foundational principles of critical engagement with empirics: if you don't want to be the victim of bullshit, don't believe any statistical model before you've been shown the raw data!

Oh-- and I have to admit: This is actually a re-post from asheleylandrum.com. So if 5 or 6 billion of you want to terminate your subscription to this blog & switch over to that one after seeing this, well I won't blame you -- now I really think Gelman was being kind when he described my Figure as merely "not wonderful...."


When studies studying bullshit are themselves bullshit...

We have a problem with PLoS publishing bullshit studies.


Look, I really appreciate some aspects of PLoS. I like that they require people to share data. I like that they will publish null results. However, I really hope that someday the people who peer-review papers for them step up their game. 

This evening, I read a paper that purports to show a relationship between seeing bullshit statements as profound and support for Ted Cruz. 

The paper begins with an interesting premise: does people's bullshit receptivity--that is, their perception that vacuous statements contain profundity--predict their support for various political candidates? This is a particularly interesting question; I think we can all agree that politicians are basically bullshit artists.

Specifically, though, the authors are not examining people's abilities to recognize when they are being lied to; they define bullshit statements as 

communicative expressions that appear to be sound and have deep meaning on first reading, but actually lack plausibility and truth from the perspective of natural science.


OK, they haven't lost me yet. 

The authors then reference some recent literature that describes conservative ideology as what amounts to cognitive bias (at best) or mental defect (at worst).

I identify as liberal. However, I think that this is the worst kind of motivated reasoning on the part of liberal psychologists. Some of this work has been challenged (see Kahan's take on some of these issues). But let's ignore this for right now and pretend that the research they are citing here is not flawed.

The authors have the following hypotheses:

  • Conservatism will predict judging bullshit statements as profound. (I can tell you right off that if this were mostly a conservative issue, websites like spirit science would not exist.)
  • The more individuals have favorable views of Republican candidates, the more they will see profoundness in bullshit statements. (So here, basically, they use support for various candidates as another measure of conservatism.)
  • Conservatism should not be significantly related to seeing profoundness in mundane statements.
These hypotheses were clearly laid out, which I appreciate. I also appreciate that the authors followed the guidelines and posted their data; that makes post-publication peer review possible.

Here is one of my first criticisms of the method of this paper. The authors chose to collect a sample of 196 participants from Amazon's Mechanical Turk. Now, I understand why: MTurk is a really reasonably priced way of getting participants who are not Psych 101 undergraduates. However, there are biases with MTurk samples--mainly, that they are disproportionately male, liberal, and educated. Particularly when researchers are interested in examining questions related to ideology, MTurk is not your best bet. But let's take a look at the breakdown of their sample based on ideology, just to check--especially since we know that they want to make inferences about conservatives in particular.
I don't have the exact wording of their measure of conservatism, but they describe it as asking participants to place themselves on a 7-point Likert scale where 1 is liberal and 7 is conservative. The above table shows the frequency of participants at each level. You can see that the sample is disproportionately liberal/liberal-leaning. In fact, if you total the participants in columns 5, 6, and 7 (the more conservative columns), you have a whopping total of 46 participants choosing a somewhat conservative designation versus the 108 participants in columns 1, 2, and 3.
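To make that skew concrete, here is a minimal Python sketch. Only the group totals (108 in columns 1-3, 46 in columns 5-7, and n = 196) come from the post; the split within each group below is hypothetical, invented just so the arithmetic matches.

```python
# Hypothetical per-column counts on the 1 (liberal) to 7 (conservative) scale.
# Only the totals -- 108 liberal-leaning, 46 conservative-leaning, n = 196 --
# are reported in the post; the within-group split here is made up.
counts = [40, 38, 30, 42, 20, 16, 10]  # columns 1..7

n = sum(counts)
liberal = sum(counts[:3])       # columns 1-3
conservative = sum(counts[4:])  # columns 5-7

print(n, liberal, conservative)    # 196 108 46
print(round(conservative / n, 2))  # 0.23
```

With under a quarter of respondents on the conservative half of the scale, any correlation "about conservatives" is leaning heavily on a small handful of cases.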

Thus, it is unfair--in my opinion--to think that you can really make inferences about conservatives in general from these data. Many studies in political science and communications use nationally representative data with over 1500 participants. At the Annenberg Public Policy Center we sometimes get uncomfortable making inferences from our pre/post panel data (participants whom we contact two times) because we end up with only around 600. I'm not saying that it is impossible to make inferences from fewer than 200 participants, but the authors should be very hesitant, particularly when they have a very skewed sample.

I'm going to skip past analyzing the items that they use for their bullshit and mundane statements. It would be worth doing a more comprehensive item analysis on the bullshit receptivity scale--at least going beyond reporting Cronbach's alpha. But, that can be done another time.
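For readers who want to go beyond a reported alpha, the statistic itself is simple: a ratio of summed item variances to the variance of the total score. A minimal NumPy sketch on simulated ratings (the data here are made up for illustration, not the paper's items):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) array:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative only: 200 simulated respondents rating 10 items on a 1-5 scale,
# built around a shared latent trait so the items hang together.
rng = np.random.default_rng(0)
trait = rng.normal(size=(200, 1))
ratings = np.clip(np.rint(3 + trait + rng.normal(scale=0.8, size=(200, 10))), 1, 5)
print(round(cronbach_alpha(ratings), 2))
```

A high alpha only says the items covary; it says nothing about whether the scale measures what it claims to, which is why an item-level analysis matters.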

The favorability ratings of the candidates are another demonstration of how the sample is skewed. The sample demonstrates the highest support for Bernie Sanders and the lowest support for Trump.
This figure shows the density of participants rating four of the candidates (the authors also include Martin O'Malley and Marco Rubio) on the 5-point scale. You can see that a huge proportion of participants gave Trump and Cruz the lowest rating. In contrast, people tended to like Clinton and Sanders more.

Moving onto their results.

The main claim that the authors make is that:
Favorable ratings of Ted Cruz, Marco Rubio, and Donald Trump were positively related to judging bullshit statements as profound, with the strongest correlation for Ted Cruz. No significant relations were observed for the three democratic candidates.
Now, the authors recognized that their samples were skewed, so they conducted non-parametric correlations. But, I'm not sure why the authors didn't simply graph the raw data and look at it.
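For the curious, the non-parametric (Spearman) correlation the authors ran is just a Pearson correlation computed on ranks, so it is easy to reproduce alongside a plot of the raw data. A sketch with made-up scores (not the paper's data):

```python
import numpy as np

def rankdata(x):
    """Ranks 1..n, with tied values assigned their average rank."""
    x = np.asarray(x, dtype=float)
    order = np.argsort(x)
    ranks = np.empty(len(x))
    ranks[order] = np.arange(1, len(x) + 1)
    for v in np.unique(x):            # average the ranks within each tie group
        tied = x == v
        ranks[tied] = ranks[tied].mean()
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    return np.corrcoef(rankdata(x), rankdata(y))[0, 1]

# Made-up scores, purely illustrative: BSR on roughly a 1-5 scale
# versus a 1-5 candidate favorability rating.
bsr = np.array([2.1, 3.4, 2.8, 4.9, 3.0, 2.5, 4.7, 3.8])
rating = np.array([1, 2, 1, 4, 2, 1, 1, 3])
print(round(spearman(bsr, rating), 2))
```

Because rho only sees ranks, a monotone-looking coefficient can coexist with raw data that look nothing like a clean trend, which is exactly why plotting comes first.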

Below, I graph the raw data with the bullshit receptivity scores on the x-axis and the support scores for each candidate on the y-axis. The colored line is the locally-weighted regression line and the black dashed line treats the model as linear. I put Ted Cruz first, since he's the one that the authors report the "strongest" finding for. 
So, does this look like a linear relationship to you? As the linear model goes up (which I imagine is what drives the significant correlation), the loess line goes down. I imagine that this is a result of the outliers who score high on BSR. Note that of the ones scoring highest on BSR, the largest clump actually shows low support for Cruz, with one outlier rating him at 4. Indeed, the majority of the sample sits at 1 for supporting Ted Cruz, and the sample's BSR scores are grouped in the middle.
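That outlier story is easy to reproduce. Here's a hedged Python sketch on synthetic data (not the study's): the bulk of points have no trend by construction, and two invented high-BSR respondents are enough to tilt the least-squares slope.

```python
import numpy as np

rng = np.random.default_rng(1)

# Bulk of the "sample": BSR clustered mid-scale, ratings stuck at 1-2,
# with no relationship between the two by construction.
bsr = rng.normal(3.0, 0.5, size=150)
rating = rng.choice([1.0, 2.0], size=150, p=[0.7, 0.3])

# Two invented high-BSR respondents who also rate the candidate highly.
bsr_all = np.concatenate([bsr, [4.8, 4.9]])
rating_all = np.concatenate([rating, [4.0, 5.0]])

slope_bulk = np.polyfit(bsr, rating, 1)[0]         # least-squares slope, bulk only
slope_all = np.polyfit(bsr_all, rating_all, 1)[0]  # slope with outliers included

print(round(slope_bulk, 3), round(slope_all, 3))
```

In runs like this the bulk-only slope hovers near zero while the two added points pull the full-sample slope clearly upward--the pattern a loess line exposes and a lone correlation coefficient hides.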

You can see similar weirdness for the Trump and Rubio ratings. The Trump line is almost completely flat--and if we were ever to think that support for a candidate predicted bullshit receptivity, it would be support for Trump--but I digress.... Note, too, how low support is. Rubio, on the other hand, shows a slight trend upwards when looking at the linear model (the black dashed line), but most people are really just hovering around the middle. As with Cruz, the people with the highest bullshit receptivity (scores of around 5) rate Rubio low (1 or 2).


So, even if I don't agree that your significant correlations are meaningful for saying that support for conservatives is predicted by bullshit receptivity (or vice versa), you might still argue that there is a difference between support for liberals and support for conservatives. So, let's look at the democratic candidates.
I put Hillary Clinton on top because I want to point out that her graph doesn't look much different from that of Cruz or Rubio. It still has a trend upwards, and not just for the linear model. In fact, where Cruz had one person who gave him high ratings and had the highest level of bullshit receptivity, Hillary has three. So let's take a look at Clinton's loess line and Cruz's loess line mapped on the same figure....
Kind of looks to me like there is a stronger argument for Clinton supporters having more bullshit receptivity. Now, I do support Hillary Clinton, so don't get me wrong. My point is only to demonstrate that significant p-values for one type of correlation do not mean that the finding is real.
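One way to see why a "significant" correlation can still be noise is a permutation test on synthetic data built like the figures above describe: a null bulk plus a few extreme scores. (All numbers below are invented for illustration.)

```python
import numpy as np

def perm_pvalue(x, y, n_perm=2000, seed=0):
    """Two-sided permutation p-value for a Pearson correlation:
    shuffle y, count how often |r| meets or beats the observed |r|."""
    rng = np.random.default_rng(seed)
    observed = abs(np.corrcoef(x, y)[0, 1])
    hits = 0
    for _ in range(n_perm):
        if abs(np.corrcoef(x, rng.permutation(y))[0, 1]) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

rng = np.random.default_rng(2)
x = rng.normal(3.0, 0.5, 150)                  # BSR-like scores, no real signal
y = rng.choice([1.0, 2.0], 150, p=[0.7, 0.3])  # ratings unrelated to x

# Three invented extreme respondents: very high BSR, very high ratings.
x_out = np.concatenate([x, [4.8, 4.9, 5.0]])
y_out = np.concatenate([y, [4.0, 5.0, 5.0]])

p_bulk = perm_pvalue(x, y)
p_out = perm_pvalue(x_out, y_out)
print(p_bulk, p_out)  # a handful of outliers tends to drag p below .05
```

The bulk alone gives whatever p-value chance hands it; tacking on a few extreme cases is typically enough to cross the .05 line, which is the sense in which the "finding" rides on a handful of points.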

Here are the figures for Sanders and O'Malley. Again, pretty straight lines. But for the highest level of bullshit receptivity, Bernie indeed has a cluster of supporters--people rating him at the highest level. I do not think that this supports the opposite conclusion to what these authors found--that bullshit receptivity predicts democratic candidate support--but I don't think that the authors should have made the conclusions that they did. These results really just appear to be noise to me.
In conclusion...

The authors *do* list the limitations of their study. They state that their research is correlational and that their sample was not nationally representative. But they still make the claim that conservatism is related to seeing profoundness in bullshit statements. Oh, which reminds me, we should have looked at that too...
That's a graphical representation of conservatism predicting bullshit receptivity. 

What concerns me, here, is two-fold.

First, regardless of which p-values may or may not be below a .05 threshold, there is no reason to think that these data actually demonstrate that conservatives are more likely to see profundity in bullshit statements than liberals--but the media will love it.

Moreover, there is no reason to believe that such bullshit receptivity predicts support for conservative candidates--but the media will love it. This is exactly the type of fodder that gets picked up because it suggests that conservatism is a mental defect of some sort. It is exciting for liberals to be able to dismiss conservative worldviews as mental illness or some sort of mental defect. However, rarely do I think these studies actually show what they purport to. Much like this one.

Second, it is this type of research that makes conservatives skeptical of social science. Given that these studies set out to prove hypotheses that conservatives are mentally defective, it is not surprising that conservatives become skeptical of social science or dismiss academia as a bunch of leftists. Check out this article on The Week about the problem of liberal bias in social science.

If we actually have really solid evidence that conservatives are wrong on something, that is totally great and fine to publish. For instance, we can demonstrate a really clear liberal versus conservative bias in belief of climate change. But we have to stop trying to force data to fit the view that conservatives are bad. I'm not saying that this study should be retracted, but it is indicative of a much larger problem with the trustworthiness of our research.


Reader Comments (23)

Second, it is this type of research that makes conservatives skeptical of social science.

Actually, I think that statement is bullshit...

...well, somewhat bullshit.

I think that the cause-and-effect on why conservatives are skeptical of social science is much more complicated than you describe.

Perhaps the kind of research you are talking about is one reason that some conservatives are skeptical of social science, but there are also other influences, and I'd say it's likely some of those other factors have a significantly stronger effect.

May 6, 2016 | Unregistered CommenterJoshua

If we actually have really solid evidence that conservatives are wrong on something, that is totally great and fine to publish. For instance, we can demonstrate a really clear liberal versus conservative bias in belief of climate change.

Just because NiV hasn't commented yet, I'll pipe in here also.

There is some irony in that comment, as many conservatives will argue that your statement about conservatives being wrong on climate change reflects the same kind of embedded bias that the article you linked discusses...

For example, a study that sought to show that conservatives reach their beliefs only through denying reality achieved that result by describing ideological liberal beliefs as "reality," surveying people on whether they agreed with them, and then concluding that those who disagree with them are in denial of reality — and lo, people in that group are much more likely to be conservative!

That isn't to say that I don't agree that many conservatives' views on climate change displayed biased reasoning - but that pointing to climate change as an example of where conservatives are disproportionately more influenced by ideological orientation than liberals is probably not a good idea. The vast majority of folks on both sides of the climate wars come to their views on climate change via identity-protective reasoning.

May 6, 2016 | Unregistered CommenterJoshua

"Second, it is this type of research that makes conservatives skeptical of social science"

This is true, but you don't have to be conservative to see the bias and bullshit in these sorts of papers and hence acquire a low regard of the field in general.

May 6, 2016 | Unregistered CommenterPaul Matthews

Paul -

==> This is true,

So, as I recall, you often say. Where is your evidence, beyond the anecdotal, and extrapolating from your own experiences (as an outlier)? My sense is that you often generalize from the views of a tiny minority, an unrepresentative sample.


It wouldn't be very skeptical to make such a claim, with such absolute certainty, if you had no evidence - so I'm sure you have some. Please provide a couple of links?

What data do you use to show that there is a significant effect, and more than, say, the number of conservatives gaining faith in the social sciences because of the research that shows valid results, or results that have negative implications for liberals, or results that show no significant differences between liberals and conservatives, or research that finds positive traits associated with conservatives?

How do you square the patterns in how conservatives feel about social science research with situations such as when large swaths of the conservative public rejected the analysis of public health experts on how to respond to the "Ebola crisis," and instead thought we should craft public policy on the basis of the analysis of politicians such as Chris Christie and Donald Trump? So in one situation they judge the entire field of social science based on some potentially poor research, and in the other they trust the views of the completely uninformed even when it stands in contrast to medical evidence?

I believe that I have asked you quite a few times for the evidence on which you base these confident and certain conclusions. Yet I can't recall you providing any.

Is there a reason for that?

May 6, 2016 | Unregistered CommenterJoshua

Your comments are bizarre Joshua. It's not my statement, it's Landrum's. I'm just agreeing with it. You ask for links, but she provides a link to support it, which is staring you in the face! And that contains further links. You are right that I usually ignore your comments, and there's a good reason for that. In fact I don't usually even bother to read them, and will continue this policy.

May 6, 2016 | Unregistered CommenterPaul Matthews

==> Your comments are bizarre Joshua. It's not my statement, it's Landrum's. I'm just agreeing with it.

And I questioned what evidence you use to support your agreement. As I recall, in the past you have extrapolated from views expressed by outlier "skeptics" online to generalize about the public's views on climate change...and when I have asked for evidence to support your assertions, you have failed to provide any.

==> You ask for links, but she provides a link to support it, which is staring you in the face!

Her link doesn't support that statement. Did you read it?

The link she provided relates to evidence about potential bias in the social sciences. Even if I were to accept that article's thesis on face value, that doesn't address the claim made in the post here about the cause-and-effect behind why conservatives are skeptical about social science research (which was basically just an argument by assertion) and it certainly doesn't suffice as evidence for your certain conclusion about that cause-and-effect.

==> And that contains further links.

Primarily to the Haidt article, which likewise does not provide evidence for the claim made in the post here that I questioned, nor for your certain conclusion about that claim.

==> You are right that I usually ignore your comments, and there's a good reason for that.

Certainly that's your prerogative. And likewise, attacking me and pointing to empty justifications (links that are on a related but different topic) for your statement of fact is also your prerogative.

Or, instead, you could choose to explain what evidence you use to draw your certain cause-and-effect conclusions about the reasons why conservatives are skeptical of social science.

Perhaps your failure to provide any comments as to the evidence you use to draw your assertion (that her argument by assertion is "true") indicates that you don't have any supporting evidence, and perhaps not. Only you can know for sure.


But either way, it would seem that attacking me is a rather poor substitute for providing evidence. But if that works for you, it works for you.

May 6, 2016 | Unregistered CommenterJoshua

@Joshua & @PaulMathews--

It is the sort of study that ought to make people suspicious of just about everything in social science journals. This is standard "my p-value is < 0.05 so I get to say whatever I want; who cares if the effect I observed doesn't support the inference I am drawing" fare.

But if it is what makes conservatives suspicious, that's due to their selective attention to publication of awful NHT studies.

May 7, 2016 | Registered CommenterDan Kahan

The questions asked can be more illuminating than the answers.

In this instance the authors actually name real people in the title, all from one side of the political spectrum, so the authors' choice of topic is unambiguous. And the negative aspect of the attributes being investigated is also remarkably open.

I'd be interested to know more about the frequency and nature of studies that specifically target "conservative" bias as opposed to "liberal" bias, or those attempting to appear more even-handed. Any insights, Dan?

May 7, 2016 | Unregistered Commentermichael hart

Dan -

=>> "But if it is what makes conservatives suspicious, that's due to their selective attention to publication of awful NHT studies."

That was kind of my point. If they're paying selective attention to particular studies in order to confirm a bias, then their skepticism about social science precedes their reading the study. Kind of like most "skeptics" who find rationalizations to explain that "skeptics" as a group are "skeptical" about climate change because they don't buy technical and arcane details about scientific research. That wouldn't explain the strong association with right-wing ideology, especially since the vast majority of "skeptics" have no idea what the research actually says. Some "skeptics" explain their own "skepticism" by their having read technical material and found it of poor quality. Well, even if that is true in their individual case (I think in many cases it is likely to be revisionist history), to use that explanation for the "skeptical" public is generalization from an unrepresentative sample and, ironically, entirely un-skeptical.

May 7, 2016 | Unregistered CommenterJoshua

"Just because NiV hasn't commented yet, I'll pipe in here also."

Thanks, Joshua!

I did notice the comment in passing, and my first thought on it was pretty much what you evidently expected. But I'd have probably passed it by, since there's so much of value in the rest of the post. It seemed too nakedly partisan to read a great article with one tiny partisan shot in it and zero in on the one bit I didn't like. I assume it was just put in for "balance" - having criticised liberals, you have to criticise conservatives as well or the liberals will all just reject your entire argument as a manifestation of latent conservatism. As such, it was probably a bit weaker than it really needed to be - if I can recognise it as a bit of blatant ideological purity signalling, I'm sure so can the liberals.

But yes, it's still ironic. :-)

"Kind of like most "skeptics" who find rationalizations to explain that "skeptics" as a group are "skeptical" about climate change because they don't buy technical and arcane details about scientific research. That wouldn't explain the strong association with right-wing ideology, especially since the vast majority of "skeptics" have no idea what the research actually says."

How many times have we discussed this? :-)

People are more inclined to check arguments if their conclusions go against their prior beliefs.
That subset of them with greater scientific literacy are more likely to find flaws in those arguments, or find experts they trust who have found flaws, and reject the claims.
Their rejection is based on the technical counter-argument, not simply that they "don't like" the conclusion, or that they'd lose the respect of their social circle if they admitted it.
We can tell this because those without the ability to find the counter-arguments (on their own or from others) are not as inclined to reject the uncomfortable conclusion, although still subject to the same social pressures. If it was simply a matter of fitting in with your social circle, the scientifically illiterate would be as biased.

Self-reported explanations of motivation are not entirely to be relied upon, but they're not entirely valueless, either. It would explain why people on *both* sides of the argument think they're being properly scientific and sceptical in forming their beliefs. I'm not sure why you're so determined to reject them.

The vast majority of *everybody* has no idea what the research actually says. And yet we have a strong association of climate catastrophe belief with left-wing ideology to explain. So why do believers believe what they do? Because they've paid attention to the science? (Obviously not.) Because they're better at recognising trustworthy scientists? (How? Why?) Or because they're politically biased, and trying to fit in with their own crowd, and saying what they're expected to say?

It's ironic. Social scientists studying political bias are one of the best examples of political bias going. But it's true for anyone: you yourself are always the hardest person to be dispassionate and objective about.

May 7, 2016 | Unregistered CommenterNiV

@MichaelHart: My sense is that nearly all the studies that examine the relationship between cognitive styles & ideology either (a) purport to show that being conservative is associated with some reasoning deficiency that impedes unbiased evaluation of evidence or (b) present evidence showing that that hypothesis is ill-supported. (It is really easy to find bad category (a) studies that are just a mess...) I don't recall scholarship purporting to show "liberals are the stupid ones!," although there is plenty of that sort of stuff in the blogosphere.

I've written about the "asymmetry" thesis (motivated reasoning is concentrated at right end of ideological spectrum or some equivalent space in some cognate scheme for operationalizing motivating disposition) 1.39 x 10^9 times; latest being here.

But if I've missed something, I hope one of the 14 billion readers of this blog will tell me!

May 7, 2016 | Registered CommenterDan Kahan

NiV -

==> I assume it was just put in for "balance" - having criticised liberals, you have to criticise conservatives as well or the liberals will all just reject your entire argument as a manifestation of latent conservatism. As such, it was probably a bit weaker than it really needed to be - if I can recognise it as a bit of blatant ideological purity signalling, I'm sure so can the liberals.

I'll leave the speculating about motives or intent to you.


==> People are more inclined to check arguments if their conclusions go against their prior beliefs.

I would phrase it differently: When people run up against conclusions that go against their prior beliefs, they are "motivated" to apply their ideologically oriented filter, so as not to disconfirm their prior beliefs. They can do that in myriad ways. One way is to find reasons to dismiss the conclusions by diminishing the "concluder." Another, related way is to find ways to convince themselves that the conclusions they disagree with are influenced by biases on the part of others. We see examples of these methods of avoiding disconfirming beliefs all the time in the "skept-o-sphere" and pretty much everywhere else.

==> That subset of them with greater scientific literacy are more likely to find flaws in those arguments...

I doubt that. I would guess that they're just likely to find different kinds of flaws--i.e., flaws that fit comfortably with their sense of identity. For example, non-scientifically-literate "skeptics" might be content with dismissing the work of climate scientists because "Al Gore is fat," whereas a more scientifically literate "skeptic" might choose something like "aCO2 is a trace gas." Someone in the middle might go with "But they adjusted the temperatures."

==> ...or find experts they trust who have found flaws, and reject the claims.

I would say... or find people with some technical background (or who at least have a patina of such) who reject the claims, and determine that those people are experts to be trusted. Your phrasing begs the question of how they determine which "experts" to trust.

==> Their rejection is based on the technical counter-argument, not simply that they "don't like" the conclusion, or that they'd lose the respect of their social circle if they admitted it.

This seems to assume that these processes are running at a fully conscious level. I would guess that they aren't. It isn't as if someone says, "Hmmm. Let me find someone who disagrees and then I'll argue with plausible deniability that they are an expert." And neither is it simply, "Hmmm, that information runs counter to my beliefs, so let me find some 'expert' who I can trust based on completely objective criteria, and see what they have to say about the technical components."
As I said above, the choice of which expert to trust is not independent of identity-related belief systems. It isn't as if technical expertise is assigned in some objective and identity-independent fashion.

As near as I can tell, your portrayal pretty much ignores the basic mechanism of motivated reasoning/cultural cognition.


==> We can tell this because those without the ability to find the counter-arguments (on their own or from others) are not as inclined to reject the uncomfortable conclusion, although still subject to the same social pressures. If it was simply a matter of fitting in with your social circle, the scientifically illiterate would be as biased.

I don't agree. First, there are more than just "social pressures" in play. The "motivations" behind MR or cultural cognition, IMO, are a grab-bag of social pressures and individual-level pressures. People have a psychological need to confirm that they are right.

In the post downstairs, I quoted from a review of Haidt's book:

Haidt approvingly quotes Phil Tetlock who argues that “conscious reasoning is carried out for the purpose of persuasion, rather than discovery.” Tetlock adds, Haidt notes, that we are also trying to persuade ourselves. “We want to believe the things we are saying to others,” Haidt writes. And he adds, “Our moral thinking is much more like a politician searching for votes than a scientist searching for truth.”

I don't agree with a main thrust of Dan's argument, which weights almost completely the pressure to conform to a "group." There is another, closely related pressure in play with cultural cognition, IMO: the psychological need to confirm one's identity to oneself--as someone who is right, as someone who doesn't have to reject major tenets of his group's social identity. The causation is more multi-factorial, IMO.

And that belief on my part leads to a 2nd reason why I think your mechanism scenario isn't sufficient: people who develop scientific literacy skills have a different set of motivators than those who don't. The condition of being scientifically literate isn't merely a product of random selection, or genetic disposition. It is associated with a variety of social and cultural influences. Thus, someone who is more technically literate has more of a motivation for rejecting conclusions in a technically-related area. My guess is that people (at least in the U.S. - my guess is that the scientifically-literate/greater polarization about climate change association that Dan points to probably takes something of a different shape in other cultural contexts) who are more technically-oriented are at least to some degree more motivated to defend (or aggress against) opposing technical viewpoints, and particularly in heavily politicized contexts.

In other words, those who are less-scientifically literate care less about confirming their identity through arguing about climate change. They might have a different set of skills for confirming their identity if they cared as much (they could just say that "Al Gore is fat" rather than "aCO2 is a trace gas")...but it isn't that they are less skilled at confirming their identity by rejecting opposing views.


==> Self-reported explanations of motivation are not entirely to be relied upon, but they're not entirely valueless, either.

I wasn't suggesting that they are completely valueless.

==> It would explain why people on *both* sides of the argument think they're being properly scientific and sceptical in forming their beliefs. I'm not sure why you're so determined to reject them.

I haven't "rejected" them. I am dubious about self-report bias. Here's an example: I remember reading a study about the impact of "Climategate" on public opinion. First, it described how only a minority of people knew much at all about "Climategate," and then of that subset, only a small % knew many of the details. Then, of that subset, some subset reported that "Climategate" reduced their concern about climate change while a smaller subset said it increased their concern about climate change. Then, lo and behold, there was an associated set of political beliefs for each of those subsets, respectively. My assumption then, rather than taking people's self-report as valid, was that without a pre-test/post-test analysis we simply don't know whether people's views actually changed as a result of "Climategate" or whether people (probably unconsciously) used "Climategate" as a method for confirming their pre-existing biases. But I have yet to see a "skeptic" accept that such a process might have taken place, as they go on to assert (without validated evidence) that "Climategate" caused a "crisis" in confidence in climate science and, indeed, contributed significantly to an overall crisis in the public's confidence in science (and scientists and scientific institutions).


==> The vast majority of *everybody* has no idea what the research actually says. And yet we have a strong association of climate catastrophe belief with left-wing ideology to explain.

Well, I don't know exactly what you mean by "climate catastrophe belief," so it's a bit difficult to respond to that. Does it mean a belief that we are incurring risk with BAU, or does it mean that we are certainly all going to die w/o immediate cessation of aCO2 emissions, or something in between, or both?

But beyond that, I don't think that the explanation is very difficult at all. To a large degree, the ideological split in public views on climate change is explainable by identity-group orientation. It certainly isn't because the bulk of the public has studied the science with the skills needed to interpret the science.

==> So why do believers believe what they do? Because they've paid attention to the science? (Obviously not.) Because they're better at recognising trustworthy scientists? (How? Why?) Or because they're politically biased, and trying to fit in with their own crowd, and saying what they're expected to say?

Again, I think that the causal mechanism you describe isn't sufficiently comprehensive (it isn't only "fit[ting] in with their own crowd," IMO)... but I'm not sure why you're asking that question of me. I think I've made it quite clear from our (many!) previous discussions about this that I attribute a great deal of the ideological split in views on climate change to motivated reasoning/cultural cognition... I will note (as I have before) that I don't think it's as simple as:

ideology --> views on climate change,

but that:

Pattern-recognition and psychological needs --> identity-protection and defense ---> motivated reasoning/cultural cognition ---> views on climate change.


==> Social scientists studying political bias are one of the best examples of political bias going.

I don't agree. The scientific method is one important hedge against political bias. It doesn't eliminate political bias, and we can't assume that everyone utilizing the scientific method is better at controlling for bias than everyone who uses other reasoning processes, but it is a useful hedge. And there is, of course, the possibility that social scientists by nature are more heavily "motivated" to confirm political identity (which is something that would be consistent with my earlier arguments, actually) than non-scientists or perhaps even physical scientists...

But I do have some degree of faith in the scientific process and think that in the end, the aggregate output of social scientists is less influenced by political bias than the aggregate output of those who don't even try to apply the scientific method to their reasoning processes. Of course, trying to measure that would be incredibly difficult and I think largely a waste of time.

May 8, 2016 | Unregistered CommenterJoshua

NIV & Joshua:

Regarding my comment about conservatives and climate change, I merely meant to point out that there *IS* a clear relationship between conservative ideology and belief in human-caused climate change. The correlation can be seen in raw data, unlike this author's statement of the relationship between conservative ideology and bullshit receptivity. Do you argue that this relationship between ideology and climate change belief doesn't exist? Because I have access to a ton of data from various surveys that demonstrate this, and I'm sure that Dan has posted on this before.

May 8, 2016 | Unregistered CommenterAsheley

I've been enjoying reading the series of recent posts and guest posts. I've thought of various comments but never had occasion to post them. I've been traveling, with significant time spent stuck in transit in Terminal F at the Philly airport, between there and the rest of the planet. But fortuitously, the material I brought along with me had to do with water law and climate and was low on differential equations and chemical formulas! http://www.npr.org/2016/05/08/477257942/a-man-scribbles-in-a-notepad-terrorist-plot-or-simple-math-problem

Reading that made me both annoyed with our TSA police-state mindset and fascinated by the research work of the individual who was victimized in this case, Guido Menzio: "Menzio says ironically his work revolves around something called search theory. He studies how much information someone should gather before making a decision. He says the decision to turn the plane around was made before anyone even spoke to him.[!!!!] The episode, Menzio writes, reveals a need to redesign some aspects of air travel security." His prior work seems to have to do with employment, http://web-facstaff.sas.upenn.edu/~gmenzio/research.htm but I'm wondering if his recent experiences haven't made him interested in other matters of public policy. And there he is, conveniently located in the real-world portion of Philadelphia! I'd like to nominate Guido Menzio for a guest post here. Dan Kahan, I'd be very appreciative if you would be willing to make the effort to contact Menzio and make such a request.

Here, I am very impressed by the work done by Asheley Landrum. It is not true, as Dan's headline states, that merely having the data is a safeguard against empirical bullshit. It is also necessary for someone to step forward, do the work of analyzing it, and then do their best to bring the matter to public attention. Kudos to Asheley Landrum for being that person! PLOS deserves some credit for providing a forum on which this sort of analysis is possible. In theory this ought to lead to something better than the closed-door, quiet adjustments that such instances have brought in academia and non-public-access journals in the past. But so far PLOS seems to have given only a defensive, and not data-driven, response online: http://journals.plos.org/plosone/article/comment?id=info%3Adoi%2F10.1371%2Fannotation%2F1deff086-7798-4bae-abab-a5a71dc48122. And where is the rest of the social science community? I find this disappointing, but perhaps this process takes more time.


I'd only partially agree with Asheley Landrum that: "I think we can all agree that politicians are basically bullshit artists." The candidates I support most strongly are not like that; my candidates are actually telling it like it is. Political candidates do rely on a shorthand, or memes. These are things that resonate with those of us who are trying to do a quick sort on "are they with us or against us?" I used to think that the public of yore, pre-TV and Internet, was blessed with a mammoth attention span, and thus able to listen raptly at lengthy events like the Lincoln-Douglas debates. I've since learned that these were more like carnivals, with the audience wandering off to get a bite to eat or a drink and then rushing back if it sounded like things might be getting really interesting. This makes me feel better about my own debate-viewing habits. I feel that I know a lot about a few of my favorite candidates, but some I'm willing to vote for not on a lengthy match-up of possible issues, but on a checklist of a few key ones. Is the person (mostly) with me or against me, or at least likely to be a better person to support than the alternative?

In order to reach people with short attention spans, to fit into 30-second spots, and also to not be misquoted, politicians learn to speak in sound bites, and also to list affiliations, not just political parties but also other groups that might generate a base of followers. They need to do this while also being cognizant of the need to acquire the necessary funding. All of this used to be accomplished by what are often called "pivots" between appeals to various bases and a more general position. This was easier to accomplish in the days before video replays were possible. OK, bullshit artists have an advantage here.

I think that this is where it would be enlightening to have Guido Menzio weigh in. How much information should we think we need to make a decision?

May 9, 2016 | Unregistered CommenterGaythia Weis

Asheley -


==> I merely meant to point out that there *IS* a clear relationship between conservative ideology and belief in human-caused climate change.

As there is a clear relationship between liberal ideology and belief in human-caused climate change.


==> Do you argue that this relationship among ideology and climate change belief doesn't exist?

Certainly not.

May 9, 2016 | Unregistered CommenterJoshua

Re: Asheley Landrum's most recent comment: I think that it ought to be considered that climate change "belief" is not so much about climate change, but has rather to do with underlying attitudes towards development, "progress" and bearing the social costs of economic development.

I was recently contemplating all this while at a public energy forum, in Longmont, Colorado, sponsored by "Flatirons Responsible Energy Action" a Republican/industry pro-fracking group. My new hometown, the City of Longmont passed an anti fracking ordinance, recently largely nullified by a decision of the Colorado State Supreme Court. http://www.timescall.com/longmont-local-news/ci_29839751/colo-supreme-court-strikes-down-longmont-fracking-ban One of the forum participants was our Congressional Representative, Republican Ken Buck. Ken is usually portrayed as a climate change denier. What he said at this meeting was that he wasn't against the idea that the climate was warming, he just had "serious doubts" that it was human caused. He then immediately segued into an argument about fossil fuels, pointing out that naysayers had thought that we would run out of oil after Prudhoe Bay, but now look at us, with fracking we have more oil and gas than we know what to do with! As he pointed out, the naysayers were wrong. And he indicated that there will always be something.

This fits with previous posts here in which it is demonstrated that if presented with positive potential technological solutions, like geo-engineering, then people are more likely to say they believe in anthropogenic climate change.

There is, in European "Western" culture, a strong thread of "develop now, deal with the consequences later," and of faith that in so doing, our culture will have, and further develop, the technological and economic strengths needed to move forward.

This contrasts with attitudes of other cultures, such as many Native American groups, who take a longer term outlook towards sustainability:

"The Peacemaker taught us about the Seven Generations. He said, when you sit in council for the welfare of the people, you must not think of yourself or of your family, not even of your generation. He said, make your decisions on behalf of the seven generations coming, so that they may enjoy what you have today."

Oren Lyons (Seneca)
Faithkeeper, Onondaga Nation

http://www.7genfund.org/oren-lyons

In the case of Eastern Colorado and Ken Buck, I think that the case of climate change has some interesting parallels to the origins of extensive white settlement in the area. This came after the genocide at Sand Creek: https://www.nps.gov/sand/index.htm. But also after the scientific expeditions of John Wesley Powell, whose 1879 work, "Report on the Lands of the Arid Region of the United States," is a profoundly scientific work on water availability and agricultural potential. Mostly about the limitations thereof, and still informative today. http://pubs.usgs.gov/unnumbered/70039240/report.pdf

But much of the plains, including Eastern Colorado, was actually settled on a philosophy of "Rain Follows the Plow." This had some optimistic ideas about bioengineering: some homesteads could be "proved" simply by planting trees, on the theory (which does work in reverse with deforestation in the Amazon) that planting forests would support moisture. As noted here: http://www.wired.com/2014/06/fantastically-wrong-rain-follows-the-plow/

"The spread of the myth, according to Henry Nash Smith in his essay “Rain Follows the Plow: The Notion of Increased Rainfall for the Great Plains,” can be largely blamed on America’s most reliable villain: greed. It was those standing to profit from a prosperous West—the real estate speculators, the railroads, and the politicians—who propagated the tale, for “the pressures making for high estimates of the economic potentialities of the plains were strong and varied,” Smith writes."

Ken Buck would have felt quite comfortable with this, I think.

"Rain follows the plow" bit the dust in the 1930s Dust Bowl. There were some attempts at native grasslands restoration, but by and large the economies of this area were rejuvenated by the invention of circle pivot irrigators and deep wells to the Ogallala aquifer. Look at a satellite map of Eastern Colorado and adjacent areas of Nebraska and Kansas and what you see are the green "lilypads" of circle-irrigated fields, mostly corn. The Ogallala aquifer is being significantly overdrafted and is therefore now being depleted. Then what? Oil and gas fracking is a new lifeline. Some people dream of a pipeline from the Mississippi or points north in Canada, both of which would have to run considerably uphill. Most who can, I believe, will sell out for what they can get of the big bucks and move on.

The driving force isn't antipathy to science, in my opinion, it is short term greed.

Climate change will have winners and losers. The residents of my former home had an unseasonably warm and dry day in a warm and dry spring to celebrate their tribal alliances by lining up on opposite sides of the road from each other, for the occasion of a Trump rally. http://getwhatcomplanning.blogspot.com/2016/05/a-prim-and-proper-trump-protest-in.html. Lynden is the center of Whatcom County's dairy and berry business. While there are labor tensions, almost all of the workforce is immigrant, and largely of indigenous Mexican heritage. http://crosscut.com/2016/04/west-coast-farmworkers-gear-up-for-a-long-fight/. Lynden has a Dutch heritage; they are the descendants of clever immigrants who realized that the "worthless swampland" left behind after extensive cedar forest logging would make excellent farmland. That era's version of "there's always something." But these days, many of the fields are owned by the Sikh community. Dairy products, along with the antique shops in Lynden's downtown, are dependent on border-crossing Canadian customers. Building walls makes no sense, except as an expression of anger and resentment.


Somehow while we are lining up on opposite sides of roads, talking about walls, London just elected a Muslim mayor http://www.bbc.com/news/election-2016-36232392.

Canada managed to get Trudeau after Harper. Do we have to go through some awkward transitional phase first?

Asheley Landrum's website notes that: "Dr. Landrum's work focuses on two interrelated themes underlying social learning: learning about other people as communicators of information (learning to trust) and leveraging that knowledge to learn from others (trusting to learn)."

Sounds great, and I look forward to learning more and helping to figure out how to get there.

May 9, 2016 | Unregistered CommenterGaythia Weis

As there is a clear relationship between liberal ideology and belief in human-cause climate change.

Is it really that clear, Joshua? I understand you're trying to make a symmetry argument between (x) and (1-x), but isn't this the time to pull out the Lewandowsky paper comparing Australians to Americans and use their observation that political will to climate action prevails in people who aren't identity-committed to the free market to break the symmetry?

People are more inclined to check arguments if their conclusions go against their prior beliefs.

Quoting this for truth, just so NiV isn't the only one saying it here. I can tell everyone that's exactly how my own brain works; I'm very conscious of this process when I'm doing it. And in general, I've yet to hear an actual argument against the rationality of this approach in an attention-limited environment.

May 9, 2016 | Unregistered Commenterdypoon

dypoon -

I couldn't quite follow what you were saying... but I don't see how saying that there is an association between conservative ideology and beliefs on climate change doesn't pretty much necessarily imply that there is a parallel association on the other side. And certainly there is evidence, accordingly.

It is true that some data show that "independents," and even Republicans, are closer to Dems on the issue of climate change than Tea Partiers are to Republicans or Indies...so I suppose that you could say that a "neutral" position aligns with "concern" about continuing emissions. In that sense, perhaps there isn't a completely symmetrical balance in the magnitude of the association. Was that your point?

But even if that is true, I think that there is a blanket reality that views on climate change (across the spectrum) are associated with political ideology, at least in the United States (a very important caveat), if you step over into the realm of drawing conclusions such as that increased scientific literacy or reliance on abstract reasoning causes increased polarization in views on climate change, as, IMO, that suggests that cultural/social characteristics are at least partially causal (with political views serving as a mediator).

May 9, 2016 | Unregistered CommenterJoshua

dypoon -

One more thing...

==> Quoting this for truth, just so NiV isn't the only one saying it here. I can tell everyone that's exactly how my own brain works; I'm very conscious of this process when I'm doing it. And in general, I've yet to hear an actual argument against the rationality of this approach in an attention-limited environment.

Of course that's an approach that most (all) people take. But the interesting question, IMO, is why people do that. IMO, they largely do it to prove their own opinions right, specifically to find flaws in the counterargument, and to run the conflicting evidence through a confirmation-bias filter as they do so - not as some objective attempt to openly explore whether they are, in fact, wrong. That isn't to say that I never change my mind as I read further, or that there isn't some element of "truth-seeking" when people dig deeper into opposing arguments - just that the "motivation" for "checking arguments" that stand in contrast to their own beliefs shouldn't be taken at face value. Simply stating that approach as a general tendency and accepting its rationality doesn't go deep enough, IMO.

May 9, 2016 | Unregistered CommenterJoshua

@Asheley--

I'd agree w/ you about correlation. There's directional bias, in that sense.

but you'd agree, right, that correlation between ideology & "belief" isn't evidence that liberals are forming their opinions on the issue in a manner that evinces better, less biased *reasoning*?

One might infer that, except for all the evidence we have at this point that "beliefs" about climate change are uncorrelated with understanding of even the most basic rudiments of climate science: people who believe & who don't believe hold equally absurd understandings -- the same ones, in fact.

What they 'believe' measures only their political or cultural identity. Getting it "right" is just the "luck" of being on the team that happens to have the right identity-defining stance on that issue, not a testament to the superiority of one side's reasoning on the issue in question or generally.

The sad thing is that those who reason the most proficiently on both sides are the most likely to be polarized on "belief..."

I suppose the only sadder thing is that people don't generally recognize this.

May 10, 2016 | Registered CommenterDan Kahan

Dan and Asheley, I think you have some major competition on science, bullshit and even p-hacking here from John Oliver. http://news.nationalpost.com/arts/television/is-science-bulls-t-john-oliver-attacks-the-medias-tendency-to-turn-all-scientific-studies-into-gossip.

(Note that Gossip is the new Bullshit for proper newspaper headlines).
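For anyone unfamiliar with the term, the p-hacking worry can be illustrated with a short sketch (purely synthetic random numbers, not a simulation of any real study): test enough completely unrelated variable pairs and a predictable fraction will look "significant" by chance alone.

```python
# Toy simulation of the multiple-comparisons trap behind "p-hacking":
# generate many pairs of completely unrelated variables and count how
# often a naive significance test flags them anyway. All data here are
# synthetic random draws; nothing refers to a real study.
import math
import random

random.seed(0)

n = 100       # observations per variable
trials = 200  # number of unrelated variable pairs "tested"
hits = 0

for _ in range(trials):
    x = [random.gauss(0, 1) for _ in range(n)]
    y = [random.gauss(0, 1) for _ in range(n)]
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    r = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)
    # For independent data, r * sqrt(n - 2) is roughly standard normal,
    # so |.| > 1.96 approximates the usual p < .05 threshold.
    if abs(r) * math.sqrt(n - 2) > 1.96:
        hits += 1

print(f"{hits} of {trials} unrelated pairs looked 'significant'")
```

Roughly 5% of the pairs clear the threshold, which is exactly the point: run enough tests and publish only the "hits," and the bullshit writes itself.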

May 10, 2016 | Unregistered CommenterGaythia Weis

@Gaythia-- well, no need to compete. We can all be on the same side. Thanks for the pointer!

May 10, 2016 | Registered CommenterDan Kahan

"Regarding my comment about conservatives and climate change, I merely meant to point out that there *IS* a clear relationship between conservative ideology and belief in human-caused climate change. [...] Do you argue that this relationship among ideology and climate change belief doesn't exist?"

No. It was actually the immediately preceding sentence that I'd noticed:
"If we actually have really solid evidence that conservatives are wrong on something, that is totally great and fine to publish. For instance, we can demonstrate a really clear liberal versus conservative bias in belief of climate change."

There's certainly solid evidence of a strong liberal/conservative difference in beliefs about whether conservatives are wrong on climate change - but given that this is precisely one of the topics where political biases matter, a scientist should be asking whether they're only judging the conservative position on climate change to be wrong because they're a liberal themselves and that's what liberals believe, or because it's actually wrong.

It's the symmetry question, which has long been discussed around here. Conservatives and liberals are affected by the same biases - everybody's brains work more or less the same way. But having identified such a bias, everybody has a tendency to only apply the principle to their ideological opponents, never to themselves.

Most social scientists are liberals (diversity and nondiscrimination targets don't always apply in academia), so most social science papers assume liberal positions to be the truth, and therefore direct their attention to figuring out what cognitive biases lead conservatives especially to get everything so wrong. They see an asymmetry the perception of which is necessarily affected by their own biases - of the exact type of bias they're studying! Hence the irony.

May 11, 2016 | Unregistered CommenterNiV
