
The "asymmetry thesis": another PMRP issue that won't go away

I feel like I've done 10^8 posts on this .... That's wrong: I counted, and in fact I've done 10.3^14.

But that's because it's a difficult question. Or at least is if one treats it as one of "measurement" & "weight of the evidence."  I remain convinced that it is not of great practical significance--that is, even if "motivated reasoning" and like dynamics are "asymmetric" across the ideological spectrum (or cultural spectra) that define the groups polarized on policy-consequential facts, the evidence is overwhelming and undeniable that members of all such groups are subject to this dynamic, & to an extent that makes addressing its general impact -- rather than singling out one or another group as "anti-science" etc. -- the proper normative aim for those dedicated to advancing enlightened self-govt.

But issues of "measurement" & "weight of the evidence" etc. are still, in my view, perfectly legitimate matters of scholarly inquiry. Indeed, pursuit of them in this case will, I'm sure, enlarge knowledge, theoretical and practical.

"Asymmetry" is an open question--& not just in the sense that nothing in science is ever resolved but in the sense that those on both "sides" (i.e., those who believe politically motivated reasoning is symmetric and those who believe it is asymmetric) ought to wonder enough about the correctness of their own position to wish that they had more evidence.

Here's an excerpt from my The Politically Motivated Reasoning Paradigm survey/synthesis essay addressing the state of the "debate":

4. Asymmetry thesis

The “factual polarization” associated with politically motivated reasoning is pervasive in U.S. political life. But whether politically motivated reasoning is uniform across opposing cultural groups is a matter of considerable debate (Mooney 2012).

In the spirit of the classic “authoritarian personality” thesis (Adorno 1950), one group of scholars has forcefully advanced the claim that it is not. Known as the “asymmetry thesis,” their position links biased processing of political information with characteristics associated with right-wing political orientations. Their studies emphasize correlations in observational studies between conventional ideological measures and scores on self-report reasoning-style scales such as “need for closure” and “need for cognition” and on personality-trait scales such as “openness to experience” (Jost, Glaser, Kruglanski & Sulloway 2003; Jost, Hennes & Lavine 2013).

But the research that the “neo-authoritarian personality” school features supplies weak evidence for the asymmetry thesis. First, the reasoning-style measures that they feature are of questionable validity. It is a staple of cognitive psychology that defects in information processing are not open to introspective observation or control (Pronin 2007)--a conclusion that applies to individuals of high as well as more modest cognitive proficiency (West, Meserve & Stanovich 2012). There is thus little reason to believe a person’s own perception of the quality of his reasoning is a valid measure of the same.

Indeed, tests that seek to validate such self-report reasoning-style scales consistently find them to be inferior to performance-based measures such as the Cognitive Reflection Test and Numeracy in predicting the disposition to resort to conscious, effortful information processing (Toplak, West & Stanovich 2011; Liberali, Reyna, Furlan, Stein & Pardo 2012). Those measures, when applied to valid general population samples, show no meaningful correlation with party affiliation or liberal-conservative ideology (Kahan 2013; Baron 2015).

More importantly, there is no evidence that individual differences in reasoning style predict vulnerability to politically motivated reasoning. On the contrary, as will be discussed in the next part, evidence suggests that proficiency in dispositions such as cognitive reflection, numeracy, and science comprehension magnifies politically motivated reasoning (Fig. 6).

Ultimately, the only way to determine if politically motivated reasoning is asymmetric with respect to ideology or other diverse systems of identity-defining commitments is through valid experiments. There is a collection of intriguing experiments that variously purport to show that one or another form of judgment—e.g., moral evolution, willingness to espouse counter-attitudinal positions, the political valence of positions formed while intoxicated, individual differences in activation of “brain regions” etc.—is ideologically asymmetric or symmetric (Thórisdóttir & Jost 2011; Nam, Jost & Van Bavel 2013; Eidelman et al. 2012; Crawford & Brandt 2013; Schreiber, Fonzo et al. 2013). These studies vary dramatically in validity and insight. But even the very best and genuinely informative ones (e.g., Conway, Gornick, et al. 2015; Liu & Ditto 2013; Crawford 2012) are in fact examining a form of information processing distinct from PMRP and with methods other than the PMRP design or its equivalent.

One study that did use the PMRP design found no support for the “asymmetry thesis” (Kahan 2013). In it, individuals of left- and right-wing political outlooks displayed perfectly symmetric forms of politically motivated reasoning in evaluating evidence that people who reject their group’s position on climate change have been found to engage in open-minded evaluation of evidence (Figure 5).

But that’s a single study, one that like any other is open to reasonable alternative explanations that themselves can inform future studies. In sum, it is certainly reasonable to view the “asymmetry thesis” issue as unresolved. The only important point is that progress in resolving it is unlikely to occur unless the question is studied with designs that reflect the PMRP design or ones equivalently suited to support inferences consistent with the PMRP model.


Adorno, T.W. The Authoritarian personality (Harper, New York, 1950).

Baron, J. Supplement to Deppe et al. (2015). Judgment and Decision Making 10, 2 (2015).

Conway, L.G., Gornick, L.J., Houck, S.C., Anderson, C., Stockert, J., Sessoms, D. & McCue, K. Are Conservatives Really More Simple‐Minded than Liberals? The Domain Specificity of Complex Thinking. Political Psychology (2015), advance on-line, DOI: 10.1111/pops.12304.

Crawford, J.T. The ideologically objectionable premise model: Predicting biased political judgments on the left and right. Journal of Experimental Social Psychology 48, 138-151 (2012).

Eidelman, S., Crandall, C.S., Goodman, J.A. & Blanchar, J.C. Low-Effort Thought Promotes Political Conservatism. Pers. Soc. Psychol. B. (2012).

Jost, J.T., Glaser, J., Kruglanski, A.W. & Sulloway, F.J. Political Conservatism as Motivated Social Cognition. Psychological Bulletin 129, 339-375 (2003).

Jost, J.T., Hennes, E.P. & Lavine, H. “Hot” political cognition: Its self-, group-, and system-serving purposes. in Oxford handbook of social cognition (ed. D.E. Carlson) 851-875 (Oxford University Press, New York, 2013).

Kahan, D.M. Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making 8, 407-424 (2013).

Liberali, J.M., Reyna, V.F., Furlan, S., Stein, L.M. & Pardo, S.T. Individual Differences in Numeracy and Cognitive Reflection, with Implications for Biases and Fallacies in Probability Judgment. Journal of Behavioral Decision Making 25, 361-381 (2012).

Nam, H.H., Jost, J.T. & Van Bavel, J.J. “Not for All the Tea in China!” Political Ideology and the Avoidance of Dissonance. PLoS ONE 8(4), e59837, doi:10.1371/journal.pone.0059837 (2013).

Pronin, E. Perception and misperception of bias in human judgment. Trends in cognitive sciences 11, 37-43 (2007).

Thórisdóttir, H. & Jost, J.T. Motivated Closed-Mindedness Mediates the Effect of Threat on Political Conservatism. Political Psychology 32, 785-811 (2011).

Toplak, M., West, R. & Stanovich, K. The Cognitive Reflection Test as a predictor of performance on heuristics-and-biases tasks. Memory & Cognition 39, 1275-1289 (2011).

West, R.F., Meserve, R.J. & Stanovich, K.E. Cognitive sophistication does not attenuate the bias blind spot. Journal of Personality and Social Psychology 103, 506 (2012).



Reader Comments (9)

I remain convinced that it is not of great practical significance--that is, even if "motivated reasoning" and like dynamics are "asymmetric" across the ideological spectrum (or cultural spectra) that define the groups polarized on policy-consequential facts, the evidence is overwhelming and undeniable that members of all such groups are subject to this dynamic, & to an extent that makes addressing its general impact -- rather than singling out one or another group as "anti-science" etc. -- the proper normative aim for those dedicated to advancing enlightened self-govt.

Indeed. That context is so important, but it seems that the focus on proving asymmetry (which, of course, just by coincidence I'm sure, always seems to be more prevalent among the ideological groups not doing the analysis) garners much more energy and attention.

Although, I suppose, the asymmetry I see (between focus on the cross-group differences and the focus on much greater cross-group similarities) could be a product of my "motivations." :-)

Of course, another important consideration, (at least IMO), is that intra-group differences are probably much greater than inter-group differences.

December 21, 2015 | Unregistered CommenterJoshua

Maybe I misunderstood the experiment you mention, but as a long-time reader I think I am part of the audience for which you want this to be obvious, so without further research...

If I am confident that I have come to (say) a skeptical position through open-minded reflection, then I ought to believe the probability that skeptics score well on a test is greater for a test that "supplies good evidence of how reflective and open-minded someone is". That is, the hypothesis that the quoted phrase is true should be deemed more likely after I am given the datum that skeptics scored well. Hence, the pattern in the graphs is what I would expect from a population of ideal Bayesians who have come to their distinct positions from rational reflection on different prior information.

To be sure, zero political motivation for everyone is one type of symmetry, so this is not a counter to the main point of the post. But when you say this design is using PMRP, in contrast to others, it leaves me wondering what I have misunderstood.

December 22, 2015 | Unregistered CommenterAn Igyt


In your illustration, you are indeed relying on your priors concerning (a) your own open-mindedness & reflectiveness & (b) the agreement of other open-minded & reflective people w/ you on climate change in order to determine the validity of the CRT.

Nothing in Bayesianism says you can’t do that, I suppose. But if you do, you will never revise your priors on (a) or (b), no matter how strong the contrary evidence you encounter happens to be.

The paper offers an illustration of confirmation bias that is pretty comparable to this:

I think the odds that human beings are causing global warming [or “that deep geologic isolation is a reliable form of nuclear waste disposal”; or “concealed-carry laws increase violent crime”] is 10^-4 : 1. The National Academy of Sciences just issued an “expert consensus report” concluding that humans are causing global warming [or “that deep geologic isolation of nuclear wastes is safe” or “that there is no evidence that concealed carry laws increase crime”]. That’s not right—so I’ll assign the report a likelihood ratio of 1, or less [cf. Nyhan, Reifler, Richey & Freed 2014] since obviously the authors of the report were not genuine experts.

In order to avoid confirmation bias, you must determine the weight or likelihood ratio of new information on some basis independent of your priors. So here you must rely on "truth-convergent criteria" independent of your priors to determine the validity of the CRT--if you have any interest in whether your current beliefs about your own open-mindedness & reflection, & about others who are skeptical of climate change, are correct.

Does this seem right?

December 22, 2015 | Registered CommenterDan Kahan

There might be more to this than I thought at first. My illustration and the example on the paper are indeed comparable, so let's stick with the latter.

When you speak of a likelihood ratio for "the report", I take this to mean the ratio for two hypotheses, one being, "These are real experts, honest and forthright," the other the converse. This is a perfectly good set of hypotheses, and there is no reason I ought to assign it a degenerate prior (i.e., give probability one to either of them). To dismiss the reasoning you describe as "bias" seems to assert the contrary.

Suppose I think there is 90% probability that NAS is upright. Learning that NAS has issued a report proclaiming something I believe has only a one in 10,000 chance of being true should cause a large (negative) update of my belief that NAS is upright, and a small (positive) update of my other belief.

So while it is true good Bayesians must determine the likelihood of new information on some basis independent of our priors, it is far from trivial what set of hypotheses we ought to consider as the space over which we define these distributions.
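The joint-updating idea in the last two paragraphs can be made concrete with a small numerical sketch. All of the likelihood values below are invented for illustration; only the 90% and 1-in-10,000 priors come from the comment itself:

```python
# Joint Bayesian update over two binary hypotheses:
#   A = "humans are causing global warming"
#   U = "NAS is upright (honest, forthright experts)"
# Priors, taken as independent and mirroring the figures above:
p_A, p_U = 1e-4, 0.90

# Hypothetical likelihoods of E = "NAS issues a pro-AGW report":
# an upright NAS tracks the truth; a non-upright one says it regardless.
def likelihood(a, u):
    if u:
        return 0.99 if a else 0.01
    return 0.90

# Prior x likelihood for each cell of the 2x2 hypothesis space
cells = {}
for a in (True, False):
    for u in (True, False):
        prior = (p_A if a else 1 - p_A) * (p_U if u else 1 - p_U)
        cells[(a, u)] = prior * likelihood(a, u)

total = sum(cells.values())
post_U = (cells[(True, True)] + cells[(False, True)]) / total
post_A = (cells[(True, True)] + cells[(True, False)]) / total

print(f"P(NAS upright | report) = {post_U:.3f}")   # large drop from 0.90
print(f"P(AGW | report)         = {post_A:.6f}")   # small rise from 0.0001
```

With these (made-up) numbers the report mostly discredits NAS rather than shifting the AGW belief, which is exactly the "large negative update of one belief, small positive update of the other" pattern described above.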

Perhaps to assess real bias, we ought ask for priors on the probity of NAS as well as the hypotheses NAS will judge, and see how both are updated when a finding is revealed.

A minor aside: I am not climate change skeptic, at least if the hypothesis is phrased as "human beings are causing global warming". That was just for illustration.

December 22, 2015 | Unregistered CommenterAn Igyt


Thx. This is a helpful exchange--you are raising a very reasonable point that other thoughtful people (@MW--are you there?) have raised at various points. It is important for me to figure out not only whether my response to you & them is correct, but also how I can present my position in a way that does a better job anticipating this very reasonable reaction.

Here goes ...

When you speak of a likelihood ratio for "the report", I take this to mean the ratio for two hypotheses, one being, "These are real experts, honest and forthright," the other the converse.

Actually no. The rival hypotheses in the illustration are “human activity is the principal cause of climate change” (H1) & “human activity is not the principal cause of climate change” (H2). The NAS report (E) is “new information.”

Bayes Theorem says that we should revise our assessment of the probability of H1 (expressed in odds) by multiplying it by the likelihood ratio (LR) of the NAS report —Pr(E|H1)/Pr(E|H2).

But Bayes’s Theorem doesn’t tell us how to figure out the LR; it only tells us what to do with it: use it as the factor by which we multiply our prior odds. . . .

In the illustration, the person looking at the NAS report is assigning the NAS report an LR based on its inconsistency with his priors. He’s saying, since I know H1 is true, the NAS report writers must be knaves or fools. A knave/fool would say the evidence supports AGW whether or not H1 or H2 was true; indeed, knaves or fools would be especially likely to say the evidence supports AGW even if H1 was false—that’s the way knaves and fools operate! So LR ≤ 1.

Reasoning in that way, the person in the illustration will never be convinced of AGW, no matter how strong the evidence that he is wrong.
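One way to see the trap: if the likelihood ratio assigned to each new piece of evidence is itself a function of the priors (LR ≤ 1 for anything that contradicts them), no amount of evidence ever moves the belief. A minimal sketch, with invented numbers (the 10^-4 prior odds come from the illustration above; the LR = 10 per item is arbitrary):

```python
# Prior odds on AGW of 10^-4 : 1, then ten successive pieces of
# pro-AGW evidence, each worth LR = 10 on truth-convergent criteria.
prior_odds = 1e-4
evidence = [10.0] * 10

# Truth-convergent reasoner: multiplies prior odds by each LR as given.
odds = prior_odds
for lr in evidence:
    odds *= lr
p_convergent = odds / (1 + odds)

# Motivated reasoner: any evidence contradicting the prior gets LR = 1
# ("the authors of the report must be knaves or fools").
odds = prior_odds
for lr in evidence:
    odds *= 1.0
p_motivated = odds / (1 + odds)

print(f"truth-convergent: {p_convergent:.6f}")  # effectively certain of AGW
print(f"motivated:        {p_motivated:.6f}")   # unchanged from the prior
```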

The only way he'd ever figure out he was wrong about AGW would be to make the effort to assess new evidence based on valid or truth-convergent criteria independent of what he already believes about AGW. But we can see that he is decidedly averse to doing that.

This is, I think, how most people in the world reason. No matter what their position is on AGW. As a result of PMR.

I am interested in testing *that* hypothesis--one that is meaningful in relation to rival ones such as "there is polarization on AGW b/c Rs are closed minded, dogmatic thinkers, unlike Ds, who are open-minded & reflective."

I could, in theory, test the PMR hypothesis by measuring people's AGW beliefs “before” (priors) & “after” (posteriors) being exposed to an NAS Report or some other source of evidence. But it turns out that it is very hard to do—there are “nasty confounds” in such a design.

The occasion for your comment is an experimental design crafted to avoid those confounds.

It is one that seeks to measure the LR that subjects assign *one & the same piece of evidence* on AGW (or scientific consensus on AGW etc) conditional on an experimental manipulation of the consistency of that evidence with the position that predominates among people who share their partisan identity.

If the subjects *do* adjust the LR they assign the evidence based on its consistency with the position that predominates in their group, then that is evidence in favor of the PMR explanation of "fact polarization."
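In that design, the quantity of interest can be recovered directly from subjects' elicited beliefs: the LR a subject implicitly assigned the evidence is just the ratio of posterior odds to prior odds. A sketch, with hypothetical probabilities chosen purely for illustration:

```python
def odds(p):
    """Convert a probability into odds."""
    return p / (1 - p)

def implied_lr(prior_p, posterior_p):
    """LR the subject implicitly assigned: posterior odds / prior odds."""
    return odds(posterior_p) / odds(prior_p)

# One & the same piece of evidence, experimentally manipulated to be
# congenial vs. uncongenial to the position that predominates in the
# subject's group (all probabilities invented):
lr_congenial   = implied_lr(prior_p=0.50, posterior_p=0.80)  # LR = 4
lr_uncongenial = implied_lr(prior_p=0.50, posterior_p=0.50)  # LR = 1

print(lr_congenial, lr_uncongenial)
```

A gap between the two implied LRs for identical evidence is the signature of politically motivated reasoning under this design; symmetric gaps across left- and right-leaning subjects would count against the asymmetry thesis.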

If that is how people reason outside the lab, then they will not alter their positions no matter what the evidence genuinely shows.

December 23, 2015 | Registered CommenterDan Kahan

"Reasoning in that way, the person in the illustration will never be convinced of AGW, no matter how strong the evidence that he is wrong."

He'll never be convinced by Argument from Authority, since there's always an alternative hypothesis available that the Authority isn't one.

But if a more scientific/objective form of evidence is presented instead, and try as they might they are unable to find any flaw, then they will believe. That's why those with low scientific literacy are less polarised - they're less good at finding flaws and reasons not to believe, and thereby forced to fall back more on Authority arguments.

Individually, it converges poorly, because all the checks filtering out falsehoods are only applied half the time even among those capable, and never among those not. But run as an adversarial system, with systematic scepticism by adversaries being applied, society as a whole can converge. The individuals involved will still carry on arguing, but argument is a sign of healthy debate. Problems only arise when one side is excluded or eliminated from the community. A diversity of views is key.

And a happy new year to you all.

December 27, 2015 | Unregistered CommenterNiV


The people who reason this way happily accept authority. So long as it tells them what they want to hear.

And they also will accept the "authority" of anyone presenting them evidence that "only the other side" does that-- but reject the "authority" of anyone who presents them evidence that "their side" does it too.

I'm sure by next yr, everyone will be over that.

December 27, 2015 | Registered CommenterDan Kahan

What is a politically identifiable group, or cultural identity identifier? Not "Democrats" or "Republicans"

For example, here is a politically identifiable group that is both of those at once!

"He is strongest among Republicans who are less affluent, less educated and less likely to turn out to vote. His very best voters are self-identified Republicans who nonetheless are registered as Democrats. It’s a coalition that’s concentrated in the South, Appalachia and the industrial North"

Back in time, (white) blue collar workers might have been union, and Democrats. Factory bosses would be Republican. One was the party of business interests, one of the workers. Prior to that, Republicans were abolitionists, and white southerners were Democrats. Now, of course, there are other cultural identifiers such as abortion.

Trump, of course knows a bit about the cultural cognitive trigger attributes of the word "disgusting". Speaking of Hillary Clinton taking a restroom break he said:

"I know where she went. It's disgusting, I don't want to talk about it," he added. "No, it's too disgusting. Don't say it, it's disgusting."

December 31, 2015 | Unregistered CommenterGaythia Weis


There aren't any groups identifiable by labels of any sort.

It's interesting, though, to try to figure out who supports Trump.

I suspect, though, too, it will not matter much over the long term. Just as "who is tea party" doesn't seem to matter--in my own data collections, the pct who identify themselves as such has shrunk by about 75% in the last yr. They are still there; they still feel the way they do. But the label has little meaning.

January 1, 2016 | Registered CommenterDan Kahan
