Okay, yesterday I promised to say more about the information-exposure experiment we conducted to test the conjecture that “science curiosity” tends to negate politically biased information processing.
Maybe first though I should say something about why this sort of result isn’t an obvious one.
Or actually why it is obvious--but why a result the other way would have been obvious, too!
The best studies, in my view, are ones that test opposing plausible conjectures. This is the upshot of the “more things are plausible than are true” principle, which I attribute to Duncan Watts (2011).
It’s the premise, basically, of his cool book “Everything is Obvious: Once you know the Answer.” Because more explanations for interesting social phenomena are plausible than are actually true, if one doesn’t use empirical methods to extricate the true (or more likely true) explanations from the sea of plausible but false ones, one drowns in just-so story-telling.
But of course, once one does the work of presenting valid empirical evidence that furnishes more reason to believe one plausible conjecture rather than its rival, someone will inevitably trot out the boring OCTUSW--"Of course--that's unsurprising--so what"—response.
To which the answer is, YAIIFTOTBTTWHBEO!, or “Yup; and if I’d found the opposite to be true, that would have been equally ‘obvious’! Aren’t you glad, then, that I actually went to the trouble of trying to generate some actual evidence instead of just lazily taking a bunch of plausible behavioral mechanisms, adding water & stirring—to produce the instant pseudo-science profundity that passes for decision science in op-ed pages & best-selling books?”
Indeed, I make a point of doing only studies about which someone could say, "Of course--that's unsurprising--so what" no matter which way the study result comes out!
But by the time you say all this, of course, Mr. or Ms. OCTUSW has moved on to some other topic about which he or she can make this or some equally penetrating remark.
Why would a result the opposite of what we found—viz., that highly science curious individuals, unlike less curious ones, willingly expose themselves to evidence that confounds their political predispositions—not have been particularly surprising?
The answer is “motivated system 2 reasoning” – or MS2R.
MS2R refers to the tendency of the reasoning proficiencies associated with science comprehension to magnify rather than abate politically motivated reasoning (PMR)—the tendency to conform evidence to one’s political predispositions.
Cognitive reflection, numeracy, science literacy—they all do that (Kahan in press).
Does that outcome seem obvious, too?
But the opposite effect—the tendency of proficiency in these sorts of reasoning abilities to temper political polarization—certainly is plausible, and any evidence that they do so would certainly have been “obvious—once one knew the answer.”
Most cognitive biases—from base rate neglect to the availability heuristic, from ratio bias to the conjunction fallacy—reflect an overreliance on the rapid, intuitive, affect-driven “System 1” information processing as opposed to the more deliberative, conscious, dispassionate “System 2” kind characteristic of good “scientific” reasoning.
PMR compromises truth-convergent Bayesian reasoning in a manner akin to these biases. So why wouldn’t one expect it, too, to be attributable to overreliance on heuristic, System 1 reasoning?
But false. Tons of observational & experimental data at this point show that cognitive reflection, numeracy, science literacy, etc., are all associated with greater political polarization.
Under the conditions that generate PMR, people use their science-comprehension reasoning proficiencies to reinforce their biased assimilation of evidence to the position that coheres with their political predispositions.
Now science curiosity—just like cognitive reflection, numeracy, knowledge of basic science facts, etc.—is a cognitive element of science comprehension.
I went over this in a post a couple of days ago that showed that people high in science curiosity are significantly more likely to be high in science comprehension than are those who are low in science curiosity.
Can you see now why it would have been perfectly plausible to surmise—and perfectly obvious to find—that science curiosity, like these other elements of science comprehension, magnifies political polarization?
But there’s a perfectly respectable conjecture the other way: that unlike these other elements of science comprehension, science curiosity involves an appetite to be surprised—to experience the awe and wonder of contemplating surprising insights derived from the signature methods of science. Maybe the habitual exercise of that disposition develops habits of mind that counteract rather than accentuate PMR.
Maybe! Or maybe not!
Only one way to tell . . . . Do a valid empirical study.
Oh-- & then do another, & another & another—and progressively update one’s views on the respective probability of these two perfectly plausible hypotheses—viz., that science curiosity amplifies PMR & that science curiosity mitigates it.
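That sort of progressive updating across studies is just sequential Bayesian inference over the two rival hypotheses. Here is a minimal sketch of the logic; the likelihood numbers are invented purely for illustration and are not values from any of the studies discussed.

```python
# Toy Bayesian updating over two rival hypotheses:
#   H1: science curiosity mitigates politically motivated reasoning (PMR)
#   H2: science curiosity amplifies PMR
# All likelihoods below are hypothetical, for illustration only.

def update(prior_h1, likelihood_h1, likelihood_h2):
    """Return posterior P(H1) after observing one study's result."""
    prior_h2 = 1.0 - prior_h1
    evidence = prior_h1 * likelihood_h1 + prior_h2 * likelihood_h2
    return prior_h1 * likelihood_h1 / evidence

# Start agnostic: both conjectures "perfectly plausible."
p_h1 = 0.5

# Each study supplies how probable its observed result would be under
# each hypothesis (made-up numbers in which every result fits
# "mitigates" somewhat better than "amplifies").
studies = [(0.8, 0.3), (0.7, 0.4), (0.75, 0.35)]

for lik_h1, lik_h2 in studies:
    p_h1 = update(p_h1, lik_h1, lik_h2)
    print(f"P(mitigates | evidence so far) = {p_h1:.3f}")
```

With these made-up inputs the posterior for the "mitigates" hypothesis climbs with each study; with results that fit "amplifies" better, the same arithmetic would push it the other way.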
So there you go, Mr./Ms. OCTUSW.
We are now ready to (re)turn to the more interesting question: what was the evidence we relied on, and how much reason does it give us to credit the “science curiosity mitigates PMR” hypothesis?
But I've said enough for one day, so I’ll have to do that “tomorrow.”
Again, though, if your insides are being consumed by curiosity about the experiment design and results, don't suffer--just download our paper & read it . . . . right now!
Kahan, D.M. The Politically Motivated Reasoning Paradigm. Emerging Trends in Social & Behavioral Sciences (in press), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2703011
Watts, D.J. Everything is Obvious: Once You Know the Answer: How Common Sense Fails (Atlantic Books, 2011).