Friday, December 11, 2015

"*Scientists* & identity-protective cognition? Well, on the one hand ... on the other hand ... on the *other* other hand ..." A fragment

From something I'm working on. I'll post the rest of it "tomorrow," in fact. But this section will likely end up on the cutting room floor (that's okay; there's lots of stuff down there & eventually I expect to find a use for most of it someplace; it is a bit of a fire hazard, though . . . .)

6. Professional judgment

Ordinary members of the public predictably fail to get the benefit of the best available scientific evidence when their collective deliberations are pervaded by politically motivated reasoning. But even more disturbingly, politically motivated reasoning might be thought to diminish the quality of the best scientific evidence available to citizens in a democratic society (Curry 2013).

Not only do scientists—like everyone else—have cultural identities. They are also highly proficient in the forms of System 2 information processing known to magnify politically motivated reasoning.   Logically, then, it might seem to follow that scientists’ factual beliefs about contested societal risks are likely skewed by the stake they have in conforming information to the positions associated with their cultural groups.

But a contrary inference would be just as "logical." The studies linking politically motivated reasoning with the disposition to use System 2 information processing have been conducted on general public samples, none of which would have had enough scientists in them to detect whether being one matters. Unlike nonscientists with high CRT or Numeracy scores, scientists use professional judgment when they evaluate evidence relevant to disputed policy-relevant facts. Professional judgment consists in habits of mind, acquired through training and experience and distinctively suited to specialized forms of decisionmaking. For risk experts, those habits of mind confer resistance to many cognitive biases that can distort the public's perceptions (Margolis 1996). It is perfectly plausible to believe that one of the biases that professional judgment can protect risk experts from is "politically motivated reasoning."

Here, too, neither values nor positions on disputed policies can help decide between these competing empirical claims. Only evidence can. To date, however, there are few studies of how scientists might be affected by politically motivated reasoning, and the inferences they support are equivocal.

Some observational studies find correlations between the positions of scientists on contested risk issues and their cultural or political orientations (Bolsen, Druckman, & Cook 2015; Carlton, Perry-Hill, Huber & Prokopy 2015).  The correlations, however, are much less dramatic than ones observed in general-population samples.  In addition, with one exception (Slovic, Malmfors et al. 1995), these studies have not examined scientists’ perceptions of facts in their own domains of expertise.

This is an important point. Professional judgment inevitably comprises not just conscious analytical reasoning proficiencies but perceptive sensibilities that activate those proficiencies when they are needed (Bedard & Biggs 1991; Marcum 2012). Necessarily preconscious (Margolis 1996), these sensibilities reflect the assimilation of the problem at hand to an amply stocked inventory of prototypes. But because these prototypes reflect the salient features of problems distinctive of the expert’s field, the immunity from bias that professional judgment confers can’t be expected to operate reliably outside the domain of her expertise (Dane & Pratt 2007).

A study that illustrates this point examined legal professionals. In it, lawyers and judges, as well as a sample of law students and members of the public, were instructed to perform a set of statutory interpretation problems. Consistent with the PMRP design, the facts of the problems—involving behavior that benefited either illegal aliens or "border fence" construction workers; either a pro-choice or pro-life family counseling clinic—were manipulated in a manner designed to provoke responses consistent with identity-protective cognition in competing cultural groups. The manipulation had exactly that effect on members of the public and on law students. But it had no such effect on either judges or lawyers: despite the ambiguity of the statutes and the differences in their own cultural values, those study subjects converged in their responses, just as one would predict if one expected their judgments to be synchronized by the common influence of professional judgment. Nevertheless, this relative degree of resistance to identity-protective reasoning was confined to legal-reasoning tasks: the judges' and lawyers' respective perceptions of disputed societal risks—from climate change to marijuana legalization—reflected the same identity-protective patterns observed in the general public and student samples (Kahan, Hoffman, Evans, Devins, Lucci & Cheng in press). Extrapolating, then, we might expect to see the same effect in risk experts: politically motivated divisions on policy-relevant facts outside the boundaries of their specific field of expertise; but convergence guided by professional judgment inside of them.

Or alternatively we might expect convergence not on positions that are necessarily true but on ones so intimately bound up with a field's own sense of identity that acceptance of them has become a marker of basic competence (and hence a precondition of recognition and status) within it. In Koehler (1993), scientists active in either defending or discrediting scientific proof of "parapsychology" were instructed to review the methods of a fictional ESP study. The result of the study was experimentally manipulated: half the scientists got one that purported to find evidence supporting ESP, the other half one that purported to find evidence not supporting it. The scientists' assessments of the quality of the study's methods turned out to be strongly correlated with the fit between the represented result and the scientists' existing positions on the scientific validity of parapsychology—although Koehler found that this effect was in fact substantially more dramatic among the "skeptic" than the "non-skeptic" scientists.

Koehler’s study reflects the core element of the PMRP design: the outcome measure was the weight that members of opposing groups gave to one and the same piece of evidence conditional on the significance of crediting it. Because the significance was varied in relation to the subjects’ prior beliefs and not their stake in some goal independent of forming an accurate assessment, the study can and normally is understood to be a demonstration of confirmation bias.  But obviously, the “prior beliefs” in this case were ones integral to membership in opposing groups, the identity-defining significance of which for the subjects was attested to by how much time and energy they had devoted to promoting public acceptance of their respective groups’ core tenets. Extrapolating, then, one might infer that professional judgment might indeed fail to insulate from the biasing effects of identity-protective cognition scientists whose professional status has become strongly linked with particular factual claims.
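The logic of that outcome measure can be made concrete with a toy simulation. The sketch below is purely illustrative—the group labels, rating scale, and effect size are hypothetical, not drawn from Koehler (1993). It simply shows the signature pattern the PMRP design looks for: each group rates the methods of one and the same study more favorably when its manipulated result fits the group's prior position.

```python
# Toy simulation of a Koehler-style PMRP design (illustrative only;
# all numbers are hypothetical, not taken from Koehler 1993).
# Two groups of raters ("skeptic", "proponent") each assess the
# methodological quality of the same fictional ESP study, whose
# reported result ("pro" vs "anti" ESP) is randomly assigned.
# Identity-protective reasoning predicts an interaction: quality
# ratings rise when the manipulated result fits the rater's priors.
import random

random.seed(1)

def rate_quality(group, result, bias=1.5):
    """Simulated quality rating: baseline noise plus a boost when
    the study's result agrees with the rater's prior position."""
    fits_prior = (group == "proponent") == (result == "pro")
    base = random.gauss(5.0, 1.0)
    return base + (bias if fits_prior else 0.0)

# 200 simulated raters per group-by-condition cell
data = [(g, r, rate_quality(g, r))
        for g in ("skeptic", "proponent")
        for r in ("pro", "anti")
        for _ in range(200)]

def cell_mean(group, result):
    xs = [q for g, r, q in data if g == group and r == result]
    return sum(xs) / len(xs)

# The PMRP outcome measure: how much each group's rating of the SAME
# evidence shifts depending on the manipulated result.
skeptic_shift = cell_mean("skeptic", "anti") - cell_mean("skeptic", "pro")
proponent_shift = cell_mean("proponent", "pro") - cell_mean("proponent", "anti")
print(f"skeptic shift:   {skeptic_shift:.2f}")
print(f"proponent shift: {proponent_shift:.2f}")
```

Both shifts come out near the injected bias, which is the interaction pattern—opposing groups pulled in opposite directions by the identical stimulus—that distinguishes identity-protective weighting of evidence from a mere difference in priors.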

So we are left with only competing plausible conjectures.  There’s nothing at all unusual about that. Indeed, it is the occasion for empirical inquiry—which here would take the form of the use of the PMRP design or one of equivalent validity to assess the vulnerability of scientists to politically motivated reasoning—both in and outside of the domains of their expertise, and with and without the pressure to affirm “professional-identity-defining” beliefs.

References

Bedard, J.C. & Biggs, S.F. Pattern recognition, hypotheses generation, and auditor performance in an analytical task. Accounting Review, 622-642 (1991).

Bolsen, T., Druckman, J.N. & Cook, F.L. Citizens', scientists', and policy advisors' beliefs about global warming. The ANNALS of the American Academy of Political and Social Science 658, 271-295 (2015).

Carlton, J.S., Perry-Hill, R., Huber, M. & Prokopy, L.S. The climate change consensus extends beyond climate scientists. Environmental Research Letters 10, 094025 (2015).

Curry, J. Scientists and Motivated Reasoning. Climate Etc. (Aug. 20, 2013).

Dane, E. & Pratt, M.G. Exploring Intuition and its Role in Managerial Decision Making. Academy of Management Review 32, 33-54 (2007).

Kahan, D.M., Hoffman, D.A., Evans, D., Devins, N., Lucci, E.A. & Cheng, K. 'Ideology' or 'Situation Sense'? An Experimental Investigation of Motivated Reasoning and Professional Judgment. U. Pa. L. Rev. 164 (in press).

Koehler, J.J. The Influence of Prior Beliefs on Scientific Judgments of Evidence Quality. Org. Behavior & Human Decision Processes 56, 28-55 (1993).

Marcum, J.A. An integrated model of clinical reasoning: dual-process theory of cognition and metacognition. Journal of Evaluation in Clinical Practice 18, 954-961 (2012).

Margolis, H. Dealing with risk : why the public and the experts disagree on environmental issues (University of Chicago Press, Chicago, IL, 1996).

Margolis, H. Patterns, thinking, and cognition : a theory of judgment (University of Chicago Press, Chicago, 1987).

Slovic, P., Malmfors, T., Krewski, D., Mertz, C.K., Neil, N. & Bartlett, S. Intuitive toxicology. 2. Expert and lay judgments of chemical risks in Canada. Risk Analysis 15, 661-675 (1995).

 




Reader Comments (23)

Beyond academia: Who is supplying the politically motivated reasoning? Here is an example from the Cruz campaign of data-driven politically motivated messaging and influence peddling based on psychological profiling created from apparently surreptitiously collected Facebook information: http://www.theguardian.com/us-news/2015/dec/11/senator-ted-cruz-president-campaign-facebook-user-data

"As part of an aggressive new voter-targeting operation, Cambridge Analytica – financially supported by reclusive hedge fund magnate and leading Republican donor Robert Mercer – is now using so-called "psychographic profiles" of US citizens in order to help win Cruz votes, despite earlier concerns and red flags from potential survey-takers."

December 11, 2015 | Unregistered CommenterGaythia Weis

I think that any efforts to ascertain "the vulnerability of scientists to politically motivated reasoning—both in and outside of the domains of their expertise, and with and without the pressure to affirm 'professional-identity-defining' beliefs" would be limited by the ability of the investigators to separate their own beliefs, and concepts as to what is or is not motivated reasoning, from the study. Science is always a perpetual work in progress. I think that retrospective studies on how misguided trajectories became embedded, and what it took to redirect them in more fruitful directions, might prove more useful. And lead to methods for strengthening those corrective measures.

In terms of public perceptions of science, the media doles out a pretty heavy dose of "scientist say" articles. Particularly with regards to medical issues, for which doctors subscribe to liability limiting protocols, these become quite absolute, until they are not. Breast cancer screening and treatment is one example. So the public is aware of changes in science.

I think this is why absolutist declarations are sometimes met with skepticism. With regards to climate change, I think that emphasizing the breadth of scientific support, from studies in many different fields, is more likely to be convincing.

December 11, 2015 | Unregistered CommenterGaythia Weis

@Gaythia--

why not a study like Koehler's? How would "investigator's own bias/belief" create a problem? 2 researchers w/ different hypotheses could even agree on design 1st -- then see what results tell them.

In any case, my priors are 1:1, so I shouldn't have any problem here, right?

December 11, 2015 | Unregistered Commenterdmk38

Wouldn't the notion of a "professional" be an exceedingly odd category to find its way into a scientific theory about motivated reasoning? "Expert" you could almost see, but "professional"? What possible work could "professional" be doing here? Unless it is just a clumsy way of getting at expertise after all?

December 13, 2015 | Unregistered CommenterPaul J

"Wouldn't the notion of a "professional" be an exceedingly odd category to find it's way into a scientific theory about motivated reasoning?"

I suspect "professional" here refers to the cultural phenomenon in which people feel obligated to hold to high standards of behaviour ("professionalism") as part of their 'identity' or as a tactical means of preserving the reputation and authority of the profession's 'brand'. Being a "professional" is not simply a matter of being paid for doing it.

The question is whether membership of the tribe of "professional scientist" overrides membership of tribes like "liberal" and "conservative". Dan's view used to be that yes it does - I find this post interesting because it suggests he may be reconsidering the evidence for that.

Personally, I think it depends. Some scientists make a greater effort than others, and it's a bigger problem in some areas of science than others. In particular, some sciences have a significant majority of practitioners being of one particular political tribe, which results in groupthink. There are shared assumptions, shared motivations, and nobody in a position to challenge them, which leads to bad science. It's not deliberate - the people involved genuinely believe that they come so often to politically congenial conclusions because that's the way things are.

Sometimes there is a more serious problem that occurs in institutionalised science. When younger scientists are heavily dependent on the community of scientists for their careers, there is a natural tendency not to rock the boat and annoy the more senior or powerful members of the community. Scientists who disagree keep quiet about it, and other scientists take the lack of opposition to represent a consensus, and assume the majority view must be correct without looking into the evidence for themselves.

A classic example of that was when a young student called Chandrasekhar solved Einstein's general relativity equations to discover 'black holes', but the leader of the astrophysics community Eddington rejected them on intuitive grounds as a nonsense solution, and applied pressure to make sure the rest of the astrophysicists did the same. Even Einstein published papers declaring them to be impossible (incorrectly). Several prominent physicists privately agreed with Chandrasekhar that his solution was correct, but wouldn't say so publicly when it would mean starting a war with Eddington. Chandrasekhar was ostracized, and black hole research was delayed for decades - basically until all the old guard had retired or died. Max Planck said of it that "Science advances one funeral at a time."

So it's not just the politically contentious bits of science where this can happen. It's a common failing of human institutions, whenever bureaucracy takes over, and career, fame, and influence start to compete as motivation with pride in the standards of the profession. Adding politics and fanatical ideology external to science can only make this already poisonous environment more toxic.

"Expertise" has nothing at all to do with these cultural issues. "Professionalism" does. Although maybe "institutional science" might be a better term to use here, depending on which aspect you want to emphasise.

December 13, 2015 | Unregistered CommenterNiV

@Paul J & @Niv:

I don't want to preempt your discussion, which is interesting.

But some observations:

1. Perhaps it is confusing, as @Paul J suggests, to use "expert" and "professional" interchangeably. I mean to use "professional judgment" as a way to characterize a style of information processing that consists in the integrated preconscious recognition capacities that excite or activate conscious analytical skills suited for performing specialized tasks -- ones that fall outside the realm of ordinary, everyday experience. In my view, experts are distinguished by their use of PJ in this sense. PJ will predictably protect those who possess it from "cognitive biases" that might otherwise defeat the performance of those tasks (that is the nerve of the sources I cited).

Which biases, and under what circumstances, are empirical questions, of course. In my view, we don't have enough evidence to draw very strong inferences about PJ and "identity-protective cognition."

2. I am not averse to @NiV's view that "professionalism" might be a character disposition that evinces resolve to resist partisan biases. Some "professionals" lack this, certainly! But even those who have it can't necessarily succeed by force of good intentions (in my view); the sort of effect that PMR exerts evades conscious detection and control. Their attaining their goal, then, will depend on the successful cultivation of habits of mind -- PJs -- that successfully insulate them.

3. @NiV perceives a "change" in my view. Yes & no.

To begin, my view, or at least my aspiration for it, is permanently provisional; whatever I believe at the moment is just my current best tally of the available evidence--which in this case, I've always stressed, is not strong in any event.

But if I have "changed" my view, it is as a result of reflecting on how the PJ of the "legal professional" (or "expert" if @Paul J prefers) works. I have previously offered our study of how legal PJ seems to counter PMR as reason for thinking PMR is in fact one of the biases that PJ deflects.

But on reflection, I think I neglected to give significant account to what legal PJ consists in: the calibrating of a lawyer's own apprehension of the "right result" in legal matters to the ones *other legal professionals* view as "right" -- nothing more, nothing less. Lacking any criteria for validating their shared sense of "rightness" as being "just" or "true," the only thing that such PJ achieves -- even in rebuffing PMR -- is the protection of the legal professional's status within a particular professional community....

To be "truth convergent," PJ needs to be validated independent of its "reliability" -- that is, its consistency across the members of the profession who exercise it. It must be validated as suited for achieving the ends that normatively justify having the sort of expertise that members of that profession exercise -- to figure out truth in the case of scientists, to do justice (according to the principles of a liberal democratic society) for lawyers.

Just as confident as I am that professional chicksexers can validate their PJ in this way, I am confident scientists, in the main, can validate theirs. Not so sure about lawyers.

But this is a provisional assessment as I say (should go w/o saying!).

And it leaves open the possibility that w/i any particular pocket of a profession whose PJ is validated, there will be a breakdown -- a corruption of PJ w/ a proliferating sense of "rightness" that is tied to status and not "truth" or whatever the norm is that should validate PJ....

Whether that has happened can be determined, though, only w/ methods suited to drawing valid inferences. It's right to worry -- constantly -- that this has happened w/i any domain of science. But we need evidence -- & not plausible conjectures -- to address the concern. More things are plausible than are true. Always.

December 13, 2015 | Registered CommenterDan Kahan

@Dan Kahan

You say: "I mean to use "professional judgment" as a way to characterize a style of information processing that consists in the integrated preconscious recognition capacities that excite or activate conscious analytical skills suited for performing specialized tasks -- ones that fall outside the realm of ordinary, everyday experience."

And so it becomes an analytic truth that professional judgement is a protective against motivated reasoning--its efficacy is packed into the definition!!

I could be wrong, but I fear the problem has not yet been stated with anything like full dialectical precision.

December 13, 2015 | Unregistered CommenterPaul J

==> "But we need evidence--...."

Indeed. Not selectively collected anecdotes in support of a preexisting theory, that more than likely, not coincidentally, aligns perfectly with an ideological/partisan/"motivated" orientation.

I will give NiV credit for coming up with carefully selective "evidence" that I haven't seen before. Usually "skeptics" point to the same set of examples, (plate tectonics, ulcers, the lipid theory) - which suggests to me the notion of "the exception proves the rule" more than anything else. How would we compare the number of times that conventional wisdom was overturned rather easily in expert communities to the number of times that it encountered hardened resistance that in the end turned out to be "unprofessional?" Perhaps the reason that the same set of examples is pointed to so frequently is that there really aren't all that many (particularly in comparison to the pool of counterexamples).


Of course, maybe that's a reflection of my own "motivation." Which just goes back to Dan's point, and which underlines why Dan's work is of value. Dan's hypotheses may well be wrong (hate to break it to you, Dan), but his approach is an evidence-based island in a sea of over-certain conclusions being drawn from unreliable sampling and a near complete lack of scientifically-based evidence gathering. It's always interesting to see people who are trained and experienced with testing out the differences between correlation and causation so frequently draw conclusions without a solid basis of scientifically controlled evidence. It's interesting because it shows just how pervasive motivated reasoning is.

December 13, 2015 | Unregistered CommenterJoshua

@Paul J--


And so it becomes an analytic truth that professional judgement is a protective against motivated reasoning--it's efficacy is packed into the definition!!

Nope.

The whole point of the post is to pose the question whether PJ counteracts PMR & to stress that the answer depends on valid empirical tests.

December 14, 2015 | Registered CommenterDan Kahan

"Perhaps the reason that the same set of examples is pointed to so frequently is that there really aren't all that many"

The reason is that you need to cite examples of scientific revolutions people have heard of. The point is that it should be something they already know about, or famous enough they can easily check - else they're forced to just take your word for it.

"How would we compare the number of times that conventional wisdom was overturned rather easily in expert communities to the number of times that it encountered hardened resistance that in the end turned out to be "unprofessional?""

Good question. On what basis can you trust expert communities if you don't already know the answer? Have you checked?

And anyway, this question is rather like asking how we compare the number of honest transactions on the internet versus the number of fraudulent ones, when deciding whether to send your bank details to that nice Nigerian prince who just emailed. I think it's safe to say most transactions are honest, or nobody would use the internet for commerce, so does that imply we should trust blindly? Set out your chain of logic.

December 14, 2015 | Unregistered CommenterNiV

NiV -

==> "The reason is that you need to cite examples of scientific revolutions people have heard of. "

You think it's more persuasive when arguing for a generalizable pattern to keep citing the same limited set of examples rather than list a much larger set of examples?

==> "The point is that it should be something they already know about, or famous enough they can easily check - else they're forced to just take your word for it."

I would imagine that the set of examples that are easily researched is far larger than the limited set of examples I've already heard of. I'm not forced to take your word for it if you give examples I haven't already heard of - as is the case with the example you provided earlier in this thread. I haven't heard of it. I can easily check to see the history.

In short, at least for me, it would be much more effective for you to provide a long list of examples I haven't heard of and that could easily be researched than for you to list the same examples over and over. But maybe that's just me, eh? :-)

Anyway, if you want to generalize....try this:

http://www.npr.org/2013/08/02/172139216/how-did-a-mistake-unlock-one-of-spaces-mysteries

==> " so does that imply we should trust blindly? "

Reminds me of your question of "Is that so bad?" from that recent thread. I'm not suggesting blind trust as an alternative to arguments that are well-supported by evidence. I'm suggesting that we neither generalize to assume blind trust in the professionalism of experts nor generalize that a strong prevalence of shared opinion among experts justifies an assumption of unprofessional groupthink. I think that what's more useful is to consider examples either way in full context, and to avoid the "motivated" habit of generalizing from unrepresentative samples.

And of course, we should strive to view any particular example on the evidence relevant to that particular context.

In the end, I would suspect that trying to generalize about the propensity of experts to uphold professional standards irrespective of their ideological "motivations" can only be loosely instructive, at best, anyway. It can be useful for maybe understanding broadly how experts working within their own field of expertise compare to non-experts trying to reason when dealing with complex matters where they lack the necessary expertise. But how much information does that give us for assessing a gap between public views on a particular issue (such as climate change) and a prevalent view among a particular set of experts in that field? Without another whole set of evidence that is very specific to that context, I'd say not much at all.

In the end, I think it's unlikely that a deeper understanding of the general patterns will help us to understand the gap between public and expert opinions on climate change. And even if it did, not many people who are seeking to confirm their biases on that issue would pay attention to Dan's evidence-based findings anyway.

December 14, 2015 | Unregistered CommenterJoshua

Actually, this is the link I was looking for:

http://www.npr.org/2013/05/10/182861376/exploring-an-ever-expanding-universe

Related, but more on point.

December 14, 2015 | Unregistered CommenterJoshua

Dan says: "I mean to use "professional judgment" as a way to characterize a style of information processing that consists in the integrated preconscious recognition capacities that excite or activate conscious analytical skills suited for performing specialized tasks -- ones that fall outside the realm of ordinary, everyday experience."

Can we not summarize the "specialized tasks" that climate science (as an example) is aimed toward as "determining the truth about climate", to put it simply?

If so, then a person is in possession of professional judgement qua climate scientist only if she has the "integrated preconscious recognition capacities that excite or activate conscious analytical skills suited for" getting at the truth about climate.

December 14, 2015 | Unregistered CommenterPaul J

@Paul J--


Sure. But careful not to make a psychological dynamic into a "definition."

PJ is indeed characterized by the habits of mind suited for specialized tasks. But how well suited -- & whether it could be made better suited still -- are empirical issues.

There's no contradiction between having professional judgment in a domain & also being susceptible to particular biases that are hostile to the performance of that task; in such a case PJ needs to be improved.

Finding such defects is in fact one of the principal objects of the empirical study of professional education.

PJ is also a proficiency that can admit of degrees. Someone can "have it" & be able to do specialized tasks more reliably than someone without it, and still be less proficient in it than another. You & I act on this understanding when we shop for drs, auto mechanics, soothsayers, etc.

December 15, 2015 | Registered CommenterDan Kahan

"You think it's more persuasive when arguing for a generalizable pattern to keep citing the same limited set of examples rather than list a much larger set of examples?"

I don't think anything would be persuasive to somebody determined not to be persuaded.

It's merely an illustrative example to enhance the plausibility of a principle that ought to already be obvious to any student of human nature (or bureaucracy). It's not an important element of the argument - more of a cliche or shorthand expression used to remind the reader that they already ought to know this. Which of course everybody does, unless they're deliberately looking for reasons to disagree.

OK, so you're not persuaded. So?

"Actually, this is the link I was looking for:"

Thanks. I found that entertaining.

You could say, if you like, that the problem we have at the moment is the discovery of "dark climate". We had this theory that said it would continue to accelerate, but then we made these observations that told us it didn't. So we invent this "dark climate" stuff that's holding it back. Is it temporary? Is it permanent? What is it? We don't know. There's a bunch of theories and speculations, but the observations are right on the edge of what's possible, and we can't answer that just as yet. But what we can say is that where just a few years ago we thought we knew about pretty much everything, all of a sudden we've realised that there's 90% of the universe we know virtually nothing about. We know it's there (we think), but we've no idea what it is or how it works. It's a very exciting time. But in retrospect it does make the words "The science is settled!" a rather unfortunate choice of thing to say!

December 16, 2015 | Unregistered CommenterNiV

Oy.

==> "I don't think anything would be persuasive to somebody determined not to be persuaded."

A non-sequitur. I was discussing what would be persuasive to someone who's open to persuasion. Why bring up an irrelevancy?

==> " It's merely an illustrative example to enhance the plausibility of a principle that ought to already be obvious to any student of human nature (or bureaucracy)."

Not sure what "it" refers to there....but I don't think it's obvious that repeating the same few examples over and over would be more persuasive (to those whose minds are not already made up) than a long list of examples that are easily checked. Of course, if your intent is to preach to the choir, and confirm the biases of people who are more than happy to generalize from unrepresentative sampling (something they wouldn't recognize without seeing the same repeated examples placed into full context), then yes, repeating the same small set of examples over and over is effective, but effective at what? Yes, it is obvious human nature that people who aren't open to persuasion and are simply seeking to confirm biases would find listing the same examples over and over, without placing them into context, so as to generalize from unrepresentative samples, to be effective.

==> " It's not an important element of the argument - more of a cliche or shorthand expression used to remind the reader that they already ought to know this."

Lol. So your point is that you're not trying to persuade anyone of anything, but seeking to dog whistle to a preexisting bias. Thanks for admitting that. I'm surprised to see you being so straightforward there.

==> " Which of course everybody does, unless they're deliberately looking for reasons to disagree."

Everyone not looking to disagree knows what? That repeating the same list of limited examples over and over is a rhetorical device to advance the cause of generalizing from unrepresentative sampling?

==> "OK, so you're not persuaded. So?"

OK, so you think that repeating the same limited set of examples over and over without grounding them in context to shed light on whether or not they're generalizable is effective. So?

==> "You could say, if you like, that the problem we have at the moment is the discovery of "dark climate". We had this theory that said it would continue to accelerte, but then we made these obervations that told us it didn't. So we invent this "dark climate" stuff that's holding it back. Is it temporary? Is it permanent? What is it? We don't know. There's a bunch of theories and speculations, but the observations are right on the edge of what's possible, and we can't answer that just as yet. But what we can say is that where just a few years ago we thought we knew about pretty much everything, all of a sudden we've realised that there's 90% of the universe we know virtually nothing about. We know it's there (we think), but we've no idea what it is or how it works. It's a very exciting time."

OK, glad you got something out of it. Too bad that you didn't discuss how the example speaks to the generalizability of whether or not "expert" communities are inherently resistant to changing their views, fundamentally, when presented with evidence that stands in contrast to their previous paradigms. Of course, it's only one example, so generalizing would obviously be problematic. But perhaps it might help you to reconsider certainty in your facile generalization?

==> " But in retrospect it does make the words "The science is settled!" a rather unfortunate choice of thing to say!"

It's always interesting to me how many more times, by orders of magnitude, I see "skeptics" talking about "The science is settled," in comparison to the number of times I see non-"skeptic" scientists use that expression.

I wonder why that is? Couldn't be that they're using a rhetorical device of generalizing from unrepresentative sampling, now could it?

December 16, 2015 | Unregistered CommenterJoshua

NiV -

==> " It's not an important element of the argument - more of a cliche or shorthand expression used to remind the reader that they already ought to know this."

BTW, on behalf of the many readers who may be too overwhelmed with gratitude or too shy to speak up, I'd like to thank you for taking the time to (condescend to) remind them of what they already "ought" to know.

December 16, 2015 | Unregistered CommenterJoshua

@Joshua:

& I'd like to remind the 1 or 2 people in the world who aren't regular readers of this blog & thus don't already know this, that you & @NiV are in fact filled w/ affection for one another -- more so even than are Donald Trump & Ted Cruz.
"Shorthands" & like idioms can be so disorienting to the uninitiated!

December 16, 2015 | Registered CommenterDan Kahan

Dan -

Thanks for that, but don't sell yourself short.

In addition to that 1 or 2 who didn't already know it, perhaps you've also provided a service in reminding some others about what they already "ought" to know.

December 17, 2015 | Unregistered CommenterJoshua

"A non-sequitur. I was discussing what would be persuasive to someone who's open to persuasion. Why bring up an irrelevancy?"

You said: "In short, at least for me, it would be much more effective for you to provide a long list of examples I haven't heard of and that could easily be researched than for you to list the same examples over and over. But maybe that's just me, eh?" I assumed it was just you we were talking about.

"but I don't think it's obvious that repeating the same few examples over and over would be more persuasive (to those whose minds are not already made up) than a long list of examples that are easily checked."

That's your opinion. I've already said that I think most people would already understand the principle.

"Or course, if your intent is to preach to the choir, and confirm the biases of people who are more than happy to generalize from unrepresentative sampling"

It's not meant to be a sample. It's meant to be an illustrative example of the principle.

" Yes, it is obvious human nature that people who aren't open to persuasion and are simply seeking to confirm biases would find listing the same examples over and over without placing them into context so as to generalize from unrepresentative samples to be effective."

Oh good.

Hang on. What? You said a moment ago you thought it wouldn't be persuasive, now you're saying it would be, to the right set?

"Lol. So your point is that you're not trying to persuade anyone of anything, but seeking to dog whistle to a preexisting bias. Thanks for admitting that. I'm surprised to see you being so straightforward there."

I always try to be straightforward. It's like trying to tell people that if they stand out in the rain, they'll get wet. I stood out in the rain once, and I got very wet. Oh hang on a sec, that's a very small sample size - perhaps some people won't find that persuasive? My 'got wet' anecdote is probably some sort of dog whistle to people who already think they know that standing in the rain gets you wet, and I'm just "preaching to the choir", or whatever.

Let's see a proper survey with a few thousand sample points before we jump to any hasty conclusions about rain being wet, eh?

"That repeating the same list of limited examples over and over is a rhetorical device to advance the cause of generalizing from unrepresentative sampling?"


Booooring...!

"OK, so you think that repeating the same limited set of examples over and over without grounding them in context to shed light on whether or not their generalizable is effective."

Do you know what a cliche is?

"Too bad that you didn't discuss to how the example speaks to the generalizability of whether or not "expert" communities are inherently resistant to changing their views, fundamentally, when presented with evidence that stands in contrast to their previous paradigms."

I already addressed this. Sometimes experts do. Sometimes experts don't. Trying to count up the cases where they do and don't is like trying to count up the cases of honest and dishonest transactions on the internet. The point is not whether it's universal or even a majority of experts who do this. The point is whether it's a possibility worth considering. Past experience says it is.

I know that you think otherwise, but I'm not inclined to take your judgement on the matter.

"It's always interesting to me how many more times, by orders of magnitude, I see "skeptics" talking about "The science is settled," in comparison to the number of times I see non-"skeptic" scientists use that expression."

Yeah. They stopped using it when they realised how many people recognised it for the anti-scientific nonsense it was. The point is, though, that they did use it. How could anyone who had a genuine understanding of scientific method not have realised? It's like saying "Why should I make the data available to you, when your aim is to try and find something wrong with it" or "It won't be easy to dismiss out of hand as the math appears to be correct" or "it would be odious requirement to have scientists document every line of code so outsiders could then just apply them instantly.”

Your lot don't have to keep on saying it time and time again - the fact you said it once is sufficient to make the point.

"BTW, on behalf of the many readers who may be too overwhelmed with gratitude or too shy to speak up I, I'd like to thank you for taking the time to (condescend to) remind them of what they already "ought" to know."

You're very welcome! But I'm sure they already knew that.

--

"& I'd like to remind the 1 or 2 people in the world who aren't regular readers of this blog & thus don't already know this, that you & @NiV are in fact filled w/ affection for one another"

Oh, yes, indeed. My primary purpose in coming to places where people generally disagree with me is to test out my arguments and opinions. Only ideas that can survive a hostile audience are worth keeping. And Joshua is always reliable in trying to find the holes in anything I say. That's very useful, and I'm eternally grateful.

I cannot praise a fugitive and cloister'd vertue, unexercis'd & unbreath'd, that never sallies out and sees her adversary, but slinks out of the race, where that immortall garland is to be run for, not without dust and heat.

Milton, Areopagitica, 1644.

Dust and heat are not to be criticised - they're a sign that things are working properly.

December 17, 2015 | Unregistered CommenterNiV

==> "You said: "In short, at least for me...""

I added that as a parenthetical statement after first discussing my thoughts for a more general frame. I guess you missed that?


==> "That's your opinion. ..."

Right, and stated as such. I try to avoid stating my opinions as fact. Get my drift?


==> "It's not meant to be a sample. It's meant to be an illustrative example of the principle."

This is beautiful! Take an unrepresentative sample (or a sample that you haven't established or even tried to establish is representative)...soften the focus....add a rose hue...cue the violins and voila! You have an "illustrative example."

Have you given an "illustrative example" of an outlier? You don't know. So what is the value of your "illustrative example"? It may have value as a rhetorical device. But with whom? With someone who is open to persuasion? Maybe, but only if they're persuaded by "illustrative examples" w/o checking to see if they're outliers. With someone who finds "illustrative examples" useful only if they have some idea of whether they're representative? No.

As you acknowledged when you described your goal of condescending to remind people of what they already "ought" to know.

Perhaps you think that is "professional." If so, more power to you. And perhaps that explains your defense of Cruz.

I don't happen to think that is professional.

==> "Hang on. What? You said a moment ago you thought it wouldn't be persuasive, now you're saying it would be, to the right set?"

I guess I wasn't clear. My point was that contributing to bias confirmation is not the same as "persuading" someone.


==> "It's like trying to tell people that if they stand out in the rain, they'll get wet. I stood out in the rain once, and I got very wet. Oh hang on a sec, that's a very small sample size - perhaps some people won't find that persuasive?"

Not if you haven't determined whether that is a representative example. Were you holding an umbrella? Were you standing under an awning?

==> "My 'got wet' anecdote is probably some sort of dog whistle to people who already think they know that standing in the rain gets you wet, and I'm just "preaching to the choir", or whatever."

Your dog whistle is offering an "example" without defining the context. What is it an example of? An outlier? A representative situation?

Oy.


==> "Let's see a proper survey with a few thousand sample points before we jump to any hasty conclusions about rain being wet, eh?"

My god, man. You are simply reinforcing your unproven, nay, untested assertion that your example of groupthink and resistance to change is analogous to the certainty that if you stand out in the rain (with no umbrella, with no shelter) you will get wet. This is exactly my point!!! You don't have proof, and you haven't even tested your supposition.

This is why Dan's work is of value, because it works to counteract such sloppy and unprofessional "motivated" advocacy.


==> "Do you know what a cliche is?"

Another of these questions? No, NiV, I don't know what a cliche is. It isn't a matter of me disagreeing with you. The only explanation is that I don't know what a cliche is.


==> "I already addressed this. Sometimes experts do. Sometimes experts don't. Trying to count up the cases where they do and don't is like trying to count up the cases of honest and dishonest transactions on the internet. The point is not whether it's universal or even a majority of experts who do this. The point is whether it's a possibility worth considering. Past experience says it is."

Of course it's "worth considering" as a possibility. I've already indicated that multiple times.

==> "I know that you think otherwise, ..."

Oy. Again with the bizarre interpretations of what I'm saying? Like "What's so bad about that?"

==> "but I'm not inclined to take your judgement on the matter."

Oy. I'm not remotely suggesting that you "take [my] judgement"...


==> "Yeah. They stopped using it when they realised how many people recognised it for the anti-scientific nonsense it was."

No, actually, there are very few cases of non-"skeptic" scientists even using the expression. Again with the generalizing from unrepresentative sampling, eh? oh, wait...sorry....soft focus...rose hue....cue violins...."illustrative example."

==> "The point is, though, that they did use it."

Really? Who is "they?" And what did "they" mean when they said it (e.g., did they mean that ACO2 causes warming)? And how does that compare with how often "skeptics" characterize that as being a typical expression used by non-"skeptical" scientists?

==> "How could anyone who had a genuine understanding of scientific method not have realised?"

Realized what? And who is it that you're talking about? Al Gore?


"BTW, on behalf of the many readers who may be too overwhelmed with gratitude or too shy to speak up I, I'd like to thank you for taking the time to (condescend to) remind them of what they already "ought" to know."


==> "Oh, yes, indeed. My primary purpose ..."

Primary purpose? Where did you see that I referred to your "primary purpose?"

December 18, 2015 | Unregistered CommenterJoshua

A reasonable surmise could be that the pattern recognition aspect of PJ displaces system 1 thinking, as in keeping the relevant brain area occupied. This might even be testable via MRI based experiments.

December 27, 2015 | Unregistered CommenterHal Morris

@HalMorris--

I can think of a couple ways to understand your surmise.

1st would be that pattern recognition (PR; let's multiply the number of abbreviations until Homeland Security-- HS-- is convinced we are speaking in code & decides to "rendition" us to Morocco) is a kind of white noise blocker that cancels out the influence of system 1 thinking so that PJ, which consists in good old reliable System 2, can do its business w/o distraction.

Could be.

But this would be very much at odds w/ the view of PJ held by Margolis & the other scholars (the real hard-core PJ scholars). Their model attacks the orthodox System 1/System 2 account. Whereas the orthodox view sees the two as "discrete & hierarchical," they see the two as integrated & reciprocal: PJ involves unconscious "system 1" perception that reliably apprehends the occasions for "system 2" thinking & activates the same; the PJ recognition faculty is, in turn, fine-tuned & calibrated by a process that involves system 2 reflection.

On this view, we wouldn't expect PJ to be just a kind of perimeter-securing blocking agent, in the way I described.

Which might not be what you meant anyway.

But whether or not you meant it, it's something one could test, certainly (I'm not sure if fMRI could do it, but that's another story....)

2. The other account of what you are saying-- one consistent with Margolis's view-- would be that PR associated w/ PJ operates primarily by keeping system 1 sensibilities trained on the proper target when a professional must assess evidence the potential significance of which could be threatening/affirming to her cultural (non-professional) identity.

Basically, PMR *is* a kind of rational apprehension by PR of the significance of crediting evidence, given the stake a person has in recognizing & giving proper effect to information that has an identity-threatening or -affirming significance. The "reciprocal-integrated" system 1/system 2 reasoning apparatus properly handles information in a manner geared to protecting cultural identity.

Where PJ is working properly, in contrast, *its* distinctive PR should take priority over the PR of identity protection. Accordingly, PJ should reliably classify situation types with reference to the professional goal of "getting the right answer" & summon analytical reasoning (system 2) in the manner necessary to do *that*. Necessarily, then, it is preempting the sort of rational information processing of PMR-- the rationality of which presupposes the actor's stake in assessing information in an identity-protective way.

On this account, PJ failures would consist in the displacement of the integrated-reciprocal operation of Sys 1/2 characteristic of PJ by the integrated-reciprocal operation characteristic of identity protection....

These are definitely testable conjectures, too.

December 29, 2015 | Registered CommenterDan Kahan
