Tuesday
Mar 13, 2018

Do Type S and Type M errors reflect confirmation bias?!

0. What's this about?

I’ve written a number of posts on the value of Bayesian likelihood ratios (a heuristic cousin of the “Bayes Factor”) as an “evidentiary weight” statistic generally, and its value in particular as a remedy for the inferential barrenness of p-values and related statistics used to implement “null hypothesis testing.”

In this post, I want to call attention to another virtue of using likelihood ratios: the contribution they can make to protecting against the type 1 error risk associated with underpowered studies. Indeed, I’m going to try to make the case for using LRs for this purpose instead of a method proposed by stats legend & former Freud expert Andrew Gelman (Gelman & Carlin 2014).

As elegant as they admittedly are, and as valuable as they have been in making people aware of a serious problem, G&C’s statistical indexes inject a form of confirmation bias into the practical assessment of the weight to be afforded empirical studies. Using LRs to expose the “type 1” error risk associated with underpowered studies avoids that.
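To make this concrete, here is a minimal sketch (in Python, using scipy) of the sort of likelihood-ratio computation I have in mind, applied to a "statistically significant" result from a low-powered design. The effect size, standard error, and alternative hypothesis are hypothetical numbers chosen for illustration, not values from any actual study:

```python
# Minimal sketch: weigh a study result with a Bayesian likelihood ratio
# rather than a p-value. All numbers are hypothetical.
from scipy.stats import norm

obs = 0.50   # observed effect estimate (hypothetical)
se  = 0.24   # standard error -- large, i.e., an underpowered design

# "Significant" by conventional lights: z > 1.96
z = obs / se
p = 2 * (1 - norm.cdf(z))
print(f"z = {z:.2f}, two-sided p = {p:.3f}")   # ~2.08, ~0.037

# Likelihood ratio: how much more probable is the observed estimate
# under a plausible true effect (H1: mu = 0.2) than under the null (H0: mu = 0)?
lr = norm.pdf(obs, loc=0.2, scale=se) / norm.pdf(obs, loc=0.0, scale=se)
print(f"LR(H1 vs H0) = {lr:.1f}")              # ~4: modest evidentiary weight

# With se this large, the LR stays modest -- the "significant" result
# carries far less evidentiary weight than the p-value seems to suggest.
```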

Or at least that’s what I think. I must be crazy, huh?


Monday
Mar 5, 2018

What's worse? Macedonian "fake news" or Russian distortions of social proof?

This is “fake news,” Macedonia style:

Does it matter?  I have my doubts.

This is “fake evidence of social proof,” generated by the Russian government and its operatives and disseminated widely through Facebook:

Does it matter?  I have my fears that it does.

In a session at the just-concluded meeting of the Society for Personality and Social Psychology in Atlanta (slides here), I suggested two models of how misinformation, including fake news, works. One, the “passive aggregator” model (PA), assumes a credulous public being manipulated by economic and political interest groups via the dissemination of “fake news” either directly or through client news providers.

 

PA is a “supply side” model.

The “motivated public” model (MP) is a “demand side” alternative.

It asserts that factionalized groups of citizens are unconsciously motivated to evaluate elections and public policymaking generally according to their effect in promoting or impeding sectarian public philosophies. On this account, opportunistic information providers can earn a profit from disseminating “fake news” congenial to one or another sectarian group. The Macedonian “fake news” mills, for example, made millions from the “clicks” on the commercial advertisements that adorned their postings.

Work in the science of science communication, I suggested, supports MP. If this is right, we shouldn’t worry overmuch about Macedonian “fake news”: already conditioned to selectively credit or discredit true news based on their cultural predispositions, the members of factionalized cultural groups can be expected to reliably identify and credit fake news that supports their side and to dismiss the rest as bogus.

Even in a world with no fake news, cultural partisans would find (distort) more than enough “true” information to support their positions.

The real issue is why people adopt this form of information processing to begin with.

That’s where Russia’s “fake signals of social proof” come in.

Citizens of diverse cultural orientations don’t contest evidence relating to each and every risk or policy-relevant fact.  They fight over only those that have become entangled with cultural identities, thereby transforming opposing positions on them into badges of membership in, and loyalty to, one or another sectarian faction.

Culturally diverse citizens adopt this posture toward risks and other policy-relevant facts when they observe others in their affinity group embroiled in fights with their cultural adversaries. The information conveyed by such conflicts is not about the degree of harm associated with contested facts but about the status of positions on them as recognizable tokens of group identity.

This is where Russia comes in.  Its Facebook postings are not about the facts relating to one or another policy but about the facts of social proof—who is fighting whom over what.

As they saturate our democratic-deliberation environment with this form of pollution, the predictable—and indeed sought after—impact is the descent of our public discourse into intractable states of cultural division.

Forget the trivial question of how to “counteract” inherently-absurd stories about Hillary and pizza-gate.

The question is how to protect ourselves from the manipulation of public perceptions of cultural conflict in our society, and about what we should do in response to nations bent on infusing this poison into our public discourse.

Do you hear a dog whistle here?...

Wednesday
Feb 14, 2018

Science literacy *plus* science curiosity--a research program for enlightened self-government (lecture summary & slides)

Back from nearly a week in Ann Arbor, Michigan. On Friday & Saturday I attended a symposium on “values, science & the public” convened by the Univ. of Mich.’s Philosophy Dep’t. I then spent Monday at the Institute for Social Research.

This is a condensed recap of the lecture I gave at ISR: “Science comprehension without curiosity is no virtue, and curiosity without comprehension no vice” (slides here).

I. One aim of the CCP/APPC initiative on the science of science communication is to integrate civic science literacy – Jon Miller’s decades-long project (e.g., Miller 1998) – with John Dewey’s program to promote science curiosity. Curiosity, Dewey (1910) argued, is essential not only to motivate acquisition of scientific literacy but also to activate citizens’ science comprehension when needed to make informed personal or public judgments. To that, I would add that science curiosity is also necessary to temper the pernicious impact of identity-protective cognition, a dynamic that threatens to deny society the benefits of its citizens’ science literacy.

II. Motivated System 2 reasoning (MS2R) is one of the ways in which science comprehension can be recruited into the service of a defensive and closed-minded style of cognition. Contrary to the dominant “bounded rationality thesis,” higher proficiency in the types of critical reasoning essential to science comprehension doesn’t diminish polarization; on the contrary, it magnifies it. Citizens high in one or another critical reasoning proficiency use that endowment to ferret out information that supports their cultural group’s positions and to rationalize dismissal of everything else (Kahan 2015; Kahan et al. 2017a).

III. Science curiosity, in contrast, offsets MS2R.  Science curiosity directly negates the mental orientation associated with MS2R: whereas the latter generates a defensive, dismissive posture toward identity-threatening evidence, the former creates an appetite for surprising information that defies one’s expectations.  Because people who are disposed to be curious about science are more likely to expose themselves to information that challenges their political predispositions, they are less prone to polarization, and less likely to form opposing factions, as their science comprehension increases (Kahan et al. 2017b).

IV.  Science curiosity thus performs a critical role in determining the impact of higher levels of science comprehension.  Dewey, again, credited curiosity with citizens’ acquisition of scientific insight and with their reliable apprehension of the occasions for its deployment. The studies featured in this lecture tell us that science curiosity also does something else: it blocks the use of scientific reasoning to promote beliefs that signal diverse citizens’ membership in and loyalty to one or another opposing cultural group. The entanglement of reason in this dynamic is arguably the greatest threat to enlightened self-government that our society now faces.

V. What now? I have described an ambitious project—to integrate Miller’s research on scientific literacy with Dewey’s attention to science curiosity.  However, the tools at our disposal—including principally the CCP/APPC Science Curiosity Scale—are now suited for making at best only a modest contribution to that goal.  More work is necessary, for one thing, to improve SCS and to make its administration feasible for diverse audiences in diverse settings. Armed with such an instrument, researchers will then need to test various procedures for activating curiosity, particularly in citizens who aren’t as spontaneously curious as the ones we’ve focused on in our studies to date. Conducted initially in the lab, such research will then need to be reproduced in the field—indeed, in the numerous fields, from education to mass science communication to democratic politics, in which citizens come to know what science knows.

As far as we have come based on Miller’s research, we have just as far to go to understand how to enjoy the full benefits of a citizenry as science literate as the one Miller’s research envisions.

References

Miller, J.D. The measurement of civic scientific literacy. Public Understanding of Science 7, 203-223 (1998).

Dewey, J. How We Think (Boston: D.C. Heath & Co., 1910).

Kahan, D.M., Peters, E., Dawson, E.C. & Slovic, P. Motivated numeracy and enlightened self-government. Behavioural Public Policy 1, 54-86 (2017a).

Kahan, D.M., Landrum, A., Carpenter, K., Helft, L. & Hall Jamieson, K. Science Curiosity and Political Information Processing. Political Psychology 38, 179-199 (2017b).

Kahan, D.M. The Politically Motivated Reasoning Paradigm, Part 2: Unanswered Questions. in Emerging Trends in the Social and Behavioral Sciences (John Wiley & Sons, Inc., 2015).

 

Thursday
Feb 8, 2018

Let's play data jeopardy again! (lecture slides)

Gave a talk yesterday at the Center for Study of Public Choice at George Mason University. The audience there was great--filled with critical insights & constructive suggestions.

I again used the "data jeopardy" format, in which I present the data "answer" first & then invite the audience to guess the "question" that motivated the collection of the data.

So, slides are here; what's the question?

Wednesday
Feb 7, 2018

Miss the posts on the CCP dictionary/glossary, whatever? Well, here you go-- CRT & numeracy.

These go here.

Cognitive Reflection Test. A three-item assessment of the capacity and disposition to override judgments founded on intuition. Regarded as the best measure of a person’s propensity to over-rely on heuristic, System 1 information processing as opposed to conscious, analytical System 2 information processing. [Sources: Frederick, S. (2005), J. Econ. Persp., 19; Kahneman & Frederick (2005), in The Cambridge handbook of thinking and reasoning (pp. 267-293). Date added: Feb. 7, 2018.]

Numeracy. An assessment that measures the aptitude to reason well with quantitative information and draw appropriate inferences therefrom. [Source: Peters et al. (2006), Psych. Sci., 17, 407-413. Date added: Feb. 7, 2018.]

 

Tuesday
Feb 6, 2018

A science of science communication manifesto ... a fragment 

From something I'm working on . . . .

Our motivating premise is that advancement of enlightened policymaking depends on addressing the science communication problem. That problem consists in the failure of valid, compelling, and widely accessible scientific evidence to dispel persistent public conflict over policy-relevant facts to which that evidence directly speaks. As spectacular and admittedly consequential as instances of this problem are, entrenched public confusion about decision-relevant science is in fact quite rare. Such conflicts are not a consequence of constraints on public science comprehension, a creeping “anti-science” sensibility in U.S. society, or the sinister acumen of professional misinformers. Rather they are the predictable result of a societal failure to integrate two bodies of scientific knowledge: that relating to the effective management of collective resources; and that relating to the effective management of the processes by which ordinary citizens reliably come to know what is known (Kahan 2010, 2012, 2013).

The study of public risk perception and risk communication dates back to the mid-1970s, when Paul Slovic, Daniel Kahneman, Amos Tversky, and Baruch Fischhoff began to apply the methods of cognitive psychology to investigate conflicts between lay and expert opinion on the safety of nuclear power generation and various other hazards (e.g., Slovic, Fischhoff & Lichtenstein 1977, 1979; Kahneman, Slovic & Tversky 1982). In the decades since, these scholars and others building on their research have constructed a vast and integrated system of insights into the mechanisms by which ordinary individuals form their understandings of risk and related facts. This body of knowledge details not merely the vulnerability of human reason to recurring biases, but also the numerous and robust processes that ordinarily steer individuals away from such hazards, the identifiable and recurring influences that can disrupt these processes, and the means by which risk-communication professionals (from public health administrators to public interest groups, from conflict mediators to government regulators) can anticipate and avoid such threats and attack and dissipate them when such preemptive strategies fail (e.g., Fischhoff & Scheufele 2013; Slovic 2010, 2000; Pidgeon, Kasperson & Slovic 2003; Gregory 2005; Gregory, McDaniels & Field 2001).

Astonishingly, however, the practice of science and science-informed policymaking has remained largely innocent of this work. The persistently uneven success of resource-conservation stakeholder proceedings, the sluggish response of localities to the challenges posed by climate change, and the continuing emergence of new public controversies such as the one over fracking—all are testaments (as are myriad comparable misadventures in the domain of public health) to the persistent failure of government institutions, NGOs, and professional associations to incorporate the science of science communication into their efforts to promote constructive public engagement with the best available evidence on risk.

This disconnect can be attributed to two primary sources.  The first is cultural: the actors most responsible for promoting public acceptance of evidence-based policymaking do not possess a mature comprehension of the necessity of evidence-based practices in their own work.  For many years, the work of policymakers, analysts, and advocates has been distorted by the more general societal misconception that scientific truth is “manifest”—that because science treats empirical observation as the sole valid criterion for ascertaining truth, the truth (or validity) of insights gleaned by scientific methods is readily observable to all, making it unnecessary to acquire and use empirical methods to promote its public comprehension (Popper 1968).

Dispelled to some extent by the shock of persistent public conflict over climate change, this fallacy has now given way to a stubborn misapprehension about what it means for science communication to be genuinely evidence based. In investigating the dynamics of public risk perception, the decision sciences have compiled a deep inventory of highly diverse mechanisms (“availability cascades,” “probability neglect,” “framing effects,” “fast/slow information processing,” etc.). Using these as expositional templates, any reasonably thoughtful person can construct a plausible-sounding “scientific” account of the challenges that constrain the communication of decision-relevant science. But because more surmises about the science communication problem are plausible than are true, this form of storytelling cannot produce insight into its causes and cures. Only gathering and testing empirical evidence can.

Some empirical researchers have themselves contributed to the failure of practical communicators to appreciate this point. These scholars purport to treat general opinion surveys and highly stylized lab experiments as sources of concrete guidance for actors involved in promoting public engagement with information relevant to particular risk-regulation or related policy issues. Even when such methods generate insight into general mechanisms of consequence, they do not—because they cannot—yield insight into how those mechanisms can be brought to bear in particular circumstances. The number of plausible surmises about how to reproduce in the field results that have been observed in the lab exceeds the number that truly will as well. Again, empirical observation and testing are necessary—now in the field. The scarcity of researchers willing to engage in field-based research, and the reluctance of many to acknowledge candidly the necessity of doing so, have stifled the emergence of a genuinely evidence-based approach to the promotion of public engagement with decision-relevant science (Kahan 2014).

The second source of the disconnect between the practice of science and science-informed policymaking, on the one hand, and the science of science communication, on the other, is practical: the integration of the two is constrained by a collective action problem.  The generation of information relevant to the effective communication of decision-relevant science—including not only empirical evidence of what works and what does not but practical knowledge of the processes for adapting and extending it in particular circumstances—is a public good.  Its benefits are not confined to those who invest the time and resources to produce it but extend as well to any who thereafter have access to it.  Under these circumstances, it is predictable that producers, constrained by their own limited resources and attentive only to their own particular needs, will not invest as much in producing such information, and in a form amenable to the dissemination and exploitation of it by others, as would be socially desirable.  As a result, instead of progressively building on their successive efforts, each initiative that makes use of evidence-based methods to promote effective public engagement with policy-relevant science will be constrained to struggle anew with the recurring problems.

Fischhoff, B. & Scheufele, D.A. The science of science communication. Proceedings of the National Academy of Sciences 110, 14031-14032 (2013).

Gregory, R. & McDaniels, T. Improving Environmental Decision Processes. in Decision Making for the Environment: Social and Behavioral Science Research Priorities (eds. G.D. Brewer & P.C. Stern) 175-199 (National Academies Press, Washington, DC, 2005).

Gregory, R., McDaniels, T. & Fields, D. Decision aiding, not dispute resolution: Creating insights through structured environmental decisions. Journal of Policy Analysis and Management 20, 415-432 (2001).

Kahan, D. Fixing the Communications Failure. Nature 463, 296-297 (2010).

Kahan, D. Making Climate-Science Communication Evidence Based—All the Way Down. In Culture, Politics and Climate Change: How Information Shapes Our Common Future, eds. M. Boykoff & D. Crow. (Routledge Press, 2014).

Kahan, D. Why we are poles apart on climate change. Nature 488, 255 (2012).

Kahan, D.M. A Risky Science Communication Environment for Vaccines. Science 342, 53-54 (2013).

Kahneman, D., Slovic, P. & Tversky, A. Judgment Under Uncertainty: Heuristics and Biases (Cambridge University Press, Cambridge; New York, 1982).

Pidgeon, N.F., Kasperson, R.E. & Slovic, P. The Social Amplification of Risk (Cambridge University Press, Cambridge; New York, 2003).

Popper, K.R. Conjectures and Refutations: The Growth of Scientific Knowledge (Harper & Row, New York, 1968).

Slovic, P. The Feeling of Risk: New Perspectives on Risk Perception (Earthscan, London; Washington, DC, 2010).

Slovic, P. The Perception of Risk (Earthscan Publications, London; Sterling, VA, 2000).

Tuesday
Jan 30, 2018

More glossary entries: pattern recognition, professional judgment & situation sense

Again, complete (or more complete) document here.

Pattern recognition. A cognitive dynamic in which a person recognizes some object or state of affairs by matching it (preconsciously) to a rapidly conjured set of prototypes acquired through experience. [Source: Margolis, H. (1987), Patterns, Thinking, and Cognition (Univ. Chicago Press). Date added Jan. 29, 2018.]

Professional judgment. Domain-specific “habits of mind” (most likely specialized forms of pattern recognition) that guide domain experts (e.g., judges). [Source: Margolis, H. (1996), Dealing with Risk: Why the Public and the Experts Disagree on Environmental Issues (University of Chicago Press). Date added Jan. 29, 2018.]

Situation sense. Karl Llewellyn’s description of domain-specific habits of mind, acquired through education and experience, that enable judges and lawyers to rapidly and reliably converge on case outcomes notwithstanding the indeterminacy of formal legal norms. [Source: Llewellyn, K. (1989), The Case Law System in America (M. Ansaldi, Trans.).  Date added Jan. 29, 2018.]

Monday
Jan 29, 2018

Meet the Millennials, part 4: Motivated System 2 reasoning ...

First, this (familiar) result --

Then this --

Do you see what I see? What does it all mean, if anything??

Sunday
Jan 21, 2018

Weekend update: who has more items for "Cultural Cognition Dictionary/Glossary/whatever"?

Okay, loyal listeners:

Version 1.0 of the CCP Dictionary/glossary/whatever ("DGW") can be viewed here.

Nominations for additional entries are now being solicited. Proposed items should be technical terms, terms of art, or idioms that recur with reasonable frequency on this site and that are likely to be unfamiliar to anyone not among the 14 billion regular readers of this blog.

Friday
Jan 19, 2018

Latest entries to CCP glossary thing: Science of #Scicomm; Rope-a-dope; and "From mouth of scientist. . . ."

Here are some more. At some point, I'll post the entire document & invite nominations for additional terms worthy of definition or explication therein.

Science of science communication. A research program that uses science’s own signature methods of disciplined observation and valid causal inference to understand and manage the processes by which citizens come to know what is known by science. [Sources: Oxford Handbook of the Science of Science Communication, eds. K.H. Jamieson, D.M. Kahan & D. Scheufele, passim; Kahan, J. Sci. Comm., 14(3) (2015). Added Jan. 19, 2018.]

From mouth of the scientist to ear of the citizen. A fallacious view that treats the words scientists utter as a causal influence on the formation and reform of public opinion on controversial forms of science. The better view recognizes that what science knows is transmitted from scientists to the public via dense, overlapping networks of intermediaries, which include not just the media but (more decisively) individuals’ peers, whose words & actions vouch for the science (or not) through their own use (or non-use) of scientific insights. Where there is a science communication problem, then, the source of it is the corruption of these intermediary networks, not any problem with how scientists themselves talk. [Source: Kahan, Oxford Handbook of the Science of Science Communication, eds. K.H. Jamieson, D.M. Kahan & D. Scheufele. Added: Jan. 19, 2018.]

Rope-a-dope. A tactic of science miscommunication whereby a conflict entrepreneur baits science communicators into fighting him or her in a conspicuous forum. The strength of the arguments advanced by the antagonists, the conflict entrepreneur realizes, is largely irrelevant. What matters is the appearance of a social controversy, which cues onlookers to connect the competing positions with membership in and loyalty to members of their cultural group. Falling for this gambit marks science communicators as the miscommunicators’ “dope.” [Source: Cultural Cognition Project blog here & here. Added: Jan. 19, 2018.]

 

Wednesday
Jan 17, 2018

Comment function restored!

So let's hear what you think.

Wednesday
Jan 17, 2018

"Meet the Millennials!," part 3: climate change, evolution, and generational polarization

This is part 3 of CCP’s hit series, “Meet the Millennials!”

In episode 1, we saw that the Millennials like to go to the zoo more often than do members of the Baby Boom, Generation X, and Silent Generation cohorts.

In episode 2, we observed that Millennials did better than other generational cohorts on a standardized test of science comprehension (the Ordinary Science Intelligence assessment), but were nevertheless no more science-curious than members of those other age brackets.

Now, in what will be either the final or not the final episode of the series, we take a look at how Millennials fare in their beliefs in human-caused climate change and human evolution.

What do we see?  This on climate change,

 

and this on evolution.

Basically, in relation to political outlooks, Millennials are the generational cohort least polarized on climate change. Similarly, in relation to religiosity, Millennials are the least polarized on acceptance of human evolution.

The difference in the degree of polarization, moreover, increases as the age differentials grow. There’s not much difference between Millennials (born 1982-1999) and members of Generation X (1965-1981). But there is a decided difference between Millennials and Baby Boomers (1946-1964), and an even greater difference between Millennials and members of the Silent Generation (born before 1946).

I can think of two explanations. The first is cohort shift: based on their experience and common exposure to social influences, the Millennials, while far from uniform in their assessments of controversial science issues, are less worked up over them. On this theory, we can expect a gradual, generational abatement of controversy over matters like climate change and evolution.

The second theory, however, is opinion shift: as they age, members of every generational cohort become more partisan and thus more divided on controversy-provoking forms of science.  There’s actually some literature to support this view, which I’ve commented on before.
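For readers curious about the mechanics, here is a minimal sketch of how one might quantify within-cohort "polarization" as the slope of climate-change belief on political outlook, estimated separately by cohort. The dataframe and column names (belief, conserv, cohort) are hypothetical placeholders, not CCP's actual variables:

```python
# Sketch: within-cohort "polarization" as the OLS slope of belief on
# political outlook, fit separately for each generational cohort.
# `df` is a hypothetical respondent-level dataset.
import pandas as pd
import statsmodels.formula.api as smf

def cohort_slopes(df: pd.DataFrame) -> pd.Series:
    """Return the slope of belief on conservatism within each cohort."""
    return df.groupby("cohort").apply(
        lambda g: smf.ols("belief ~ conserv", data=g).fit().params["conserv"]
    )

# A steeper slope means sharper division along political outlooks. On the
# opinion-shift account, slopes should grow monotonically with cohort age;
# on the cohort-shift account, the Millennial slope should stay flat over time.
```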

Normally, at this point I’d say, “What do you think?” But thanks to Squarespace (which has admitted that it views older sites like this one as “low priority”), the CCP blog’s comment function is broken.

So tell you what: If you’d like to comment on this post, send me an email, and I’ll manually insert your comments into the comment field. Use “Meet the Millennials!” in the subject line so I can be sure to spot the messages and take the work-around steps necessary to let you be heard.

Tuesday
Jan 16, 2018

Couple more items for CC dictionary/glossary

Dual process theory/theories. A set of decisionmaking frameworks that posit two discrete modes of information processing: one (often referred to as “System 1”) that is rapid, intuitive, and emotion pervaded; and another (often referred to as “System 2”) that is deliberate, self-conscious, and analytical. [Sources: Kahan, Emerging Trends in the Social and Behavioral Sciences (2016); Kahneman, American Economic Review, 93(5), 1449-1475 (2003); Kahneman & Frederick in Morrison (Ed.), The Cambridge handbook of thinking and reasoning (pp. 267-293), Cambridge University Press. (2005); Stanovich & West, Behavioral and Brain Sciences, 23(5), 645-665 (2000). Added Jan. 12, 2018.]

Motivated reasoning. A form of unconscious information processing characterized by the selective crediting and discrediting of evidence in patterns that advance some goal or interest independent of the apprehension of truth. Cultural cognition—the biased assessment of evidence protective of one’s status in identity-defining affinity groups—is one form of motivated reasoning. But there are many others, including self-serving apprehension of one’s own abilities, and inattention to evidence of one’s own mortality. Accordingly, cultural cognition should not be equated with motivated reasoning but rather be treated as a species of it. [Sources: Kunda (1990), Psychological Bulletin, 108, 480-498; Kahan, Harv. L. Rev., 126, 1-77 (2011), pp. 19-26. Added Jan. 15, 2018.]

* * *

Check out the entire dictionary/glossary document; it's getting pretty cool.

Monday
Jan 15, 2018

Aren't you curious to know how Millennials rate in science curiosity?!

Okay—so it was so easy to pretrodict that Millennials would be more likely than other age cohorts to visit a zoo in the last yr.

Well, try these:

(1)  Which age cohort displays greatest level of science comprehension?

The answer is … the Millennials!

Stands to reason given how often they visit the zoo, right?

Actually the margin isn’t particularly big—less than a third of a standard-deviation separates the Millennials from the Silent Generation, whose members had the lowest OSI_2.0 score.

(2) Does the edge that the Millennials enjoy in OSI mean that they are more science curious (as measured by the SCS scale) than members of other generations?

Nope:

Surprising?  Well, it shouldn’t be when we recall that Ordinary Science Intelligence is only modestly correlated with Science Curiosity. 

But maybe it should surprise us, given Millennials’ immersion in new communication technologies: they have a greater opportunity to form and nourish the desire to know how the technologies that surround them work....

Or maybe the immersion cuts the other way: due to the extraordinary advances in information technologies over the course of their adulthoods, the “Silents,” “Boomers,” and “Gen Xers” might have been expected to have a greater degree of awe than the Millennials, who’ve had that technology all around them their whole lives.

Well, "Everythig is obvious--once you know the answer," as they say.

Another thing to ponder here is the platykurtic (flatter peak, thinner tails than normal) distribution of the Millennials’ responses to the OSI assessment. . . . How should that affect our inferences?
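In case it helps to see the check in action, here is a quick, purely illustrative way to measure this with simulated data. Note that scipy's kurtosis() reports excess kurtosis, so the normal distribution scores ~0 and negative values signal a platykurtic shape:

```python
# Illustrative check for platykurtosis on simulated (not CCP) data.
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
normal_scores = rng.normal(size=10_000)
flat_scores = rng.uniform(-1.7, 1.7, size=10_000)  # platykurtic stand-in

print(kurtosis(normal_scores))  # ~ 0 (excess kurtosis of the normal)
print(kurtosis(flat_scores))    # ~ -1.2 (the uniform's excess kurtosis)
```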

How about another:

Before you know the answer, try to guess this one: are Millennials more likely than are other age cohorts to accept that human beings have caused climate change? (Don’t Google to find out results from general public opinion pollsters).

Answer “tomorrow.”™

Saturday
Jan 13, 2018

Introducing ... the Millennials! They like to go to the zoo!

As part of the CCP Science of Science Filmmaking project, I've been digging around in our existing data sets trying to learn more about the propensities of Millennials. Now & again I will share tidbits that seem worthy of note.

So ... here's one thing: the Millennials are more likely to go to the zoo than are members of other age cohorts:

Does this surprise you? Or did you know, of course, that Millennials are more regular zoo-goers (many of the older ones with children in tow, points out Loyal Listener Gaythia Weis)?

In the future, try to predict things like how the Millennials size up, say, on the Science Curiosity Scale or on the Ordinary Science Intelligence assessment, and we'll find out how good a sense you really have for cohort effects in relation to science communication.

For purposes of this and future entries, the age cohorts' birth years are as follows:

Millennials: 1982-1999
Generation X:  1965-1981
Boomers: 1946-1964
Silent generation: before 1946

 

Friday
Jan 12, 2018

A few more glossary entries: dual process reasoning; bounded rationality thesis; and C^4

I haven't had time to finish my "postcard" from Salt Lake City but here are some more entries for the glossary to tide you over:

Dual process theory/theories. A set of decisionmaking frameworks that posit two discrete modes of information processing: one (often referred to as “System 1”) that is rapid, intuitive, and emotion pervaded; and another (often referred to as “System 2”) that is deliberate, self-conscious, and analytical. [Sources: Kahneman, American Economic Review, 93(5), 1449-1475 (2003); Kahneman & Frederick in Morrison (Ed.), The Cambridge handbook of thinking and reasoning (pp. 267-293), Cambridge University Press. (2005); Stanovich & West, Behavioral and Brain Sciences, 23(5), 645-665 (2000). Added Jan. 12, 2018.]

Bounded rationality thesis (“BRT”). Espoused most influentially by Daniel Kahneman, this theory identifies over-reliance on heuristic reasoning as the source of various observed deficiencies (the availability effect; probability neglect; hindsight bias; hyperbolic discounting; the sunk-cost fallacy, etc.) in human reasoning under conditions of uncertainty. Nevertheless, BRT does not appear to be the source of cultural polarization over societal risks. On the contrary, such polarization has in various studies been shown to be greatest in the individuals most disposed to resist the errors associated with heuristic information processing. [Sources: Kahan, Emerging Trends in the Social and Behavioral Sciences (2016); Kahneman, American Economic Review, 93(5), 1449-1475 (2003); Kahneman & Frederick in Morrison (Ed.), The Cambridge handbook of thinking and reasoning (pp. 267-293), Cambridge University Press (2005); Kahneman, Slovic & Tversky, Judgment Under Uncertainty: Heuristics and Biases, Cambridge; New York: Cambridge University Press (1982). Added Jan. 12, 2018.]

Cross-cultural cultural cognition (“C4”). Describes the use of the Cultural Cognition Worldview Scales to assess risk perceptions outside of the U.S. So far, the scales have been used in at least five other nations (England, Austria, Norway, Slovakia, and Switzerland). [Source: CCP Blog, passim. Added Jan. 12, 2018.]

 

Thursday
Jan 11, 2018

"Kentucky farmer" spotted in Montana

This site's 14 billion regular subscribers know the Kentucky Farmer as one of the types of people whose habits of mind feature cognitive dualism--the tendency to adopt one set of action-enabling beliefs in one setting and another, opposing set of action-enabling beliefs in another. For Kentucky Farmer, this style of reasoning helps him to maintain his membership in a cultural group for whom climate-change skepticism is identity-defining while also using scientific information on climate change to be a good farmer.

Well, he was sighted recently, not in Kentucky but in Montana. The reporter for a story on the detrimental impact of climate change on barley farming is the one who spotted him:

In the field, looking at his withering crop, Somerfeld was unequivocal about the cause of his damaged crop – “climate change.” But back at the bar, with his friends, his language changed. He dropped those taboo words in favor of “erratic weather” and “drier, hotter summers” – a not-uncommon conversational tactic in farm country these days.

Great #scicomm by Ari LeVaux, the reporter.

But of course this form of information processing remains tinged with mystery.

Wednesday
Jan 10, 2018

Applying the Science of Science Communication

I’m giving a talk tomorrow on motivated numeracy at the University of Utah.  In the very generous allotment of time they’ve afforded me (30 mins or so), I should be able to make pretty good progress in showing why cultural cognition is not attributable to some defect in individual rationality. 

But I’ll still end up with things that I don’t have time to work in. Like the biased processing of information on whether one’s cultural adversaries process political information in a biased fashion. And the role curiosity can play in buffering the magnification of biased information processing associated with greater cognitive proficiency.

I’m sure many of you have experienced this sort of frustration, too.

Well, here’s how I plan to overcome this obstacle.  Likely you’ve seen salespersons at retail outlets wearing colorful “Ask me about . . .” buttons to promote prospective buyers’ awareness of and interest in some new product or service. 

So why shouldn’t academics do the same thing?

Consider:

 

I won’t be wearing these “buttons”—I didn’t have time to make them before I left home. But I will insert them into my slides at the point at which I allude to the relevant studies. Then, I figure, someone—his or her open-minded curiosity aroused—will surely “ask me!” about these ideas in the Q&A!

See how knowing about the science of science communication helps to promote effective communication of scientific data?

I'll report back tomorrow on how effective this device was.

Tuesday
Jan 9, 2018

Stupid smart phone or brilliant handgun? You make the call (so to speak)

Who do you think will fear this "smart-phone-disguised" handgun, who won't, & why? 

I have my own hypothesis, of course, but am eager to hear what others think.

Or maybe the existence of this gun/phone is "fake news"?...

 

Monday
Jan 8, 2018

Science communication environment; toxic memes; and politically motivated reasoning paradigm

Some more for Glossary. Arranged conceptually, not alphabetically.

Science communication environment and science communication environment “pollution.” To flourish, individuals and groups need to make use of more scientific insight than they have either the time or capacity to verify. Rather than become scientific experts on myriad topics, then, individuals become experts at recognizing valid scientific information and distinguishing it from invalid counterfeits of the same. The myriad cues and related influences that individuals use to engage in this form of recognition constitute their science communication environment. Dynamics that interfere with or corrupt these cues and influences (e.g., toxic memes and politically motivated reasoning) can be viewed as science-communication-environment “pollution.” [Sources: Kahan in Oxford Handbook of the Science of Science Communication, eds. Jamieson, Kahan & Scheufele, pp. 35-50 (2017); Kahan, Science, 342, 53-54 (2013). Added Jan. 8, 2018.]

Toxic memes. Recurring tropes and idioms, the propagation of which (usually at first by conflict entrepreneurs) fuses diverse cultural identities to opposing positions on some form of decision-relevant science. In the contaminated science communication environment that ensues, individuals relying on the opinion of their peers—generally a successful strategy for figuring out what science knows—polarize rather than converge on the best possible evidence. [Sources: Kahan, Scheufele & Jamieson, Oxford Handbook of the Science of Science Communication, Introduction (2017); Kahan, Jamieson et al., J. Risk Res., 20, 1-40 (2017). Added: Jan. 7, 2018.]

Politically motivated reasoning paradigm (“PMRP”) and the PMRP design. A model of the tendency of individuals of diverse identities to polarize when exposed to evidence on a disputed policy-relevant science issue. Starting with a truth-seeking Bayesian model of information processing, the PMRP model focuses on the disposition of individuals of diverse identities to attribute opposing likelihood ratios to evidence; this mechanism would assure that individuals of diverse identities will not converge but rather become more sharply divided as they process information. The PMRP design refers to study designs suited for observing this dynamic if it in fact exists. [Source: Kahan, D. M. in Emerging Trends in the Social and Behavioral Sciences (2016). Added: Jan. 8, 2018.]
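To make the mechanism concrete, here is a toy numerical rendering of PMRP-style updating. The prior odds and likelihood ratios are illustrative numbers of my own, not estimates from the cited work:

```python
# Toy illustration of PMRP-style divergence under Bayesian updating.
# Both groups see the same evidence; identity-protective cognition is
# modeled as opposing likelihood ratios. All numbers are illustrative.

def update_odds(prior_odds: float, lr: float, n_items: int) -> float:
    """Posterior odds after n_items pieces of evidence, each weighted by lr."""
    return prior_odds * lr ** n_items

prior = 1.0            # both groups start at the same 1:1 odds
lr_a, lr_b = 2.0, 0.5  # group A credits each item; group B dismisses it

for n in range(1, 6):
    odds_a = update_odds(prior, lr_a, n)
    odds_b = update_odds(prior, lr_b, n)
    print(f"after {n} items: A {odds_a:5.1f}:1  vs  B 1:{1 / odds_b:5.1f}")

# Truth-seeking Bayesians assigning a common likelihood ratio would converge;
# with opposing likelihood ratios, the same evidence drives the two groups
# progressively further apart -- the dynamic a PMRP design is built to detect.
```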