Tuesday, June 14, 2016

Two of "Four Theses on Ordinary Science Knowledge" . . . a fragment

From On the Sources of Ordinary Science Knowledge and Ignorance . . .

I. “Individuals must accept as known more decision relevant science (DRS) than they can possibly understand or verify for themselves.”

The motto of the Royal Society is Nullius in verba, which translates literally into “take no one’s word for it.” But something—namely, any pretense of being a helpful guide to getting the benefits of scientific knowledge—is definitely lost in a translation that literal.

If you aren’t nodding your head violently up and down, then consider this possibility. You learn next week that you have an endocrinological deficit that can be effectively treated, but only if you submit to a regimen of daily medications. You certainly will do enough research to satisfy yourself—to satisfy any reasonable person in your situation—that this recommendation is sound before you undertake such treatment.

But what will you do? Will you carefully read and evaluate all the studies that inform your physician’s recommendation? If those studies refer, as they inevitably will, to previous ones whose methods aren’t reproduced in those papers, will you read those, too? If the studies you read refer to concepts with which you aren’t familiar, or use methods with which you have no current facility, will you enroll in a professional training program to acquire the necessary knowledge and skills? And once you’ve done that, will you redo the experiments—all of them; not just the ones reported in the papers that support the prescribed treatment, but also any that those studies relied on and extended—so you can avoid taking anyone’s word on what the results of such studies actually were as well?

Of course not. Because by the time you do those things, you’ll be dead. To live well—or just to live—individuals (including scientists) must accept much more DRS than they can ever hope to make sense of on their own.

Science’s way of knowing involves crediting as true only inferences rationally drawn from observation. This was—still is—a radical alternative to other ways of knowing that feature truths revealed by some mystic source to a privileged few, who alone enjoy the authority to certify the veracity of such insights. That system is what the founders of the Royal Society had in mind when they boldly formulated their injunction to “take no one’s word for it.” But it remains the case that to get the benefits of the distinctive, and distinctively penetrating, mode of ascertaining knowledge they devised, we must take the word of those who know what’s been ascertained by those means—while being sure not to take the word of anyone else (Shapin 1994).

II. “Individuals acquire the insights of DRS by reliably recognizing it.”

But how exactly does one do that? How do reasonable, reasoning people who need to use science for an important decision but who cannot plausibly figure out what science knows for themselves figure out who does know what science knows?

We can rule out one possibility right away: that members of the public figure out who genuinely possesses knowledge of what science knows by evaluating the correctness of what putative experts believe. To do that, members of the public would have to become experts in the relevant domain of knowledge themselves. We have already determined (by simply acknowledging the undeniable) that they lack both the capacity and time to do that.

Instead they have to become experts at something else: recognizing valid sources of science. They become experts at that, moreover, in the same way they become experts at recognizing anything else: by using a conglomeration of cues, which operate not as necessary and sufficient conditions, but as elements of prototypical representations (“cat,” “advantageous chess position,” “ice cream sandwich,” “expert”) that are summoned to mind by mental processes, largely unconscious, that rapidly assimilate the case at hand to a large inventory of prototypes acquired through experience.  In a word (or two words), they use pattern recognition (Margolis 1993).
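
A rough sense of how such prototype-based recognition works can be conveyed in code. This is a minimal sketch, not anything from Margolis or the paper: the Prototype class, the cue names, and the numbers are all hypothetical, chosen only to illustrate how a case gets assimilated to the best-matching stored pattern without any single cue being necessary or sufficient.

```python
# Hypothetical sketch of prototype-based recognition (illustrative only).
from dataclasses import dataclass
import math

@dataclass
class Prototype:
    label: str                  # e.g., "valid source of science"
    features: dict[str, float]  # cue -> typical strength in the prototype

def similarity(case: dict[str, float], proto: Prototype) -> float:
    """Negative Euclidean distance over shared cues: closer means more similar."""
    shared = case.keys() & proto.features.keys()
    return -math.sqrt(sum((case[k] - proto.features[k]) ** 2 for k in shared))

def recognize(case: dict[str, float], inventory: list[Prototype]) -> str:
    """Assimilate the case at hand to the best-matching prototype in the inventory."""
    return max(inventory, key=lambda p: similarity(case, p)).label

# A toy inventory of prototypes acquired through experience.
inventory = [
    Prototype("valid source of science",
              {"credentials": 0.9, "peer_endorsement": 0.8, "hedged_claims": 0.7}),
    Prototype("huckster",
              {"credentials": 0.2, "peer_endorsement": 0.1, "hedged_claims": 0.1}),
]

case = {"credentials": 0.8, "peer_endorsement": 0.7, "hedged_claims": 0.9}
print(recognize(case, inventory))  # -> "valid source of science"
```

No single cue decides the classification; the overall pattern of cues does, which is what distinguishes prototype matching from checking necessary and sufficient conditions.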

This is equivalent to the answer Popper gave (in an essay the title, and much more, of which are the inspiration for this one) to the near-identical question of how we come to know what is known by science. Popper’s target was a cultural trope of sensory empiricism that treated as “scientific knowledge” only that which one has observed for oneself. After impaling this view on the spear tips of a series of reductios, Popper explains that “most things we know”—i.e., know to be known to science—“we have learnt by example, by being told.” In appraising the conformity of any such piece of information to the qualities that invest it with the status of scientific knowledge, moreover, an individual must rely on “his knowledge of persons, places, things, linguistic usages, social conventions, and so on” (Popper 1962b, p. 30).

To be sure, powers of critical reasoning play a role. We must calibrate this facility of recognition by “learning how to criticize, how to take and to accept criticism, how to respect truth” (ibid., p. 36), a view Baron (1993) and Keil (2012) both develop systematically.

But the objects of the resulting power to discern valid science are not the qualities that make it valid: those are simply far too “complex,” far too “difficult for the average person to understand” (Baron 1993, p. 193). What this faculty attends to instead are the signifiers of validity implicit in informal, everyday social processes that vouch for the good sense of relying on the relevant information in making important decisions (Keil 2010, 2012). Popper characterizes the aggregation of these processes as “tradition,” which he describes as “by far the most important source of our knowledge” (1962b, p. 36).

It is worth noting that although Popper here is referring to the process by which ordinary science knowledge disseminates to nonscientists, there is no reason to think that scientists are any less in need of a valid-knowledge recognition capacity, or that they acquire or exercise it in a fundamentally different way. Indeed, there is ample reason to think that it couldn’t possibly differ from the faculty that members of the public use to recognize valid science (Shapin 1994) aside from its being more finely calibrated to the particular insights and methods needed to be competent in the production of the same (Margolis 1987, 1996).

“How do we gain our knowledge about how to analyze data?” ask Andrew Gelman and Keith O’Rourke (2015, pp. 161-62). By “informal heuristic reasoning,” they reply, of the sort that enables those immersed in a set of practices to see the correctness of an answer to a problem before, and often without, being able to give a fully cogent account of why.

References

Baron, J. Why Teach Thinking? An Essay. Applied Psychology 42, 191-214 (1993).

Gelman, A. & O’Rourke, K. Convincing Evidence. in Roles, Trust, and Reputation in Social Media Knowledge Markets: Theory and Methods (eds. E. Bertino & S.A. Matei) 161-165 (Springer International Publishing, Cham, 2015).

Keil, F.C. Running on Empty? How Folk Science Gets By With Less. Current Directions in Psychological Science 21, 329-334 (2012).

Keil, F.C. The Feasibility of Folk Science. Cognitive Science 34, 826-862 (2010).

Margolis, H. Patterns, Thinking, and Cognition: A Theory of Judgment (University of Chicago Press, Chicago, 1987).

Margolis, H. Paradigms and Barriers (University of Chicago Press, Chicago, 1993).

Margolis, H. Dealing with Risk: Why the Public and the Experts Disagree on Environmental Issues (University of Chicago Press, Chicago, 1996).

Popper, K.R. On the Sources of Knowledge and of Ignorance. in Conjectures and Refutations 3-40 (Oxford University Press, London, 1962b).

Shapin, S. A Social History of Truth: Civility and Science in Seventeenth-Century England (University of Chicago Press, Chicago, 1994).

 

 


Reader Comments (5)

I want to think about how the pollution of the scientific communication environment (the spread of conflicting cultural heuristics of information valuation, or IPC more broadly) compares with Popper's concern about the "babble of tongues" within scientific discourse. This is particularly so in the context of ongoing slow-boil methods conflicts in several fields, arguably cases where the finely tuned, but unexamined, heuristics employed by scientists to recognize knowledge are similarly producing conflict.

While I think about that, a bug report- you appear to be missing some content in the first sentence of the paragraph starting "It is worth noting that". This is also present in the SSRN copy.

Also in the SSRN copy, the paragraph beginning "If these claims are right" appears to be missing a word between "value" and "science". The last paragraph is also missing the word "of" after "power".

June 14, 2016 | Unregistered CommenterRobert Marriott

I don't buy IV. Looking forward to the second half of this series.

June 14, 2016 | Unregistered Commenterdypoon

@Robert-- thank you! For the bug report & for the thoughts about KP

June 14, 2016 | Registered CommenterDan Kahan

@Dypoon-- will you buy after I make fixes proposed by @Robert?

June 14, 2016 | Registered CommenterDan Kahan

I believe that the important thing about science is that it is a process that is perpetually a work in progress. This process is highly imperfect, but also self-correcting. It has the ability to improve conclusions despite the fact that the four listed “false starts” are actually true much of the time. The public can be irrational; scientists can be highly partisan, or working off in obscure and ultimately insignificant corners of knowledge; denialism, both real and as an affectation driven by ulterior motives, is quite prevalent; and much of the time the public IS being manipulated by those in positions of power and the resulting sense of authority.

Medicine, as discussed in the example above, makes for good case studies of public interactions with science, because this topic is often of direct interest to individuals and the public at large, as with the endocrinological ailment above.

But the public also has plenty of experience with issues presented as medical fact which turned out to be debunked by later discoveries. So the best that can be said about the prescribed drug treatment is that it may be the method most likely to be successful, based on the best available knowledge at this time, and thus well worth trying.

I believe that this statement comes across as too authoritarian: “But it remains the case that to get the benefits of the distinctive, and distinctively penetrating, mode of ascertaining knowledge they devised, we must take the word of those who know what’s been ascertained by those means—while being sure not to take the word of anyone else (Shapin 1994).” And the conclusion fails to highlight a process that is an actual scientific method: ““How do we gain our knowledge about how to analyze data?” ask Andrew Gelman and Keith O’Rourke (2015, pp. 161-62). By “informal heuristic reasoning,” they reply, of the sort that enables those immersed in a set of practices to see the correctness of an answer to a problem before, and often without, being able to give a fully cogent account of why.” Actual scientists pursue investigations and collect data for all sorts of strange original rationales. The key is how they process that information and whether or not they are open to new interpretations; the ultimate “correctness of an answer” is not the goal. Rather, at least an incomplete but cogent account of why an answer might be correct, and of what additional investigations would support that conclusion, ought to be the objective.

Atul Gawande, MD (http://atulgawande.com/about/), author of Being Mortal, an excellent book on the American way of dying, has published his Caltech commencement speech here: The Mistrust of Science, http://www.newyorker.com/news/news-desk/the-mistrust-of-science.

From that article:

“Seen up close, the scientific community—with its muddled peer-review process, badly written journal articles, subtly contemptuous letters to the editor, overtly contemptuous subreddit threads, and pompous pronouncements of the academy—looks like a rickety vehicle for getting to truth. Yet the hive mind swarms ever forward. It now advances knowledge in almost every realm of existence—even the humanities, where neuroscience and computerization are shaping understanding of everything from free will to how art and literature have evolved over time.”

“The mistake, then, is to believe that the educational credentials you get today give you any special authority on truth. What you have gained is far more important: an understanding of what real truth-seeking looks like. It is the effort not of a single person but of a group of people—the bigger the better—pursuing ideas with curiosity, inquisitiveness, openness, and discipline. As scientists, in other words.”

Atul Gawande is imploring that the graduates actively become members of the community of scientists.

As he concludes:

“Even more than what you think, how you think matters. The stakes for understanding this could not be higher than they are today, because we are not just battling for what it means to be scientists. We are battling for what it means to be citizens.”

I think that IV above, pollution of the science communication environment, is a huge problem. One of the manifestations of that problem is a tendency of scientists themselves to get backed into corners of absolutism rather than humility as to what is scientifically known and what the unknown unknowns are likely to be. Another is that there really are those with vested self-interests who deliberately want to mislead. All of these play into how to approach the idea of "decision-ready science".

This makes I, II and III difficult in ways that deserve more discussion. How is it that our conversations as citizens can be driven by approaches that recognize and utilize decision-ready science?

From a paper which I reached from a tweet by Dan Kahan: "Belief polarization is said to occur when two people respond to the same evidence by updating their beliefs in different directions" (http://onlinelibrary.wiley.com/doi/10.1111/tops.12186/abstract). (But Dan might have meant this one, by the same two authors, also in 2016, which I haven't read: http://iopscience.iop.org/article/10.1088/1748-9326/11/4/048002/meta.)

I'd add that this quote actually also applies to groups of people, and that they are not necessarily looking at the same evidence; at the very least, said evidence is being presented in very different contexts. An important part of what this means is that once someone has staked out a side, there is a real temptation for those who disagree to stake out an increasingly oppositional side. So the problem with a debate that gets narrowly defined, perhaps by clever maneuvering by opponents, is that the sorts of broad-based discussions, open to new ideas, that enable the furthering of good science may fail to happen.
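
One minimal way to see how the same evidence can rationally push two observers in opposite directions is a toy Bayesian sketch. This is purely illustrative (the numbers and the trust assumptions are hypothetical, not drawn from the cited paper): the two observers differ only in their model of the source's reliability.

```python
# Hypothetical sketch of belief polarization as Bayesian updating (illustrative only).

def posterior(prior: float, p_report_if_h: float, p_report_if_not_h: float) -> float:
    """Bayes' rule: P(H | report), given how likely the report is under H and not-H."""
    num = prior * p_report_if_h
    return num / (num + (1 - prior) * p_report_if_not_h)

prior = 0.5  # both observers start undecided about hypothesis H

# Observer A trusts the source: a report asserting H is likelier if H is true.
print(posterior(prior, 0.9, 0.2))  # ~0.82 -- belief in H rises

# Observer B distrusts the source: the same report is likelier if H is false
# (e.g., a source seen as pushing an agenda regardless of the facts).
print(posterior(prior, 0.3, 0.7))  # 0.30 -- belief in H falls
```

Same report, opposite updates, each internally coherent given the observer's model of the source.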

Special interests can take advantage of this phenomenon by driving or limiting public conversations on a topic: by provoking attacks to retain publicity, or by focusing attacks on “those nut cases way over there” as a means of limiting what otherwise might be discussions covering a broader base of possibilities and concerns. Such tactics can also work as diversion. This past weekend, my visit to a local garden center was met with numerous signs advertising that their plants were from “GMO-free seed.” This message, intended to make customers feel good, was, IMHO, a means of not discussing something actually relevant: the much, much smaller labels noting that plants had been treated with neonicotinoid pesticides. Figuring out the relevant information is not always easy.

I agree that "We can rule out one possibility right away: that members of the public figure out who genuinely possesses knowledge of what science knows by evaluating the correctness of what putative experts believe." I think we need to look to approach. Are those we wish to utilize as experts themselves seemingly assembling and weighing the available evidence carefully and reaching conclusions that are both actionable in present time and yet still open to future re-interpretations?

For the endocrine patient here, assuming that this is "cutting edge" medicine, the discussions with presumably knowledgeable medical professionals are quite likely to lead to conclusions as to what the drug regimen is likely to do and why it may be the best currently available course of action, while still knowing that taking their word for it, even if it is a current consensus position, yields an incomplete version of the truth as it is likely to be known in the future. Deciding whom to trust ought to be a process of recognizing who is acting as an engaged member of the truth-seeking community of science.

June 15, 2016 | Unregistered CommenterGaythia Weis
