Nullius in verba? Surely you are joking, Mr. Hooke! (or Why cultural cognition is not a bias, part 1)
Tuesday, June 26, 2012 at 9:31AM
Dan Kahan

Okay, this is actually the first of two posts on the question “Is cultural cognition a bias?” The answer is “well, no, actually it’s not. It’s an essential component of human rationality, without which we’d all be idiots.”

But forget that for now, and consider this:


Nullius in verba means “take no one’s word for it.”

It’s the motto of the Royal Society, a truly remarkable institution whose members contributed more than anyone else to the formation of the distinctive, and distinctively penetrating, mode of ascertaining knowledge that is the signature of science.

The Society’s motto, “take no one’s word for it!” (i.e., figure out what is true empirically, not on the basis of authority), is charming, even inspiring, but also utterly absurd.

“DON’T tell me about Newton and his Principia Mathematica,” you say. “I’m going to do my own experiments to determine the Law of Gravitation.”

“Shut up already about Einstein! I’ll point my own telescope at the sun during the next solar eclipse, place my own atomic clocks inside airplanes, and create my own GPS system to ‘see for myself’ what sense there is in this relativity business!”

“Fsssssss. I don’t want to hear anything about some uncertainty principle of Heisenberg’s. Let me see for myself whether it is possible to determine the precise position and precise momentum of a particle simultaneously.”

After 500 years of this, you’ll have caught up to this week’s Nature, which will at that point be only 500 years out of date.

But, of course, if you “refuse to take anyone’s word for it,” it’s not just your knowledge of scientific discovery that will suffer. Indeed, you’ll likely be dead long before you figure out that the earth goes around the sun rather than vice versa.

If you think you know that antibiotics kill bacteria, say, or that smoking causes lung cancer because you have confirmed these things for yourself, then take my word for it: you don’t really get how science works. Or better still, take Popper’s word for it; many of his most entertaining essays were devoted to punching holes in popular sensory empiricism, the attitude that one has warrant for crediting only what one “sees” with one’s own eyes.

The amount of information it is useful for any individual to accept as true is gazillions of times larger than the amount she can herself establish as true by valid and reliable methods (even if she cheats and takes the Royal Society’s word for it that science’s methods for ascertaining what’s true are the only valid and reliable ones).

This point is true, moreover, not just for “ordinary members of the public.” It goes for scientists, too.

In 2011, three physicists won the Nobel Prize “for the discovery of the accelerating expansion of the Universe through observations of distant supernovae.” But the only reason they knew that what they (with the help of dozens of others who collected and analyzed their data) were “observing” even counted as evidence of the Universe’s accelerating expansion was that they accepted as true the discoveries of countless previous scientists. Those were experiments the three could never hope to replicate; indeed, they didn’t even have time to acquire the understanding of why those experiments signified anything at all, and so simply took it as given.

Scientists, like everyone else, are able to know what is known to science only by taking others’ words for it.  There’s no way around this. It is a consequence of our being individuals, each with his or her own separate brain.

What’s important, if one wants to know more than a pitiful amount, is not to avoid taking anyone’s word for it. It’s to be sure to “take it on the word” of only those people who truly know what they are talking about.

Once this point is settled, we can see what made the early members of the Royal Society, along with various of their contemporaries on the Continent, so truly remarkable. They were not epistemic alchemists (although some of them, including Newton, were alchemists) who figured out some magical way for human beings to participate in collective knowledge without the mediation of trust and authority.

Rather their achievement was establishing that the way of knowing one should deem authoritative and worthy of trust is the empirical one distinctive of science and at odds with those characteristic of its many rivals, including divine revelation, philosophical rationalism, and one or another species of faux empiricism.

Instead of refusing to take anyone's word for it, the early members of the Royal Society retrained their faculties for recognizing "who knows what they are talking about" to discern those of their number whose insights had been corroborated by science’s signature way of knowing.

Indeed, as Steven Shapin has brilliantly chronicled, a critical resource in this retraining was the early scientists’ shared cultural identity. Their comfortable envelopment in a set of common conventions helped them to recognize which of their number genuinely knew what they were talking about and could be trusted, because of their good character, not to abuse the confidence reposed in them (usually; reliable instruments still have measurement error).

There’s no remotely plausible account of human rationality—of our ability to accumulate genuine knowledge about how the world works—that doesn’t treat as central individuals’ amazing capacity to reliably identify and put themselves in intimate contact with others who can transmit to them what is known collectively as a result of science.

Now we are ready to return to why I say cultural cognition is not a bias but actually an indispensable ingredient of our intelligence.

Part 2
