
Friday, March 7, 2014

Q. Where do cultural predispositions come from in the cultural cognition theory? A. They are exogenous -- descriptively & *normatively*!

A thoughtful friend & correspondent asks:

The question that you must have been asked many times is, ultimately, how do people obtain their cultural orientations?


If I read between the lines, part of the answer seems to be that these orientations are seeded by the people we associate with and the authorities we seek — perhaps by chance. After that seed is planted, then it becomes a self-reinforcing process: We continue to seek like-minded company and authorities, which strengthens the orientations, and the cycle continues. But there must be more to it than that. Genetics? Some social or cultural adaptive process? I'd like to say something about how we arrive at our cultural orientations.

My response:

I think the model/process you describe is pretty much right. I'd say, though, that the cycle -- the seeking out, the reinforcement -- is not the problem; indeed, it's part of the solution to the puzzle of what makes it possible for people (diverse ones, ones who can't just be told what's what by some central authority) reliably to identify what's collectively known. They immerse themselves in networks of people who they can understand and are motivated to engage and cooperate with, and use their rational faculties to discern inside of those affinity groups who genuinely knows what about what (that is, who knows what's known to science). When this process short circuits & becomes a source of self-reinforcing dissensus, that is a sign not that the process is pathological but that a pathology has infected the process, disabling our normal and normally reliable rational capacity to figure out what's known.

However, we notice the cultural insularity of our process for figuring things out only when it fails, & infer "there's a problem w/ the insularity & self-reinforcement!" But that's a kind of selection bias in our attention to such things: we are observing the process only when it is failing in a spectacular way. If we paid attention to the billions of boring cases where diverse people agree, we'd see the same insularity in the process by which diverse people figure things out. Then we'd properly infer that the problem is not the process but some external condition that corrupts it. At that point, we would focus our reason, guided by the methods of empirical inquiry, to figure out the dynamics of the pathology -- and ultimately to control them...

You then ask me -- where do these affinities, which are the source of the predispositions (the environment in which we figure out what's what), come from? I don't know!

Or likely I more or less know & the answer isn't *that* interesting: we are socialized into them by the accident of who our parents are & where we live. That's the uninspiring "descriptive" account.

A more inspiring normative answer (maybe it's just a story? but it has the right moral, morally speaking) is this: we are autonomous, reasoning agents in a free society; it is inevitable that we will form a plurality of understandings of the best way to live. That isn't the problem; it's the political way of life to be protected. So let's take our cultural plurality as given & "solve" the "science communication problem" by removing the influences that conduce to dissensus & polarization, & that disrupt the usual consensus & convergence of free & reasoning citizens on the best (currently) available evidence....

Some perhaps relevant posts (best I can do, until you help me): 

But I will invite other readers of this blog to comment--likely they can do better!



Reader Comments (5)

I see two causal factors.

The first is psychological. We are born into a social group with certain fundamental identifying philosophies or moral constructs. We create our personal identity against that background. Our personal identity might be highly congruent with those societal norms (typically the case) or it might be less congruent (which would be the case with "outliers"). Once our identity is formed, we protect it. Part of the way that we protect it is by seeking evidence that confirms that our identity is "right," or "moral," or intelligent enough to evaluate evidence correctly. So we seek that kind of confirmation when we look at issues that stimulate our sense of our own identity. Part of the way that we protect it is by identifying "others," who are not like us. Identifying an "other" increases my sense of my self. What I'm describing is not mutually exclusive with the dynamic described above, but the individual identity protection (and aggression) is more central, and the group identity protection (and aggression) is more secondary, or a by-product of sorts.

The second is cognitive. We reason by finding patterns. We evaluate all evidence for patterns. One pattern, for example, is that if a large proportion of smart and knowledgeable people believe something, it is more likely to be correct. Another pattern is that the prevalence of opinion among smart and knowledgeable people can lead to a mistaken confidence about the truth of a matter. When we look at a given issue, we seek to understand that issue by finding patterns - so we have a tendency to actively shape our perceptions of the evidence into patterns. Sometimes we shape it one way - say into the "Everyone should do what they can to pay for their healthcare, or otherwise they are a moocher" pattern. And at other times we shape it into the "Having an insurance mandate where everyone does what they can to pay for their healthcare is the worst form of tyranny" pattern.

The cognitive aspect underlies the psychological aspect.

March 7, 2014 | Unregistered CommenterJoshua

Another point worth noting is that the 'pathological' mechanisms that affect our ability to "reliably" tell what is known may be more common than we think. We only notice them when they affect us *differently*, so some people go one way and some go another. If we *all* took the same wrong path, nobody would notice. There would be no disagreements or fights between competing interpretations, nobody looking for weaknesses in the paradigm.

The basic problem is that the methods most people use are not reliable. They are heuristics that have value because they *often* work, but they offer no guarantees. There is a relatively high failure rate. Because many of these methods rely on trusting other people to check -- people who are themselves trusting other people -- you tend to get self-reinforcing networks of 'groupthink'. Sometimes it spreads to the whole network. Sometimes it breaks up into 'domains', each with its own orientation. There's a lot of maths that has gone into studying systems that work that way -- crystal growth and magnetisation and phase changes. There may be some way to apply that body of theory to this problem.
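
To picture the "domains" idea, here is a minimal sketch of the sort of model being gestured at -- a voter-model-style simulation on a grid in which each agent simply copies a random neighbour. The grid size, the number of steps, and the copy-a-neighbour rule are illustrative assumptions of mine, not anything taken from the comment; the point is only that purely local imitation produces contiguous blocks of agreement, much like magnetic domains.

```python
# A rough sketch (illustrative assumptions only): a voter-model-style simulation
# on a wrap-around grid, showing how purely local copying of neighbours' opinions
# produces self-reinforcing "domains" of agreement.
import random

SIZE = 30          # the grid is SIZE x SIZE
STEPS = 200_000    # number of copy events

def neighbours(i, j):
    """Four nearest neighbours, with wrap-around at the edges."""
    return [((i - 1) % SIZE, j), ((i + 1) % SIZE, j),
            (i, (j - 1) % SIZE), (i, (j + 1) % SIZE)]

# Start from random opinions, coded +1 or -1.
grid = [[random.choice([-1, 1]) for _ in range(SIZE)] for _ in range(SIZE)]

for _ in range(STEPS):
    i, j = random.randrange(SIZE), random.randrange(SIZE)
    ni, nj = random.choice(neighbours(i, j))
    grid[i][j] = grid[ni][nj]    # adopt a randomly chosen neighbour's opinion

# The split drifts away from 50/50, and the grid ends up showing contiguous
# blocks of agreement rather than a well-mixed pattern.
plus = sum(cell == 1 for row in grid for cell in row)
print(f"share holding opinion +1: {plus / SIZE**2:.2f}")
```

Run it a few times and the final share swings from run to run -- sometimes one opinion takes over most of the grid, sometimes it settles into separate blocs for a long while, which is the "whole network vs. domains" behaviour described above.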

However, given that the methods we use *are* unreliable, it's not obvious that having everyone jump the same way is a good idea. There are advantages to everybody being right, but they are potentially outweighed by the penalties of everybody being wrong. An alternative approach is to *encourage* a diversity of viewpoint, so that while you always incur some societal costs from the subset that has taken the wrong path, you get to explore a much wider range of options, you can allow them to compete, and each of them critically examines all the others, looking for flaws and pointing them out. Composite materials are often tougher than pure ones, because each component, with its own strengths and weaknesses, complements all the others. Single crystals are stronger in some ways, but tend to be brittle. When they fail, they fail catastrophically, while disordered materials only fail up to the next domain boundary. Composites are more flexible and resilient, and capable of adapting to changing stresses.
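
The trade-off in the first paragraph can be put in toy numerical form. In the sketch below, the success probability and the two cost figures are arbitrary assumptions chosen only to expose the structure of the argument, not estimates of anything real.

```python
# A toy cost comparison (all numbers are arbitrary assumptions): a society where
# everyone follows one fallible method versus one split across several
# independent fallible methods.
import random

P_RIGHT = 0.7           # assumed chance that any one method gets the answer right
COST_ALL_WRONG = 10.0   # assumed cost when *everyone* is wrong
COST_SOME_WRONG = 3.0   # assumed (smaller) cost when only a subset is wrong
TRIALS = 100_000

def monoculture_cost():
    # Everyone jumps the same way, so the outcome is all-or-nothing.
    return 0.0 if random.random() < P_RIGHT else COST_ALL_WRONG

def diverse_cost(groups=4):
    # Each group follows its own independent method; errors are contained.
    right = sum(random.random() < P_RIGHT for _ in range(groups))
    if right == groups:
        return 0.0
    if right == 0:
        return COST_ALL_WRONG
    return COST_SOME_WRONG

mono = sum(monoculture_cost() for _ in range(TRIALS)) / TRIALS
div = sum(diverse_cost() for _ in range(TRIALS)) / TRIALS
print(f"average cost, monoculture: {mono:.2f}")
print(f"average cost, diverse:     {div:.2f}")
```

Under these made-up figures the diverse society comes out a little ahead on average; raise the cost of partial disagreement, or the reliability of the shared method, and the ordering reverses -- which is why it is an empirical trade-off rather than something settled in the abstract.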

As for the sources of influence, there are only partial lists. People are affected by the culture they grow up in, by their friends, family, colleagues. By what books they read or enjoy, by their hobbies and likes and dislikes, and how these fit with society. Some of it is probably genetic or developmental. Some is due to particularly memorable or influential life experiences. Some will depend on whether you have particular talents, or whether the people you know do. It might depend on what accent you have, or your sex, or your weight, or hair colour - affecting how people treat you.

And there is a lot of influence from society's deliberate attempts to encourage cohesion: education, propaganda, law enforcement, clubs and societies, advertising campaigns, morals, subsidies and tariffs and regulation. Or rebellion and resistance to the same.

Like the weather, there are likely so many different influences, any of which may be decisive, most of which are unknowable, that there is little hope for any sort of simple model. It may be fundamentally unpredictable, and effectively random. But I don't know.

March 7, 2014 | Unregistered CommenterNiV

@NiV:

definitely there are additional mechanisms/influences that interfere with the ordinary & ordinarily very reliable rational faculties we use to recognize "what's known." They certainly interfere more often than we realize -- b/c it is in their nature to evade our consciousness of their pernicious effects as they occur! But we do, as you know, have knowledge of the existence & dynamics of quite a good number of these reason confounders & some knowledge -- even more would be nice -- about how to protect ourselves from them.

I'm glad you made the point -- it should go w/o saying, but b/c I don't say it there is a risk it will be overlooked -- that the particular set of reason-effacing influences I'm concerned w/ aren't the only ones out there or the only ones we should worry about.

March 8, 2014 | Unregistered Commenterdmk38

@Joshua--

I think yours is a compelling & useful synthesis. Related reflections on my part here.

It is useful & interesting to think further about the distinctive qualities & relative importance of the 2 processes as well as their interaction.

March 8, 2014 | Registered CommenterDan Kahan

Dan -

Yes, I think that it is useful to consider the different dynamics that you outline in that post.

Part of what I go back to is how I see "motivated reasoning" in myself all the time - when I'm arguing with my partner, for example.

I'm always catching myself reasoning in ways that reshape reality to confirm my bias that I'm right and she's wrong w/r/t a whole host of issues. I will even sometimes completely reshape my memory of what she said and what I said so as to create the pattern that makes me "right" and her "wrong." I think of it as a cognitive parallel of the McGurk effect - where there are competing influences that shape what patterns I "see." In that inter-personal context, my motivated reasoning has little, if anything at all, to do with group identity or world views, although it has a lot to do with identity protection.

March 8, 2014 | Unregistered CommenterJoshua
