Friday, November 28, 2014

Group conflict and risk perceptions: two accounts

This is just the first post in a series to address a very small question that I’m sure we can quickly dispose of.

But here’s the question:

I’m sure the vast majority of you need no further explanation. But for newbies, this is a “tweet” from “Fearless Dave” Ropeik, the public risk perception expert who correctly believes it is irrational to worry about anything. Likely you all remember the discussion we recently had about how Fearless Dave had his kids go over & play with the next-door neighbors’ children when they had Ebola because he figured it was much better for his kids to get the disease when they were young than when they were grown-ups. Of course—this is the perfect System 2 rationality we all aspire to!

But anyway, what he’s asking is—why do cultural affinities (like being an “egalitarian communitarian” as opposed to a “hierarch individualist”) make such a big difference in perceptions of the risk of climate change, or owning a handgun, or nuclear energy?

Fearless Dave doesn’t mean why as in “what are the mechanisms that generate such big disparities in the proportion of people of one type who believe that human beings are heating up the climate & the proportion of another type who believe that?”; he’s quite familiar with (and a very lucid expositor and insightful interpreter of) all manner of work on risk perception, including the research that shows how people of opposing identities conform all manner of information—from their interpretation of data to their assessments of arguments to their perception of the expertise of scientists to what they observe with their own eyes—to the position that predominates in their group.

What he wants to know is why these cognitive mechanisms are connected to group identities.  Why are people so impelled to fit their views to their groups'? And why do the groups disagree so intently?

Is there, Fearless Dave wonders, some sort of genetic hard wiring having to do with the evolutionary advantages, say, that “Democratic” or “nonreligious” cavepeople & “Republican” “religious” cavepeople got from forming opposing estimates of the risk of being eaten by a sabre-tooth tiger on the savannah--and then going to war w/ each other over their disagreement?

Really good question.

I don’t know.

But I and a few other twitterers offered some conjectures:

Now probably this exchange needs no explanation either.

But basically, I and Jay Van Bavel are disagreeing about the reason cultural identities generate conflicting perceptions of risk and like facts.  

Or maybe we aren’t.  It’s hard to say.

While Twitter is obviously the venue most suited for high-quality scholarly interaction, I thought I’d move the site of the exchange over to the CCP Blog--so that you, the 12 billion regular readers of this blog (for some reason 2 billion people unsubscribed after my last post!),  could participate in it too.

Just to get the ball of reasoned discussion rolling, I’m going to sketch out two competing answers to Fearless Dave’s question: the “Tribal Science Epistemologies Thesis” (TSET) and the “Polluted Scicomm Environment Thesis” (PSET). The answers aren't "complete" even on their own terms, but they convey the basics of the positions they stand for and give you a sense of the attitudes behind them too.

TSET. People are by nature factional. They use in-group/out-group distinctions to organize all manner of social experience—familial, residential, educational, occupational, political, recreational (“f***ing Bucky Dent!”).  The ubiquity of this impulse implies the reproductive advantage it must have conferred in our formative prehistory. Its permanence is testified to by the unbroken narrative of violent sectarianism our recorded history comprises.

The mechanisms of cultural cognition reflect our tribal heritage. The apprehension of danger in behavior that deviates from a group’s norms fortifies a group’s cohesion. Imputing danger to behavior characteristic of a competing group’s norms helps to stigmatize that group’s members and thus lower their status.  Cultural cognition thus reliably converts the fears and anxieties of a group’s members into the energy that fuels that group’s drive to dominate its rivals.

In a democratic political order, these dynamics will predictably generate cultural polarization. Opposing positions on societal risks (climate change, gun ownership, badger infestation) supply conspicuous markers of group differentiation. Democratically enacted policies endorsing or rejecting those positions supply evocative gestures for marking the relative status of the groups that hold them.

Nothing has really changed.  Nothing ever will. 

PSET. Cultural conflict over risk and related facts is not normal. It is a pathology peculiar to the pluralistic system of knowledge certification that characterizes a liberal democratic society. 

Individuals acquire their understanding of what is known to science primarily through their everyday interactions with others who share their basic outlooks. Those are the people they spend most of their time with, and the ones whose professions of expertise they can most reliably evaluate. Because all self-sustaining cultural groups  include highly informed members and intact processes for transmitting what they know, this admittedly insular process nevertheless tends to generate rapid societal convergence on the best available evidence.  

But not always. The sheer number of diverse groups that inhabit a pluralistic liberal society, combined with the tremendous volume of scientific knowledge such a society is distinctively suited to generating, makes occasional states of disagreement inevitable.

Even these rare instances of nonconvergence are likely to be fleeting.

But if by some combination of accident, misadventure, and strategic behavior, opposing perceptions of risk become entangled in antagonistic cultural meanings, dissensus is likely to endure and feed on itself. The material advantage any individual acquires by maintaining her standing within her cultural group tends to exceed the advantage of holding personal beliefs in line with the best evidence on societal risks. As a result, when people come to regard  positions on risk as badges of membership in one or another group, they will predictably use their reason to persist in beliefs that express their cultural identities.
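
(To make this cost-benefit logic concrete, here is a toy expected-payoff calculation. Every number in it is made up purely for illustration; nothing here comes from an actual study.)

```python
# Toy illustration (hypothetical numbers) of the claim above: one citizen's
# belief has ~zero effect on collective outcomes, but a real effect on his
# or her standing within a cultural group.
p_pivotal = 1e-7        # chance one person's belief changes any policy outcome
policy_stakes = 1e6     # personal value of society getting the policy "right"
standing_cost = 100.0   # social cost of visibly dissenting from one's group

expected_gain_from_accuracy = p_pivotal * policy_stakes   # 0.1
net_payoff_of_dissent = expected_gain_from_accuracy - standing_cost

print(net_payoff_of_dissent)  # -99.9: fitting in beats being right
```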

This identity-protective variant of cultural cognition is the signature of a polluted science communication environment.  The entanglement of risks in antagonistic cultural meanings disables human reason and deprives the citizens of the Liberal Republic of Science of their political regime’s signature benefits: historically unprecedented civil tranquility and a stock of collective knowledge bountiful enough to secure their well-being from all manner of threat, natural and man-made.

But we can use our reason and our freedom to overcome this threat to our reason and our freedom.  Dispelling the toxin of antagonistic cultural meanings from our science communication environment is the aim of the science of science communication—a “new political science for a world itself quite new.”

So? Which is closer to the truth—TSET or PSET? 

What are the key points of disagreement between them? What might we already know that helps us to resolve these disagreements, and what sorts of evidence might we gather to become even more confident?

What are the alternatives to both TSET and PSET? Why might we think they are closer to the truth? How could we pursue that possibility through observation, measurement, and inference?

And what does each of the candidate accounts of why “group affiliation” has such a profound impact on our perception of risk and like facts imply about the prospects for overcoming the barrier that cultural polarization poses to making effective use of scientific knowledge to promote our ends, individual and collective?

BTW, why do I say "closer to the truth" rather than "true"? Because obviously neither TSET nor PSET is true, nor is any other useful answer anyone will ever be able to give to Fearless Dave's question. The question isn't worth responding to unless the person asking means, "what's a good-enough model of what's going on--one that gives me more traction than the alternatives in explaining, predicting, and managing things?"

So ... what's the answer to Fearless Dave's question? Do TSET & PSET help to formulate one?


Reader Comments (11)

What percentage of scientists are climatologists?

November 29, 2014 | Unregistered CommenterSteve Sailer

"What percentage of scientists are climatologists?"

I don't know, but when Doran and Zimmerman polled "Earth Scientists", 3146 responded to the survey, of whom 79 were judged to be climate scientists. That would be 2.5%. Of course, "Earth scientists" doesn't cover all scientists, and the sampling is very likely to be biased towards those with an interest in climate issues, given the questions it asks.

http://tigger.uic.edu/~pdoran/012009_Doran_final.pdf

"The apprehension of danger in behavior that deviates from a group’s norms fortifies a group’s cohesion."

Humans are social animals. This is quite tricky to do - living in close proximity, humans have to constrain their behaviour to avoid stepping on the toes of their neighbours, they have to allocate resources with minimal conflict, and they have to communicate their plans in order to work together in cooperation, or at least not in conflict.

Humans have two instinctive behaviours that enable them to do this: language and morality. Any social animal must have some version of these. Ants communicate by chemical messages and follow strict rules. Wolves have a variety of display behaviours to indicate dominance or submission, and follow pack rules. Even chickens have a pecking order. But humans take both language and morality to another level by making theirs adaptive. Having a language is instinctive, but the details of vocabulary and usage can vary. They're agreed collectively, as people pick up words from other people, and search for innovative ways to communicate new thoughts. The mapping from sound to meaning is arbitrary, but that does not mean any single individual can make it all up. To communicate with their neighbours, they have to mostly conform to the group conventions. Similarly with morality - the social rules on behaviour are agreed collectively, people pick up rules from other people, either by watching how they behave with one another or by their response when you conform to or cross the boundaries. And when people engage in a new social activity they quickly develop a new etiquette of behaviour for it. The internet is a fascinating study for this.

Both language and morality are systems designed for a single uniform society - a tribe - where the norm is that everyone agrees more or less on them. Those who don't conform are soon pushed back into line, if they don't move back themselves as soon as they realise they're standing out. Tribal conflict is therefore like the communication problems you get when two peoples speaking different languages come together. Neither has a clue what the others are saying, or what they're going to do. Cooperation becomes near impossible - humans in this situation have to fall back on their common animal heritage of gesture and mime to communicate, and it's crude, slow, and rife with misunderstanding. Likewise with culture - people from each society finding themselves crossing the uncrossable boundaries of the other society. The violent conflict that naturally arises between animals packed together in close proximity re-arises, combined with the additional mechanisms for enforcing social conformity - all the unpleasant stuff that the moral system evolved to prevent.

But humans are complicated, and can exist in multiple overlapping groups at the same time, each with their own language and rules. At the simplest level, people behave differently with their close family, their next-door-neighbours, and with random strangers within their society. Teenagers often form their own group with language and rules different from adult society. Men and women operate different rules between themselves. And people operate by different rules between work and leisure activities. However, in most of these cases they know both sets of rules, and both languages, and can switch from one to the other depending on who they're with. Usually, overlapping groups living together develop their own intra-cultural inter-group etiquette governing which set of rules are in effect at any given time.

The rules are developed through interaction - they start to diverge when groups are isolated from one another - when people don't talk, don't meet, don't socialise or work together often enough for these minor conflicts to arise and get settled.

" The material advantage any individual acquires by maintaining her standing within her cultural groups tends to exceed the advantage of holding personal beliefs in line with the best evidence on societal risks."

That's the wrong way of thinking about it. Following the cultural group's opinion on who has the greatest expertise is an individual's best estimate of what is the best evidence on societal risk. "Best evidence" isn't an objective standard knowable by all, as most people don't have the expertise to interpret it. Or the time to dig it out and comprehend it. So they take other people's word for it. The cultural groups are not what gets in the way of this - they are what actually does this. When you take other people's word for it, you follow the herd. You are always assuming that somebody else somewhere in the herd knows where you're supposed to be going.

We only notice this effect when we get two herds going in opposite directions. We ask ourselves: "Hey! Why are all those people over there going that way?! They're going in the wrong direction!"

Are they stupid? Are their leaders maliciously misleading them in the wrong direction? Is the problem that the individuals in it think sticking with the herd is more important to them than going in what they must surely know is the right direction? Do you suppose all the creatures in that herd are thinking "I know this is the wrong way, but I must stick with the herd!"?

Or could it be that they are just as puzzled by you, and that in fact everybody is just following the herd, nobody knows where they're going, and it's just random. And you only notice this when the herd splits. Maybe on a lot of the issues where there is no split, we're just as misled? Is it just that because everybody is going in the same direction, that we believe we must therefore have all converged on the right direction? Why do we think so?

Maybe one herd does know where it's going, and the other doesn't. But as a herd member, how could you tell which was which? As a herd-follower - one who rejects 'nullius in verba' as an impossible and impractical standard for any individual - how could you even ask the question?

November 29, 2014 | Unregistered CommenterNiV

What about the fact that the "styles" of cognition active on a cultural level are paralleled at the individual level? I mean that many of the things we do to protect cultural identity (motivated reasoning, confirmation bias...) we also do to protect individual opinions that needn't have any cultural connection. If you try to categorize the root of cultural cognition, are you implying that "individual cognition" flows from the same place? Or that it converges on similar behaviors, but from a different root? (And there are more possibilities, including that cultural cognition flows from "individual cognition".)

I mean, sure, an issue has to become "polluted" for differences to become large in a population, amplified by cultural meanings. But issues don't have to be "polluted" for us to engage in defensive cognitive actions. It feels like there's something shared between these contexts that isn't being accounted for in your TSET/PSET answers. Am I making any sense here?

November 29, 2014 | Unregistered CommenterScott Johnson

"I don't know, but when Doran and Zimmerman polled "Earth Scientists", 3146 responded to the survey, of who 79 were judged to be climate scientists. That would be 2.5%."

So climate scientists are 1/40th of Earth scientists, who are 1/? of all scientists. In other words, climate scientists are likely less than 1% of all scientists.

So why the equation of climatology with Science with a capital S?

November 30, 2014 | Unregistered CommenterSteve Sailer

@NiV

That's the wrong way of thinking about it.

I 63.25% agree with this.

I think it is a very sensible "strategy" to canvass the population for evidence of what's known -- that one has to do something like this b/c of the limits on any individual's capacity to comprehend all that's known on his or her own, especially as it continues to multiply in volume & consequence.

I think more than "following the herd" is involved. Being a good canvasser involves both sensible sampling -- knowing whose lead to follow, knowing who is just a fool & the like -- & a fine-tuned apparatus for extracting sense from the data.

I think in fact this is not really "following" at all but rather reasoning of a very sophisticated, highly developed kind. And it fills me with awe.

But none of this gets to the 36.75% really.

That all goes to the "two herds" part. There I think you are just sort of waving your hands, when in fact this is what we must get hold of.

If one were simple-minded about this--often it is helpful to start that way--one could devise an agent-based "learning" model in which groups w/ insular "search" strategies routinely end up polarized on "what's known." That would be much like what you are describing.
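
A minimal sketch of what such a model could look like (purely illustrative -- the group sizes, sample size, and initial tilts are all made-up assumptions, not anything from an actual study):

```python
# Agents poll only a small sample of their OWN group ("insular search")
# and adopt the local majority view about some binary risk fact.
import random

def simulate(n=200, sample_k=5, steps=5000, seed=42):
    random.seed(seed)
    # the two groups start with opposite slight tilts
    groups = {
        "A": [random.random() < 0.60 for _ in range(n)],  # 60% believers
        "B": [random.random() < 0.40 for _ in range(n)],  # 40% believers
    }
    for _ in range(steps):
        for beliefs in groups.values():
            i = random.randrange(n)
            polled = random.sample(range(n), sample_k)  # same-group sources only
            beliefs[i] = sum(beliefs[j] for j in polled) > sample_k / 2
    return {g: sum(b) / n for g, b in groups.items()}

print(simulate())  # typically {'A': 1.0, 'B': 0.0} -- stable polarization
```

Each group converges on its own initial majority & stays there -- polarization from nothing but insular sampling.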

But such a model would only show how much more complicated what we are actually dealing with is.

Because there are almost never states of divergence like this; once one includes the entire class of knowledge that must be acquired and transmitted this way, it becomes obvious that the states of polarization are a near-microscopic proportion of the total!

The dynamics that lock nonconvergence in place, moreover, consist of a kind of super-energized bond of group hatred -- not a kind of robotic state of myopic copying.

If you think *rationality* is not involved at that point, in fact, then you are leaving far too much of the data lying unexplained on the table. This is not following anything, not trying to acquire knowledge of the facts in any way whatsoever. It is the deployment of critical reasoning to achieve what is obviously a valued end: congruence between one's own views and one's group's on an issue the resolution of which is at the center of a status competition with a rival group.

I think if you believe the first part of your account -- of the way in which affinities figure in participating in collective knowledge -- is not a false start, then TSET is already on its heels.

But if you don't have an explanation other than the instinct to master other groups for the aggressive deployment of reason to hold these states of nonconvergence in place, then TSET regains its footing & delivers a staggering blow to PSET.

November 30, 2014 | Registered CommenterDan Kahan

@ScottJohnson:

Yes, I think there's a lot of confusion in these sketches that needs to get weeded out; the question is whether when the weeding is done, there'll be anything but a bare patch of earth left...

But is it possible to fix the one you are identifying by recognizing that of course identity-protective cognition is just one species of motivated reasoning, which is itself just one of a large variety of cognitive dynamics that track information processing in patterns that unevenly match up with truth seeking?

In that way, identity-protective reasoning could figure in TSET and PSET w/o either having to rest on the claim that there won't be many other recurring occasions in which forms of motivated reasoning & cognate dynamics arise.

November 30, 2014 | Registered CommenterDan Kahan

One way to get ideas across ideological tribal lines is to recruit testimonials from individuals whose jobs or hobbies are redolent of one tribe. Much of the problem with getting people to believe in climate change is that people have statistically good reasons for rejecting the evidence of climate change they've experienced over their lifetimes as too mixed and too small a sample to put much credence in. Ask yourself: are summers hotter now than when you were young? Many people would rightly respond that they can't say, that estimating averages out of such noisy data is too hard for them to do.

On the other hand, some people have jobs where they have to pay close attention to things like how early lakes freeze over, whether the tree line has been changing, whether herds have been migrating farther north or south than in the past. For example, a veteran professional big game hunting guide in Alaska would be a credible source on climate change to many Red Staters. He has an extreme Red State job, and he has to pay careful attention to climate-related data points in the real world.

November 30, 2014 | Unregistered CommenterSteve Sailer

@SteveSailer:

How about this?

November 30, 2014 | Registered CommenterDan Kahan

"The dynamics that lock nondivergence in place, moreover, consist of a kind of super-energized bond of group hatred -- not a kind of robotic state of myopic copying."

You're right. Myopic copying would almost certainly end up with one group opinion. It's like throwing a bunch of magnets into a box and shaking it. They all wind up in one lump, moving together.

What I was thinking of was the multiple independent herds you get when the herds are not in communication. One herd forms on this side of the hill, and another herd forms on the other. Each sees only one herd to follow. Then later on, when one of them moves randomly around the hill, that's when they're puzzled at the other herd's apparently random motions. And if you have two isolated social networks that don't ever talk to one another, this could arise. But liberals and conservatives do talk to one another, so this won't work as it stands.

Where the herd-following model doesn't work is when you've got people who don't follow the herd mixed in. People who latch on to something else - evidence, ideology, corrupt self-interest, whatever - and won't move from it, whatever the other members of their tribe do or say. If you glue one magnet to the bottom of the box and throw the others in, they will cluster around that one's position. Glue two magnets to different parts of the box, and you'll get clusters around each. To get consistent polarisation along political lines, you have to get herd-followers that preferentially follow their own political herd, and independent minds in both parties.
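
(A sketch of that glued-magnet picture, in the same illustrative spirit -- every name and parameter here is a made-up assumption:)

```python
# Followers copy random co-partisans; a few "glued" leaders per party are
# pinned at opposite poles and never update. Followers cluster around them.
import random

def herd_model(n=100, n_glued=3, steps=20000, seed=7):
    random.seed(seed)
    parties = {}
    for party, pole in (("red", 1.0), ("blue", 0.0)):
        beliefs = [0.5] * n                    # followers start undecided
        beliefs[:n_glued] = [pole] * n_glued   # glued leaders at the pole
        parties[party] = beliefs
    for _ in range(steps):
        for beliefs in parties.values():
            i = random.randrange(n_glued, n)   # only followers update...
            j = random.randrange(n)            # ...by copying a co-partisan
            beliefs[i] += 0.5 * (beliefs[j] - beliefs[i])
    return {p: sum(b) / n for p, b in parties.items()}

print(herd_model())  # red mean drifts toward 1.0, blue toward 0.0
```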

Thus, to get persistent polarisation (in this model), you need people who do the opposite of what you said. People for whom "the advantage of holding personal beliefs in line with the best evidence on societal risks" tends to exceed "The material advantage any individual acquires by maintaining her standing within her cultural groups".

For the herd followers, the vast majority, their best estimate of societal risks *is* the herd. The herd *leaders* are the ones who *don't* follow the herd. So if you want to understand the polarisation, you have to understand the opinion leaders. You can't do it with mass surveys of the general population, most of whom don't think that way.

Well, it's one possible model.

December 1, 2014 | Unregistered CommenterNiV

"Ask yourself: are summers hotter now than when you were young? Many people would rightly respond that they can't say, that estimating averages out of such noisy data is too hard for them to do."

Oddly enough, I've found the opposite. A while back there was a tactic of highlighting general shifts in local climate as a sign of global warming, and because the weather does change on a decadal timescale, there are indeed such shifts for people to notice. The idea is that personal observation is a lot more convincing than abstract statistics, and in my own experience talking to people I've found a number of them cited it to me as personally convincing evidence.

But it's not a new phenomenon. I'll give you an example:

"A change in our climate however is taking place very sensibly. Both heats and colds are become much more moderate within the memory even of the middle-aged. Snows are less frequent and less deep. They do not often lie, below the mountains, more than one, two, or three days, and very rarely a week. They are remembered to have been formerly frequent, deep, and of long continuance. The elderly inform me the earth used to be covered with snow about three months in every year. The rivers, which then seldom failed to freeze over in the course of the winter, scarcely ever do so now. This change has produced an unfortunate fluctuation between heat and cold, in the spring of the year, which is very fatal to fruits. From the year 1741 to 1769, an interval of twenty-eight years, there was no instance of fruit killed by the frost in the neighbourhood of Monticello. An intense cold, produced by constant snows, kept the buds locked up till the sun could obtain, in the spring of the year, so fixed an ascendancy as to dissolve those snows, and protect the buds, during their development, from every danger of returning cold. The accumulated snows of the winter remaining to be dissolved all together in the spring, produced those over flowings of our rivers, so frequent then, and so rare now."

That was Thomas Jefferson, writing in the 1780s - 'Notes on the State of Virginia'. :-)

Unfortunately for the claim, the science is that such fluctuations do happen anyway, and go both ways. (One sceptic amused himself by regularly publishing local temperature records showing a long-term temperature decline in response to the stories. In about 1/3rd of places the temperature has fallen and in about 2/3rds it has risen, so such places are not hard to find.) You can't detect global climate change on a local scale - or at least, not yet. You have to look at the averages over at least continental scales and over many decades to detect a consistent movement in the same direction. And of course there are arguments about whether even this rises above the long-term natural background variation.

So yes, it's an effective method of persuasion, but it's against the science. It depends on what a scientist thinks "the right balance is between being effective and being honest".

December 1, 2014 | Unregistered CommenterNiV

(submitted a few days ago, not sure why it didn't post)
In case any of The 12 Billion are wondering, my kids are fine. Neighbors’ kids are all dead though. Sad.

And in case any of you are wondering, I try to be careful about the word irrational. It is entirely rational to worry about lots of things…even when we worry about some too much and some too little. Dangerous, certainly, but ‘rational’, in the sense that it makes common sense to try to survive using all the tools we have, including our affective/subjective/instinctive system for making sense of what FEELS risky and what doesn’t. The problem with the word ‘irrational’, to me, is the implication that there is a right and wrong, and the objective fact-based analysis way is the right/rational way, and irrational = dumb. I just got tangled in a snit with Nicholas Taleb about a GMO paper he wrote (I called it “Advocacy Masquerading as Rational Argument”: https://medium.com/@dropeik.com/on-taleb-et-al-s-the-precautionary-principle-with-application-to-the-genetic-modification-of-dba21ccf94aa), prompting him to ask for my definition of ‘rationality’, so I posted this, in case anyone is interested: “The Risk Response: Some Thoughts on The Limits to Reason,” https://medium.com/@dropeik.com/the-risk-response-some-thoughts-on-the-limits-to-reason-5938ed6c47d9

Anyway…on the central question. Forgive me for ducking the question as posed; PSET or TSET or, if neither, then what explains why cultural identities generate conflicting views on risk? Good question, but not what I’m wondering about.
What I’m wondering about is what research anyone can point me to that addresses the larger supposition I have about WHY our group/tribal/cultural affiliations matter so much. An obvious assumption for this non-scholar is that it must be adaptive, right? We evolved as social animals, heavily dependent on the group for our safety and survival. So given the power of the survival imperative, we have a fundamental need to find group affiliations wherever we can, and cognition must be pretty actively reading cues from the world in terms of what those cues portend in terms of those groups.
We affiliate with lots of groups, many shaped by Cultural Cognition parameters, but hardly all. Women are a group, so things get interpreted through that lens. (What WAS that astronomer thinking wearing that tacky shirt?) Race is a group identifier (huh, Ferguson!) Geography is certainly tribal, so as we say in Boston, “Yankees SUCK!” (Did you see the sign at the Harvard-Yale football game held up by a Harvard fan? “Yale Students Use Wikipedia!”) Environmentalists are a group, so things get interpreted through that lens (GMOs, vaccines). And on and on. Regardless of the group and the characteristics that identify membership, our perception generally is filtered through the lenses of cultural/group affiliation. It’s the lens through which we judge “What’s the other group up to” (TSET)…“What do the messages imply” (PSET)…and any other cues that have implications for whether we’ll be seen as a member in good standing of our group, and whether our group is cohesive and strong in combat with others.

So essentially, any stimulus is interpreted by “What does this mean for My Tribe, ergo, My Safety?” Can there be an overarching predictive Theory of Group Affiliation…why do we associate with which groups? That seems pretty complex and anyway is way beyond my pay grade. Can we have an overarching Theory that predicts that, as the group views go, so go ours? That’s what Cult Cog is, right? Only I just think there are more groups than The Big Four.
But I think it all leads to the same conclusion, and value, that Cult Cog and the science of science communication suggest; by understanding the group affiliations that are relevant to how people see things, we can frame policies and information in ways that are culturally/group congenial, and therefore minimize potential group identity conflicts poorly framed messages and policies might trigger, which would help ‘make effective use of scientific knowledge to promote our ends.’

December 2, 2014 | Unregistered CommenterDavid Ropeik
