Sunday, March 3, 2013

How common is it to notice & worry about the influence of cultural cognition on what one knows? If one is worried, what should one do?

Via Irina Tcherednichenko, I encountered a joltingly self-reflective post by Lena Levin on G+:

Just yesterday, I successfully stopped myself from telling a person that their expressed belief has not a shred of evidence to support it (just in case, it wasn't a religious belief, that was something that could be demonstrated scientifically, but hasn't been). I stopped myself (pat on the head goes here) because, for one thing, I knew it would lead nowhere; and for another, I have my share of beliefs with a similar status of not being supported by scientific evidence (but not disproved by it either).

Just like anyone else beyond the age of five or ten, I have a worldview, my own particular blend of education, research, life experiences, internalized beliefs, etc. And by now, this worldview isn't easy to shake, let alone change. It doesn't mean that I disregard new scientific evidence, but it does mean that whenever I hear of new findings that seem to be in explicit contradiction with my worldview, I make a point of finding the source and reading it in some detail (going to a university library if need be). In 99 cases out of 100 (at least), it turns out that I don't have to change my worldview after all: sometimes the apparent contradiction results from BBC-style popularization with a healthy dose of exaggeration or downright mistakes on a slow news day; sometimes the original research arrives at an almost statistically insignificant result based on far too small a sample, prettified to make it publishable; or something else, or both.

But the dangerous thing is, if a reported finding does agree with my worldview, I usually don't go to such lengths to check the original source and the quality of research (with few exceptions). There is, of course, a certain degree of confirmation bias at work here, but my time on this earth is limited and I cannot spend it all in checking and re-checking what is already part of my worldview. What I do try to avoid in such cases is the very tempting assumption that now, finally, this particular belief is knowledge based on scientific evidence (unless I really checked it at least with the same rigor as described above). I am afraid I am not always successful in this... are you?

I thought others might enjoy reflecting/self-reflecting on this sort of self-reflection too.

Here are my questions (feel free to add & answer others):  

1.  What fraction of people are likely to be this self-reflective about how they know what they know?

2.  Would things be better if in fact it were more common for people to reflect on the relationship between who they are & what they know, on how this might lead them to error, and on how it might create conflict between people of different outlooks? If so, how might such reflection be promoted (say, through education, or forms of civic engagement)?

3.  Okay: what is the answer to the question that Levin is posing (I understand her to be asking not merely whether others who use her strategy think they are successful with it but also whether that strategy is likely to be effective in general & whether there are others that might work better)? What should a person who knows about this do to adjust to the likely tendency to engage in biased search (& assimilation) consistent w/ worldview?


Reader Comments (15)

To help nip this problem 'in the bud', so to speak, I have found it useful to consciously keep my beliefs to an absolute minimum. In this regard, I have to thank the philosopher Immanuel Kant and his book, Critique of Pure Reason. It is actually possible to navigate reality without having lots of fundamental beliefs about it.

After that, simply remaining aware of what assumptions you make in order to reach a conclusion helps a lot. A conclusion is only as good as the assumptions it rests upon, after all. If you're fond of robust conclusions, there are real benefits to using this rule of thumb: you can remain flexible in your world-view as new data come to light, but at the same time enjoy a measure of confidence that you're relying on only so much as your situation warrants.

March 3, 2013 | Unregistered CommenterEric Baumholder

@Eric:

I'm curious to hear more about what it means to keep beliefs to a minimum. I could imagine a version of this that I'm sure is a caricature: someone who stubbornly remains agnostic about all manner of empirical claims ("do antibiotics effectively fight bacterial infections?" "does smoking cause cancer?" "do cellphone towers cause brain tumors?" etc.) & who as a result ends up frozen in place or eaten by a lion or whathaveyou.

Why isn't Rev. Bayes a more helpful guide here than Herr Prof. Dr. Kant? Believe whatever position is supported by your current assessment of the best available evidence, but readily update based on new information that has a likelihood ratio different from 1? If something of consequence turns on acting on my belief, I will credit my belief, recognizing that it is nothing more than a provisional assessment of this sort but that it is also the best I can do for the moment.

I think that way. But then Levin comes along & makes me see that an influence extraneous to the correctness of my belief -- viz., my worldview -- might be biasing me away from exposure to information that challenges my existing belief & biasing my assessment of the likelihood ratio -- the weight -- to assign any new information that I happen to be exposed to.

Is there some sort of meta-Bayesian response? Discount everything I believe by some cultural cognition bias factor?
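A minimal sketch of what such a "meta-Bayesian" discount might look like in odds form. Everything here is purely illustrative -- the function name, the numbers, and the shrinkage rule are my own assumptions for the example, not anything Levin or the cultural cognition research specifies:

```python
# Purely illustrative sketch (not from the post or the cultural cognition
# literature): odds-form Bayesian updating with a crude "bias discount"
# applied when the evidence happens to flatter one's worldview.

def update_odds(prior_odds, likelihood_ratio, congruence_discount=1.0):
    """Return posterior odds = prior odds x (adjusted) likelihood ratio.

    congruence_discount in [0, 1] shrinks the likelihood ratio toward 1
    (i.e., toward "no information") for worldview-congruent evidence;
    1.0 means no discounting, 0.0 means ignore the evidence entirely.
    """
    adjusted_lr = likelihood_ratio ** congruence_discount  # geometric shrinkage toward 1
    return prior_odds * adjusted_lr

# Evidence nominally 4:1 in favor of what I already believe (3:1 prior),
# taken at half weight because it agrees with my worldview:
print(update_odds(prior_odds=3.0, likelihood_ratio=4.0, congruence_discount=0.5))
# -> 6.0 (instead of 12.0 with no discount)
```

Of course, whether any fixed discount factor could be calibrated -- or whether the same bias infects one's choice of the factor itself -- is, I take it, part of what Levin is asking.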

Or maybe there is a strategy one who appreciates Kant's insights follows that I should know about. I'm sure I'm wrong to read you to be saying that Kant proposes we suspend belief in circumstances in which "what to do" depends on having one!

March 3, 2013 | Unregistered Commenterdmk38

I've been thinking about this a lot lately. In fact, I could have written a post very similar to Levin's that echoes my time spent over the last several months immersed in the cultural cognition research and working on my thesis project in an attempt to put that understanding to use to help improve communication strategies. One of the unexpected (but in hindsight completely reasonable) effects of learning more about cultural cognition has been that a) I myself have tended to think more deeply about my own tendencies, and to think a little more reflectively about my own communication and beliefs, as well as about others and why they develop the anxieties and beliefs they do based on worldview (I joke with friends that this year has curbed any misanthropic tendencies I used to have), and b) I have also started to notice, when I chat about this work casually with others, common replies of similar reflection: people wonder "where do I fall on that group/grid diagram?" and "wow, I've been thinking about things like ___ for so long... no wonder when we try to explain something this way, those people react that way," and so on.

So, you ask:
"1. What fraction of people are likely to be this self-reflective about how they know what they know?"

Good question! Probably not many. But maybe more than we'd expect? I'm finding that a surprising number of people I talk to begin to do so, prompted only by casual conversation and some bit of education about CC. And maybe the people for whom this reflection would be most powerful (scientists/communicators with an interest in communicating?) are already predisposed to be more reflective as individuals? (100% conjecture). Something definitely worth exploring.

"2. Would things be better if in fact it were more common for people to reflect on the relationship between who they are & what know, on how this might lead them to error, and on how it might create conflict between people of different outlooks? If so, how might such reflection be promoted (say, through education, or forms of civic engagement)?"

In a nutshell, yes! I was a teacher for several years before returning to graduate school to study information/interaction design, and the one aspect of my past life in education that has remained most relevant throughout has been the importance of encouraging reflection. Whether through formal education or the design of information or experiences, finding ways to prompt reflection is the most challenging and yet most effective strategy to promote change (whether in attitudes or behavior or just understanding). I've spent time both as a teacher and as a user-experience designer in software asking the same questions: how can we promote reflection and help people construct an understanding of ____ that allows them to actually apply said understanding? If you're not familiar, read a little about "cognitive apprenticeship" (wikipedia should do), as a constructivist approach to education.

In my own project, which, granted, spans only a single school year and so is limited in scope, I'm beginning to ask this question specifically for the case of science communicators and scientists themselves: would teaching them explicitly about cultural cognition help them better design their own communication to reach people "across the aisle" (or perhaps the better term would be "across the group/grid axes")?

There are some good studies and papers I've found from folks working on this - how to make some actionable guidelines for communicators based on our ever-improving understanding of risk perception - but my gut (and a tiny bit of fairly qualitative research) is telling me it may be an even better idea to teach these folks about cultural cognition itself, with prompts to reflect on one's own worldview, better enabling them to consider the contrast between their own worldview and others' and learn to frame information differently to reach them all.

I'll talk a little more about this in a forthcoming email reply, but this post was rather timely as I'm beginning to plan the structure of a workshop that I've been asked to present - and I'm thinking a lot about how a cognitive apprenticeship model might look when we take it out of a traditional classroom and apply it to something like the field of cultural cognition-informed (I've been saying 'values-based' or a 'rhetorical approach' to other folks) science communication. I'm hoping to find out if teaching people a little about cultural cognition - along with some communication design strategies that make it easier to talk to people outside your own worldview without alienating/threatening their values - will be useful for the group of science students and researchers that want to attend this workshop.

March 3, 2013 | Unregistered CommenterJen

Everybody has their own blindspots, and while being aware of them helps, and developing critical thinking techniques to test and challenge them is useful, it isn't generally sufficient. Despite knowing they exist, we still can't see them directly.

The answer is to observe that different people have different blindspots, so by talking to people with different worldviews, who disagree with our opinions and are strongly motivated to find fault with them, we can use their eyes as a mirror to see into our own blindspots, as we help them to see into theirs.

That is why science is sceptical, why everything in it is subject to challenge, why constructive criticism is seen as helpful, why scientists constantly go back and re-examine even the most solid foundations. We publish not for adulation and advancement, but precisely in order to seek out challengers, who will try to replicate or refute our conclusions, and so detect our errors. 'Peer review' is not what the journal does, but what the community of readers and responders and commentators do.

Dogma is a scientific sin. And scientific heresy - if you can back it up with solid (or at least plausible) arguments - is its highest virtue.

Thus, we seek out those who disagree with us, and we converse with them. We argue with them, and try to win them over. But that's not the real aim, which is to test and strengthen our *own* beliefs, by detecting and discarding our errors, and by developing other explanations and analogies that deepen our understanding. Opposition is a valuable scientific resource to be encouraged and exploited.

The failure of motivated reasoning is to see opposition not as a resource to improve ourselves but as a threat to our existing perfection. Our priors, being too interconnected with who we are, are too certain, and effective challenges lead instead to cognitive dissonance. We become defensive, and attempt to *prevent* our ideas being subjected to criticism. We say instead: "Why should I make the data available to you, when your aim is to try and find something wrong with it?"

A one-sided dogma is a dead belief, and it soon starts to rot...

JS Mill again:

"He who knows only his own side of the case, knows little of that. His reasons may be good, and no one may have been able to refute them. But if he is equally unable to refute the reasons on the opposite side; if he does not so much as know what they are, he has no ground for preferring either opinion. The rational position for him would be suspension of judgment, and unless he contents himself with that, he is either led by authority, or adopts, like the generality of the world, the side to which he feels most inclination.

Nor is it enough that he should hear the arguments of adversaries from his own teachers, presented as they state them, and accompanied by what they offer as refutations. That is not the way to do justice to the arguments, or bring them into real contact with his own mind. He must be able to hear them from persons who actually believe them; who defend them in earnest, and do their very utmost for them. He must know them in their most plausible and persuasive form; he must feel the whole force of the difficulty which the true view of the subject has to encounter and dispose of; else he will never really possess himself of the portion of truth which meets and removes that difficulty.

Ninety-nine in a hundred of what are called educated men are in this condition; even of those who can argue fluently for their opinions. Their conclusion may be true, but it might be false for anything they know: they have never thrown themselves into the mental position of those who think differently from them, and considered what such persons may have to say; and consequently they do not, in any proper sense of the word, know the doctrine which they themselves profess. They do not know those parts of it which explain and justify the remainder; the considerations which show that a fact which seemingly conflicts with another is reconcilable with it, or that, of two apparently strong reasons, one and not the other ought to be preferred. All that part of the truth which turns the scale, and decides the judgment of a completely informed mind, they are strangers to; nor is it ever really known, but to those who have attended equally and impartially to both sides, and endeavoured to see the reasons of both in the strongest light.

So essential is this discipline to a real understanding of moral and human subjects, that if opponents of all important truths do not exist, it is indispensable to imagine them, and supply them with the strongest arguments which the most skilful devil's advocate can conjure up."

March 3, 2013 | Unregistered CommenterNiV

@NiV: In the spirit of your answer & mindful of the difficulties raised by the Levin anxiety, one thing that can help is to try to find people who you believe share your worldview but who disagree w/ the view that prevails within it & ask them why they believe what they do. But I also don't think that is sufficient; there's something very distressing in particular about thinking that one has been shut out of insight that might reside in a place one doesn't venture.

@Jen: At some point, I must blog on the Boston Museum of Science's wonderful "provocative questions" exhibit -- which tried to find out whether making people reflect on how factual beliefs & values cohere made people more reflective & open-minded!

March 3, 2013 | Unregistered Commenterdmk38

What fraction of people are likely to be this self-reflective about how they know what they know?

I think relatively small - particularly in proportion to how strongly people feel about issues.

Would things be better if in fact it were more common for people to reflect on the relationship between who they are & what they know, on how this might lead them to error, and on how it might create conflict between people of different outlooks?

IMO, of course. I don't see why this is even a question.

If so, how might such reflection be promoted (say, through education, or forms of civic engagement)?

Sorry - rant time once again...

I think that explicit education on the phenomenon of motivated reasoning is pretty much a prerequisite, and then, based on such a foundation, carefully designed processes of civic engagement provide crucial context and feedback for the learning that takes place as someone tries to apply the theory.

Based on my experiences in education, I have a theory that as with other aspects of developmental psychology, there is a "critical period" of development where such explicit instruction has the potential to be most effective. I think that period might be around the time when students move from a more egocentric and subjective educational paradigm that is common in high school to an ostensibly more "objective" or "academic" paradigm employed (in an ideal world but less so in reality) as students (on average - it obviously will be different for different students) move into undergraduate school. This timeline fits with basic components of developmental psychology.

I will recommend a book that I think is useful for understanding how to help educate students along those lines:

http://www.amazon.com/Clueless-Academe-Schooling-Obscures-Life/dp/0300105142

To summarize what I'm going for, students should learn the skills of critical analysis through experimenting in authentic critical dialog with themselves. Our schools teach students to adopt the arguments of others more than they teach students to construct their own arguments. This builds an academic alienation from critical analysis. (It doesn't only happen in schools - these characteristics are also embedded in American society more generally as seen in a basic anti-intellectualism, or in a more hierarchical view of intellectual development as seen in some other cultures).

This is related to NiV's conceptualization, but it is a modification: Entering discussions with others only helps if the developmental groundwork for learning how to engage in critical analysis has been laid; those developmental skills are not developed in isolation - dialog with those of differing views is a useful and important component - but it is how you use that dialog that is key more so than the fact of the dialog. The developmental process needs external input, but it is primarily an internal development. I would turn around NiV's necessary but not sufficient directionality.

Unfortunately, our educational model largely functions on an antithetical paradigm - at least, typically, until students reach graduate school, and often even there and beyond, as intellectual pursuits are confined by the necessity of establishing a professional or academic identity.

3. Okay: what is the answer to the question that Levin is posing (I understand her to be asking not merely whether others who use her strategy think they are successful with it but also whether that strategy is likely to be effective in general & whether there are others that might work better)? What should a person who knows about this do to adjust to the likely tendency to engage in biased search (& assimilation) consistent w/ worldview?

As a general strategy, I think that what Levin employs is a very useful one. I see it as being related to some other strategies: (1) as a precondition to thinking you've proven a thesis, check to see whether or not you have fully explored all available counterarguments; (2) present those counterarguments to the scrutiny of a "naysayer," and confirm that you have represented them accurately and comprehensively. That "naysayer" may be someone outside yourself - but ideally, if you have the skills and commitment to engage in critical analysis, it doesn't have to be (as long as you have done due diligence in your research); (3) focus on rebutting those counterarguments - and, very importantly, make sure that those rebuttals are fully consistent with your original arguments.

I will refer to a pattern that I have seen play out with students as they attempt to write an analytical essay: They start out with an argument that they think is true (based on bias rooted in their "motivating" legacy of belief and identification structures) and then try to structure their supporting argument. As they do so, they conduct their research to learn about varying perspectives and they filter and sort the evidence to write a conclusion that is consistent with the thesis they started with. What they fail to do is take in all the information they discover, and then go back to restructure their thesis. The reality is that a starting thesis is very rarely valid - and the main problem develops as a resistance to re-formulating the thesis begins to set in. This happens for many reasons; for example, it might be that the student isn't sufficiently intellectually engaged with the task, and all she is really trying to do is complete an assignment. Or it might be because he is wedded to his original thesis for "motivated" (e.g., personal identification with a thesis) reasons. Or it may be because the evidence leads her to ambiguity - and she is locked into a binary restriction that either one of two perspectives must be true.

One thing that I am struck with, when looking at the characteristics of these debates, is the ubiquity of a scorched-earth approach, or a zero-sum game mindset. I think it is important to reverse engineer from those phenomena, and to understand not only what compels people toward such an approach (motivated reasoning), but also the mechanisms in play that get them there. Those mechanisms reflect "habits of mind." Habits are learned behaviors.

March 4, 2013 | Unregistered CommenterJoshua

BTW - Dan,

I thought this clip relates in an interesting way to motivated reasoning:

http://www.youtube.com/watch?feature=player_embedded&v=QPKKQnijnsM

What forces are in play that lead to so much misconception about reality?

March 4, 2013 | Unregistered CommenterJoshua

Another tangential connection for me. I hope you don't mind me cluttering up the comments with these kinds of rambling associations:

W/r/t previous discussions here on the relevance of "selling" climate change perspectives and the connections between selling and persuasion- an interesting interview:

https://soundcloud.com/whyy-public-media/to-sell-is-human

In particular, I am struck by the conversation about the persuasiveness of "mimicking," which I consider to be a parallel to being able to articulate the "naysayer" argument as I spoke of above. I am also struck by the related discussion of the importance of "listening." The discussion of the counter-productivity of deconstruction (a "habit of mind" particularly comfortable for scientists) as a persuasive technique is also pertinent, IMO. And of course, I think that the discussion of "getting to yes" and what is known about effective conflict resolution is very important to mitigating the biases of motivated reasoning.

Finally, the discussion of the inverse relationship between "power" and the ability to understand the perspective of others (and, consequently, the ability to persuade) is, IMO, very pertinent to debates such as what we see w/r/t climate change. "Authority" connotes power. Whether or not that perception of "power" in the climate debate is accurate - it is undeniable that "skeptics" view themselves as those w/o "power" (when I'm not being generous, I refer to that as a need to see themselves as being "victims").

This means that it is incumbent on science communicators to address that perception of power, explicitly. Directly. If science communicators spend time arguing about which perception of power is accurate, they will not address the reality of what "skeptics" perceive. "Skeptics" will never be convinced that they aren't w/o power.

And in that sense - even if it isn't true that the science communicators are in a position of power - they will suffer from the same deficiencies in perspective-taking that a boss might have relative to employees with "low status" (listen to the interview for context - at around 40 minutes in, the discussion of research on "perspective-taking"). A good science communicator needs to get into the head of the target audience. He/she needs to see the subject from the perspective of the audience member. He/she can't make assumptions about what the audience does or doesn't understand. And he/she must understand where the audience is coming from in their self-perception of being w/o power. This comes up in the interview in the discussion of the effectiveness of explicitly working to lower one's own sense of power as a way to become more persuasive.

March 4, 2013 | Unregistered CommenterJoshua

@Joshua: not clutter! More after I absorb!

March 4, 2013 | Unregistered Commenterdmk38

"one thing that can help is to try to find people who you believe share your worldview but who disagree w/ the view that prevails within it & ask them why they believe what they do."

Yes, possibly. Although they will tend to share my blindspots, too.

"This is related to NiV's conceptualization, but it is a modification: Entering discussions with others only helps if the developmental groundwork for learning how to engage in critical analysis has been laid"

Yes, agreed. It's often not learnt very well.

"One thing that I am struck with, when looking at the characteristics of these debates, is the ubiquity of a scorched earth approach, or a zero sum gain mindset. I think it is important to reverse engineer from those phenomena, and to understand not only what compels people towards such an approach (motivated reasoning)"

Motivated reasoning is one of the mechanisms - but is it the only one?

"Whether or not that perception of "power" in the climate debate is accurate - it is undeniable that "skeptics" view themselves as those w/o "power""

Yes, that's probably the case so far as the perception goes. Although it's not clear to me that the "realists" think they have much more, either.

However, I don't see that lowering the perception of your own power would make one more persuasive. The issue is that powerlessness causes frustration, annoyance, and a confrontational/adversarial attitude. You know that no matter what you say, they're going to go ahead with their plans anyway, so there's no motivation to compromise or negotiate or be reasonable.

Neutralising that doesn't mean reducing or hiding one's own power (even assuming they wouldn't see through that), but instead requires *sharing* or *granting* power. Make it clear that you're listening, and that if they can't be persuaded you really will stop what you're doing. Make it clear that if they ask for more information you'll provide it, and that if they come up with a really good suggestion you'll follow it. And give evidence of that, by giving examples where you *did* stop as a result of protests, and explaining how they can achieve it too. But at the same time, tell them that you have your own interests to look out for, that you need to be able to provide a strong case to your own side if you're going to stop, and that you want to jointly find a solution that works for everyone.

Now they have a chance to win something, and therefore may be more prepared to surrender on something else in order to get it. The problem is the feeling of powerlessness, not the inequality. That might have been what you meant, but it wasn't clear.

March 5, 2013 | Unregistered CommenterNiV

Dan, I would say that what NiV and Joshua have demonstrated is that the listening part did not occur in certain civic transactions such as climate change.

Another thing I would say is that getting into the head of the target audience is the OODA loop.

I disagree with Joshua's statement "This means that it is incumbent on science communicators to address that perception of power, explicitly." I think this is a misconception. This action would not be necessary if the listening paradigm had been employed. This is basic stuff from S&F on risk management communication, if done correctly. I would conclude that Joshua is offering evidence that the ask-and-listen should have occurred earlier in such discussions as climate change and gun control.

An example of this I found, of all places(!?!), in the NRA's magazine. In the article, the author makes a polemic that Obama, on gun control, insists that a conversation be started, but that what is really meant is that Obama is telling gun owners to shut up and excluding them from the conversation. It makes for a good read. It is a showcase of how motivated reasoning on someone's part (the NRA) does NOT mean that someone else (Obama) can turn the tables using motivated reasoning without setting themselves up to be used as motivation by the likes of the NRA. The article appeared in the month when the NRA asks you to vote.

Disclaimer: I have given money to both the NRA and the ACLU, and even to Democrats and Republicans when it suits my reasoning. I once sent the Republican National Committee one penny taped to their postage-paid solicitation for funds.

March 5, 2013 | Unregistered CommenterJohn F Pittman

Motivated reasoning is one of the mechanisms - but is it the only one?

I'm looking at motivated reasoning not as a mechanism in itself, but as a description of an underlying and inherent attribute of how we reason. The mechanism that I'm thinking about is the process of filtering information selectively rather than reconstructing the starting thesis. I distinguish that process from the tendency to want to confirm social or cultural identifications through positioning on controversial issues. The mechanism (the "how" of drawing conclusions that realize our "motivations") is a manifestation of motivated reasoning. Those distinctions may be too arbitrary/subjective/vague; it helps me, however, to work with that conceptualization.

However, I don't see that lowering the perception of your own power would make one more persuasive.

Try listening to the interview. The argument isn't for a direct relationship between lowering perception of power and becoming more persuasive. Being able to see other people's perspective is the necessary mediator in the cause-effect relationship. The concept is that people of "lower status" are much better at understanding the perception of others. The guy in the interview says that there is a research literature that backs that argument.

I guess I'd wonder a bit about the claim. I'm not sure why, for example, someone of "low status" would be better at understanding the perspective of someone of "high status" than vice versa. However, it does kind of jibe with my experience. I tend to believe that with power comes a kind of arrogance as well as a kind of indifference to the perspective of others. With power, understanding the perspective of others becomes less of a needed survival skill. That isn't to say, however, that there aren't outliers - those with power who are particularly skilled at understanding the perspectives of others. And perhaps for those outliers, their exceptional skills in that regard were a contributing factor in their ability to acquire power. It would be interesting to read the related literature.

This is a very different notion than yours of sharing or granting power. Those descriptors seem to reinforce the notion that power is in the possession of one person who can then decide to share or grant it. Seems to me that such a notion is antithetical to the effect as described in the interview - an effect that would only come about if people are forced (through a perception of loss of superior power) to view others as equally powerful and thus be forced to understand the perspective of others as a kind of survival necessity.

It is interesting to note that in models of stakeholder dialog, a key component is a structure of leveled hierarchy. An "expert" should be, by definition, just another stakeholder. The external structure "forces" a level playing field.

Although it's not clear to me that the "realists" think they have much more, either.

Self perception of holding less power (being a "victim") seems to me to be endemic to the tribal nature of these debates. Both sides see themselves as "victims" of media bias, victims of the rich and powerful, etc.

March 5, 2013 | Unregistered CommenterJoshua

Joshua states "Self perception of holding less power (being a "victim") seems to me to be endemic to the tribal nature of these debates. Both sides see themselves as "victims" of media bias, victims of the rich and powerful, etc." I have to agree with the victimization, but is it only personal?

An example is public goods such as the environment or rights. These are often claimed to be victimized by the opposing forces.

I wonder if they are the same phenomenon. Joshua, what do you think?

March 7, 2013 | Unregistered CommenterJohn F Pittman

JFP -

I disagree with Joshua's statement "This means that it is incumbent on science communicators to address that perception of power, explicitly." I think this is a misconception. This action would not be necessary if the listening paradigm had been employed. This is basic stuff from S&F on risk management communication, if done correctly. I would conclude that Joshua is offering evidence that the ask-and-listen should have occurred earlier in such discussions as climate change and gun control.

Could be. It's a bit of a chicken-and-egg situation, IMO. A precondition for speaking in a way that people will listen is that the audience be intent on listening. If they enter the room with a self-perception of being a victim, with a perception of the speaker as an "other" who holds power, then it won't matter how you talk. IMO (speaking theoretically), you need a communicative paradigm that forces equality. Of course, people with an inherent distrust can manage to view themselves as victims in virtually any communicative paradigm.

Maybe a good example is the blogosphere. I just read a Willis rant about a "realist" blogger cutting out references to "warmists" as being "censorship." IMO, so typical of the climate wars, that's false victimization writ large - no one's free speech is infringed upon by that action. It isn't "censorship" to be denied a privilege, IMO. (This is particularly funny in this case as I have been moderated into non-existence at WUWT because I challenged Anthony on a false charge that he made against me. I'm not being "censored," and it would be silly for me to claim such).

So why is the sense of "victimization" so abundant in the blogosphere - even though the claim is ridiculous? IMO, because of the perception of an imbalance in power. If my post is trash-binned, I can feel justified in my sense of victimization. How many times have you seen, in the blogosphere, commenters complain about being victimized when their comments are snipped? It is almost inevitable.

The only way to prevent that is if you create an external structure where no one's comments are trash-binned - but then, of course, you have a set of follow-on problems.

I wonder if they are the same phenomena. Joshua what do you think?

Sure. Moral integrity is seen as victimized by both sides, respectively. The intentions of the founding fathers are seen as victimized by both sides, respectively. The world's poor are seen as being victimized by both sides, respectively.

March 8, 2013 | Unregistered CommenterJoshua

I have to agree about the fake victimization in blog wars. But then I find the most damnable aspect of rants is that they are generally boring, besides being low on signal. It is rare to find a good humorous one. Most are dull and humourless.

As to the chicken and egg: considering Dan's post about the UCLA lecture, I think that the idea of avoiding lose/lose situations by strategic employment of good prep work is a good one. I don't think it is so much a matter of chicken and egg as of shutting the barn door after the horse has left. Read the article and comments, and respond there if you wish.

And it is good that we agree on some thoughts.

March 10, 2013 | Unregistered CommenterJohn F Pittman
