Wednesday, April 2, 2014

MAPKIA! Episode 49: Where is Ludwick?! Or what *type* of person is worried about climate change but not about nuclear power or GM foods?

Time for another episode of Macau's favorite game show: "Make a prediction, know it all!," or "MAPKIA!"!

By now all 14 billion regular readers of this blog can recite the rules of "MAPKIA!" by heart, but here they are for new subscribers (welcome, btw!):

I, the host, will identify an empirical question -- or perhaps a set of related questions -- that can be answered with CCP data.  Then, you, the players, will make predictions and explain the basis for them.  The answer will be posted "tomorrow."  The first contestant who makes the right prediction will win a really cool CCP prize (like maybe this or possibly some other equally cool thing), so long as the prediction rests on a cogent theoretical foundation.  (Cogency will be judged, of course, by a panel of experts.) 

Okay—we have a real treat for everybody: a really really really fun and really really really hard "MAPKIA!" challenge (much harder than the last one)!

The idea for it came from the convergence of a few seemingly unrelated influences.

One was an exchange I had with some curious folks about the relationship between perceptions of the risks of climate change, nuclear power, & GM foods.

Actually, that exchange already generated one post, in which I presented evidence (for about the umpteenth time) that GM food risk perceptions are not politically or culturally polarized in the U.S., and indeed are not even part of the same “risk perception family” as climate and nuclear (that was the new part of that post).

Responding to this person’s (reasonable & common, although in fact incorrect) surmise that GM food risk perceptions cohere with climate and nuclear ones, I had replied that it would be more interesting to see if it were possible to “profile” individuals who are simultaneously (a) climate-change risk sensitive, (b) nuclear-risk skeptical, and (c) GM food risk skeptical.

Right away, Rachael Ludwick (aka @r3431) said, “That would be me.”

So I’m going to call this combination of risk perceptions the “Ludwick” profile.

Why should we be intrigued by a Ludwick?

Well, anyone who is simultaneously (a) and (b) is already unusual. That’s because climate change risks and nuclear ones do tend to cohere, and signify membership in one or another cultural group.

In addition, the co-occurrence of those positions with (c)—GM food risk skepticism—strikes me as indicating a fairly discerning and reflective orientation toward scientific evidence on risk.

Indeed, one doesn’t usually see discerning, reflective orientations that go against the grain, culturally speaking.

On the contrary, higher degrees of reflection—as featured in various critical reasoning measures—usually are associated with even greater cultural coherence in perceptions of politically contested risks and hence with even greater political polarization.

A Ludwick seems to be thoughtfully ordering a la carte in a world in which most people (including the most intelligent ones) are consistently making the same selection from the prix fixe menu.

That is the second thing that made me think this would be an interesting challenge.  I am interested in (obsessed with) trying to identify dispositional indicators that suggest a person is likely to be a reflective cultural nonconformist.

Unreflective nonconformists aren’t hard to find. Indeed, being nonconformist is associated with being bumbling and clueless.

As I’ve explained 43 times before, it’s rational for people to fit their perceptions of risk to their cultural commitments, since their stake in fitting in with their group tends to dominate their stake in forming “correct” perceptions of societal risk on matters like climate change, where one’s personal views have no material effect on anyone’s exposure to the risk in question.

Accordingly, failing to display this pattern of information processing could be a sign that one is socially inept or obtuse.  That’s one way to explain why people who are low in critical reasoning capacities tend to be the ones most likely to form group-nonconvergent beliefs on culturally contested risks (although even for them, the “nonconformity effect” isn’t large).

It would be more interesting, then, to find a set of characteristics that indicates a reflective disposition to form truth-convergent (or best-evidence-convergent) rather than group-convergent perceptions of such risks.  I haven’t found any yet. On the contrary, the most reflective people tend to conform more, as one would expect if indeed this form of information processing rationally advances their personal interests.

As I said, though, the Ludwick combination of risk perceptions strikes me as evincing reflection.  Because it is also non-conformist with respect to at least two of its elements (climate-risk concerned, nuclear-risk skeptical), being able to identify Ludwicks might lead to discovery of the elusive “reflective non-conformity profile”!

The last thing that influenced me to propose this challenge is another project I’ve been working on. It involves using latent risk dispositions to predict individual perceptions of risk.  The various statistical techniques one can use for such a purpose furnish useful tools for identifying the Ludwick profile.

So everybody, here’s the MAPKIA:

What “risk profiling” (i.e., latent disposition) model would enable someone to accurately categorize individuals drawn from the general population as holding or not holding the Ludwick combination of risk preferences?

Let me furnish a little guidance on what a “successful” entry in this contest would have to look like and the criteria one (that one being me, in particular) might use to assess the same.

To begin with, realize that a Ludwick is extremely rare.  

For purposes of illustration, here’s a scatter plot of the participants in an N = 2000 nationally representative survey arrayed with respect to their global warming and nuclear power risk perceptions, indicated by their responses to the “industrial strength risk perception measure” (ISRPM).

I’ve color coded the respondents with respect to their GM food risk perceptions, measured the same way: blue for “skeptical” (≤ 2), mud brown for “neutral” (3-5), and red for “sensitive” (≥ 6).

So where is @r3431, aka “Rachael Ludwick”?!

Presumably, she’s one of the blue observations within the dotted circle.

The circle marks the zone for “climate change risk sensitive” and “nuclear risk skeptical,” a space we’ll call the “Ropeik region.”

A “Ropeik,” who will be investigated in a future post, is a type who is very worried about climate change but regards the water used to cool nuclear reactor rods as a refreshing post-exercise drink.  The Ropeik region is very thinly populated--not necessarily on account of radiation sickness but rather on account of the positive correlation (r = 0.47, p < 0.01) between global warming concerns and nuclear power ones.

The correlation between worrying about global warming & worrying about GM foods is quite modest (r = 0.26, p < 0.01).

But there definitely is one.

Accordingly, someone who is GM food risk skeptical is even less likely to be in the Ropeik region (where people are very concerned about climate change) than somewhere else.

Those are the Ludwicks.  They exist, certainly, but they are uncommon.

Actually, if we define them as I have here in relation to the scores on the relevant ISRPMs, they make up about 3% of the population.
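To make those cutoffs concrete, here is a minimal sketch of how that definition could be applied to a survey data set (pandas assumed; the file and column names are hypothetical stand-ins, not the actual CCP data):

    import pandas as pd

    # Hypothetical survey file: one row per respondent, with 0-7 ISRPM scores.
    df = pd.read_csv("survey.csv")

    # The "Ludwick" profile as defined above: climate sensitive,
    # nuclear skeptical, GM food skeptical.
    is_ludwick = (
        (df["isrpm_climate"] >= 6)
        & (df["isrpm_nuclear"] <= 2)
        & (df["isrpm_gmfood"] <= 2)
    )

    print(f"Ludwicks: {is_ludwick.mean():.1%} of sample")  # ~3% on the data described here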

Maybe that is too narrow a specification of a Ludwick? 

For sure, I’ll accept broader specifications in evaluating "MAPKIA!" entries—but only from entrants who offer good accounts, connected to cogent theories of who these Ludwicks are, for changing the relevant parameters.

Of course, such entrants, to be eligible to win the great prize (either this or something like it) to be awarded to the winner of this "MAPKIA!" would also need to supply corresponding “profiling” models that “accurately categorize” Ludwicks.

What do I have in mind by that?

Well, I’ll show you an example.

I start with a “theory” about “who fears global warming, who doesn’t, and why.”  Based on the cultural theory of risk, that theory posits that people with egalitarian and communitarian outlooks will be more predisposed to credit evidence of climate change, and people—particularly white males—with hierarchical and individualistic outlooks more predisposed to dismiss it. 

Because these predispositions reflect the rational processing of information in relation to the stake such individuals have in protecting their status within their cultural groups, my theory also posits that the influence of these predispositions will increase as individuals become more “science comprehending”—that is, more capable of making sense of empirical evidence and thus of acquiring scientific knowledge generally.

A linear regression model specified to reflect that theory explains over 60% of the variance in scores on the global warming ISRPM.

I can then use the same variables—the same model—in a logistic regression to predict the probability that someone is a “climate change believer” (global warming ISRPM ≥ 6) and the probability that someone is a “climate change skeptic” (global warming ISRPM ≤ 2).

(Someone who read this essay before I posted it asked me a good question: what’s the difference between this classification strategy and the one reflected in the popular and very interesting “6 Americas” framework? The answer is that the “6 Americas” scheme doesn’t predict who is skeptical, concerned, etc. Rather, it simply classifies people on the basis of what they say they believe about climate change. A latent-disposition model, in contrast, classifies people based on some independent basis, like cultural identity, that makes it possible to predict which global warming "America" members of the general population live in without having to ask them.)

Classifying a respondent as having the indicated risk perception whenever his or her predicted probability exceeded 0.5, the model would enable me to determine whether someone drawn from the general population is a "skeptic" or a "believer" (your choice!) with a success rate of around 86% for “skeptics” and 80% for “believers.”
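In code, that classification step might look something like this sketch (statsmodels assumed, continuing the hypothetical data frame from the earlier sketch; the predictor names are illustrative stand-ins for the worldview, demographic, and science-comprehension measures described above, not the actual CCP model specification):

    import statsmodels.formula.api as smf

    # Outcome: "skeptic" status (global warming ISRPM <= 2).
    df["skeptic"] = (df["isrpm_climate"] <= 2).astype(int)

    # Worldview scores interact with science comprehension, reflecting the
    # theory that cultural predispositions strengthen as comprehension rises.
    fit = smf.logit(
        "skeptic ~ hierarch * individ * white_male"
        " + scicomp + scicomp:hierarch + scicomp:individ",
        data=df,
    ).fit()

    # Classify anyone with predicted probability > 0.5, then score accuracy.
    predicted = fit.predict(df) > 0.5
    accuracy = (predicted == df["skeptic"]).mean()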

How good is that?

Well, one way to answer that question is to see how much better I do with the model than I’d do if the only information I had was the population frequency of skeptics and believers.

“Skeptics” (ISRPM ≤ 2) make up 26% of my general population sample. Accordingly, if I were to just assume that people selected randomly from the population were not “skeptics” I’d be “predicting” correctly 74% of the time.

With the model, I’m up to 86%--which means I’m predicting correctly in about 46% of the cases in which I would have gotten the answer wrong by just assuming everyone was a nonskeptic.

“Believers” (global warming ISRPM ≥ 6) make up 35% of the sample.  Because I can improve my “prediction” proficiency relative to just assuming everyone is a nonbeliever from 65% to 80%, the model is getting the right answer in 42% of the cases in which I’d have gotten the wrong one if the only guide I had was the “believer” population frequency.

Those measures—46% and 42%—reflect the “adjusted count R2” measure of the “fit” of my classification model.
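For anyone who wants that arithmetic in one place, here is the computation as a small sketch (the function name is mine; the formula is just the one worked through above: the share of baseline errors the model eliminates, where the baseline always guesses the most frequent class):

    def adjusted_count_r2(accuracy, modal_share):
        """accuracy: the model's correct-classification rate;
        modal_share: frequency of the most common class, i.e., the no-model
        baseline of guessing everyone is a "nonskeptic"/"nonbeliever"."""
        return (accuracy - modal_share) / (1 - modal_share)

    print(adjusted_count_r2(0.86, 0.74))  # skeptics:  ~0.46
    print(adjusted_count_r2(0.80, 0.65))  # believers: ~0.43 (the post's 42% reflects unrounded counts)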

There are other interesting ways to assess the predictive performance of these models, too—and likely I’ll say more about that “tomorrow.”

But “how good” a predictive model is is a question that can be answered only with reference to the goals of the person who wants to use it.  If it improves her ability relative to “chance,” does it improve it enough, & in the way she cares about (reducing false positives vs. reducing false negatives), to make using it worth her while?

But for now, consider GM food risk perceptions.

As I’ve explained a billion times, one won’t do a very good job “profiling” someone who is GM food risk sensitive or GM food risk-skeptical by assimilating GM food risks to the “climate change risk family.”

If I use the same latent predisposition model for GM food risk perceptions that I just applied for global warming risk perceptions, I explain only 10% of the variance in the GM food ISRPM (as opposed to over 60% for global warming ISRPM).

When I try to predict GM food risk “skeptics” (ISRPM ≤ 2) and GM food risk “believers” (ISRPM ≥ 6), I end up with correct-classification rates of 79% and 71%, respectively.

That might sound good—but it isn’t.

In fact, that sort of “predictive proficiency” sucks. 

GM food “skeptics” make up 22% of the population—meaning that 78% of people are not skeptical.  My 79% predictive accuracy rate has an adjusted count R2 of 0.03, and is likely to be regarded as pitiful by anyone who wants to do anything, or at least anyone who wants to do something besides publish a paper with “statistically significant” regression coefficients (I've got a bunch in my GM food "skeptic" model--BFD!), on the basis of which he or she misleadingly claims to be able to “explain” or “predict” who is a GM food risk skeptic!

For GM food “believers,” my 71% predictive accuracy compares with a 70% population frequency (30% of the sample are “believers,” defined as ISRPM ≥ 6).  An adjusted count R2 of 0.02: Woo hoo!  (Note again that my model has a big pile of “statistically significant” predictors—the problem is that the variables are predicting variance based on combinations of characteristics that don’t exist among real people).

In sum, we need a different theory, and a different model, of who fears what & why to explain GM food risk perceptions.

I don’t have a particularly good theory at this point.

But I do have a pile of hunches.

They are ones I can test, too, with potential indicators that I’ve featured in previous posts.

In constructing their Ludwick models, "MAPKIA!" entrants might want to consult those posts, too.

I’ll say more about how I would use them to predict GM food risk perceptions “tomorrow,” when I post the first report on the MAPKIA entries.

So … on your marks … get set …

MAPKIA!

 



Reader Comments (21)

Well, this is definitely cheating, but I looked up Rachael's "about me" section on her website, and comparing her and my backgrounds as a current set of confirmed "Ludwicks," I think I have it: your "reflective nonconformists" are going to be left leaning females who work in the tech industry but have a deep personal interest in the public perception of science. And who also love food/cooking.

Boom.

(I may try to post a real comment later though... some thoughts swimming now..)

April 2, 2014 | Unregistered CommenterJen

This is probably too small a proportion of the population to matter, but I suspect for some communities of practice (e.g. academia), there is great value in being _mostly_ in line with your cultural community but engaging in _some_ deviance (or reflective non-conformance, if you like) to highlight one's intellectual independence. The most interesting academic commentators, in my opinion, are the ones for whom you can't _always_ predict what they are going to say. Their occasional deviance from orthodoxy signals something important about their credibility.

Race or ethnicity, obviously--the lived history of a people and their experiences might create far stronger incentives for solidarity than the CC worldviews. But now I'm just saying things you've already noted, so I'll stop.

April 2, 2014 | Unregistered CommenterRob Robinson


Ludwicks are not a rare species. Here in the UK they are quite common. For example, two of our most prominent climate campaigners, Mark Lynas and George Monbiot, are pro-nuclear and pro-GMO.

April 3, 2014 | Unregistered CommenterPaul Matthews

No model here, as of yet, but an important observation, IMO. And, for the record, my statements here in no way reflect any opinion regarding how @r3431's actual decision-making process works :). The suppositions you make re: "Ludwicks" depend strongly on the assumption that the Ludwick gives equivalent investigative effort to all the problems you focus on. That is, that they have investigated each of the climate change, nuclear, and GM issues sufficiently to make a rational decision on each. I believe it is quite possible, however, that some Ludwicks are actually not equivalently informed on all topics and, hence, make only partially rational decisions on the set of topics. In this way, for example, one might make an evidence-informed decision to conclude that nukes are low risk, yet simply follow the groupthink of high risk on climate due to a lack of knowledge in that area. Any model, therefore, should have some method of accounting for the respondent's investigative effort/knowledge of each topic.
A further important consideration is the potential lack of independence of the topics and the variable nature of that relationship relative to one's position on the graph. For example, one's high risk concern for the climate could predispose them to favor low-risk views of nukes and GM, as those have potential to mitigate climate change effects. A decision on (and predominance of importance for) one topic could lead to default opinions on the other topics.

April 3, 2014 | Unregistered CommenterPdiff

I certainly don't expect this to qualify as a formal entry in the current MAPKIA, but I might as well pose my questions/thoughts here to see what you (Dan) or others might predict.

First, as I mentioned elsewhere, I'm a Ludwick too. Which is to say that I am generally unconcerned about risk to human health from GMOs and nuclear technology, and rather concerned about climate change. I also base my feelings about those potential threats on as much science as I can, because I am a "science person" by nature, so I think if nothing else, that confirms that I'm one of those blue dots in the upper left of that nuclear/climate change correlation plot, and I'd like to imagine I'm not a bumbling non-conformist but one of the more reflective variety. (Do bumbling non-conformists know they are such? Separate and perhaps entirely too philosophical discussion for another time).

So I began by wondering, as Rachael and others did on twitter, what would someone like her and me have in common? I also know my husband falls in this minority. It has been the impetus for many interesting conversations when we chat about which cultural worldview we think we espouse, and then raise all sorts of questions about how we end up breaking the rules and falling out of line with our in-group on some things here or there, whichever way we slice it. I have also found from time to time, when talking about this stuff with other friends and colleagues, the occasional fellow who has a similar reaction- doing a quick mental inventory of all the "hot topics" and assessing where they fall on each issue and then scoring themselves internally against the four cultural worldviews... and then coming to the conclusion that because they don't fall in line with all one quadrant, the whole thing must be bunk. Of course in those situations it's more a challenge to explain that said person may just be an outlier- and the most interesting kind!

Ok, so back to the topic at hand. What do people like Rachael or me or my husband or a few other friends I know have in common?

Gender & Race

I'm not sure I can come up with any justification for why gender or race would, independently, play into this type of person's attitudes and perceptions, aside from the same way it wraps up in a complicated way with all the other attributes. There may be some correlation, but since I can't speak to any underlying driver for that yet, I'll leave this one alone.

Science Comprehension

I want to speculate that Ludwicks will have a higher than average level of science comprehension. But I'm not sure I can base that on any great logic either, so I will hesitate to make that part of a hypothesis. Am I just conflating the idea of being reflective with having good science understanding? My gut instinct is to assume that arriving at generally-scientifically-supported views, even when they go against one's in-group, would require that said person actually possess that above-average science literacy that helps them arrive at the scientifically-supported conclusion?

In that same vein, are there "anti-Ludwicks" in the bottom right corner who are skeptical of climate change but fear nuclear power (obviously yes, a few), and are any of those people "reflective nonconformists," or are they all "bumbling" their way into that worldview? Take for a moment the possibility that some reflective nonconformists do arrive at that position... do they score lower on the science comprehension scale? Can someone be highly reflective, but possess minimal levels of science understanding?

I'm not sure but I'd love to find out from the data.

Religiosity & Science Comprehension

I think it makes more sense to think about science comprehension in conjunction with religiosity, if for no other reason than you've already shared some interesting relationships with us on that intersection. If we know that high religiosity partnered with high sci comp reinforces/amplifies culturally expected perceptions, should we assume the Ludwicks out there will not score high on both because they certainly don't fit that pattern? If so, that means they need to either be high-religiosity/low-sci comp, or low-religiosity/high-sci comp. (I'm excluding low-religiosity/low-sci comp because I imagine these folks will be generally all over the map- some may fall into the Ludwick persona but maybe only as noise?) I'm inclined again to presume the science comprehension will be higher than average, which implies religiosity will be lower than average. I suppose the opposite could be true- someone with high religiosity and low sci comp- but is someone like that likely to end up embracing the kind of perceptions we're talking about here?

So my hunch about sci comp being a prerequisite for bucking one's own in-group views to adopt scientifically supported ideas that contradict them plays into this as well. I imagine sci comprehension must be higher, and therefore religiosity lower.

I don't think low-religiosity and high-sci comp covers it, though. I imagine there are plenty of people who are just that but see nuclear as a risk. So what else needs to be in the mix to make it likely someone will a) know enough science to align with scientifically supported conclusions, and b) be open minded/reflective enough to go against their cultural-in-group and draw said conclusions?

Reflection

I still don't even know what reflection means. Sure, we have the various tests. But to say someone is generally more reflective versus some other kind of willingness to withhold judgment or tendency to engage in metacognitive self-questioning... are these part of the same quality/characteristic, and if not, are they related or linked? When you say "On the contrary, higher degrees of reflection... usually are associated with even greater cultural coherence in perceptions of politically contested risks and hence with even greater political polarization" I'm imagining the usual system 2 definition.. deliberate and rational. But being deliberate and rational is just a narrow definition of being generally reflective. Maybe we need to reassess some of the terms/vocab- I think in order to be a reflective nonconformist, the "reflective" part of that may not just imply system 2 thinking- maybe that's necessary but not sufficient. I have a hunch (but again this is hardly testable with given data so feel free to ignore it for now) that the nonconformist piece comes from being reflective in a different sense- a willingness to question one's own thinking- metacognition... ?

Anyway, have you explored the relationship between religiosity and reflection? Or science comprehension and reflection? Would be interesting to see all three mapped together.

I also wonder if there isn't some room here to introduce other stuff... not necessarily related to reflection as you characterize it, but thinking about reflection always reminds me of the OCEAN (big 5) personality scales. I imagine the ability for someone to consider these kinds of culturally-disapproved views, weigh them in their mind, give them fair consideration despite whatever cognitive dissonance or unpleasant tension they create... would be associated with being pretty far along the "Open mindedness" scale. Similarly, I wonder if being lower on the "Agreeableness" scale (which implies less concern or need to be agreeable or.. etc) might make it more likely someone could be a Ludwick?

Interpretive communities

I can't say much about social deviancy (other than the fact that every person I know who qualifies as a Ludwick is definitely not concerned with social deviancy risks- but I can't really muster a theory for why), but based on how we're defining a Ludwick, it would seem the public risk concern should come out as a wash- the ability to consider public risks based on something other than cultural alignment should imply that Ludwicks might find some public risks a concern, and not others, (i.e. the ones that are scientifically supported as such, and not the ones which aren't?) which in turn I imagine would cause their scores on the public risk scale to cancel out (to an extent... it probably depends on how many of the "public risk" topics are ones aligned with current scientifically supported risk and others with little evidence of risk- if it were equal numbers of each, maybe the perfect Ludwicks out there would score pretty close to 0 on the public risk scale?)

The end

So, again, certainly not a formal prediction because I don't have the statistical chops to be able to recommend how I'd look for these relationships.. but it was a fun mental exercise to imagine how I'd expect them to intersect based on what I already know... I'm curious what others think/agree/disagree on...

I guess to boil it down, the TL;DR version is: here are my predictions for "dispositional indicators" of being a reflective nonconformist:

-low religiosity
-high science comprehension
-maybe higher reflection (whatever that means- not sure I'd trust the usual indicators here)
-more Open-minded? less Agreeable?
-more likely to engage in meta-cognitive processes?
-interpretive community and cultural worldview would probably not have a correlation..?

One other thing to consider- and which I fear we have little way of knowing- is how many people we'd qualify as "reflective nonconformists" are such because of some disposition we can predict with latent qualities, and how many are such because they've had the benefit of being exposed to various narrative frames (like a H-I who read the geoengineering article about climate change and now acknowledges climate change as a risk but still disregards GMOs or nuclear). Considering how external factors may influence one's inclusion in the Ropeik region or being identified as a Ludwick throws a wrench in the gears and I'm not sure how to address that wrench...

April 3, 2014 | Unregistered CommenterJen

@Jen: is it useful to consider where a Ludwick might be on the Social Deviancy x Public Safety risk grid?

April 3, 2014 | Unregistered Commenterdmk38

@Dan

I tried to think it through a bit in that paragraph near the bottom. I'd definitely like to know if ludwicks tend to fall in one region of that grid vs whether the kind of disposition that makes a ludwick a ludwick also implies they will fall across a spectrum (all along the public safety spectrum to be exact) or as I was speculating above, fairly near the middle of the grid (on the public safety axis anyway. Still not sure of social deviancy). Of course I am thinking about a ludwick as a more general type of "reflective nonconformist" and as such, one who might deviate from predicted attitudes aside from the three in question here.

But if we speak solely of a ludwick as someone with this particular combination of attitudes toward climate change, nuclear, and GMOs, it brings up your point from that previous post:

Again, very little if anything can be learned by using a latent-disposition measure to explain variance in the very attitudes that are the indicators of it.

Since climate change and nuclear are two of the very items used to determine ones placement in that grid, does it offer anything useful? Am I misunderstanding by concluding no?

April 3, 2014 | Unregistered CommenterJen

@Pdiff:

What you say suggests 2 points that deserve to be spelled out:

1. Definitely we need to distinguish being a "reflective nonconformist" from holding positions that are culturally nonconforming or eclectic. As I indicated, I think it's likely most of the cultural nonconformists will be what they are as a result (sadly) of a deficiency in that form of social competence that makes people attuned to cultural meanings. A Ludwick, though, is by definition someone who has formed her nonstandard package of risk perceptions not in an obtuse or thoughtless way but in a considered one. She might value "getting things right" more highly than most, but she will not have bumbled into these forms of opposition to or tension with others who share her basic cultural commitments.

2. In my view at least, being a "reflective nonconformist" in general & a Ludwick in particular can't depend on being "right" about the issues on which one is forming a nonconforming or eclectic set of risk perceptions. It just requires that one have formed one's views through conscious, open-minded engagement with the evidence reasonably at hand & free of the influence of identity-protective or like forms of motivated reasoning. Someone who held the mirror image of Ludwick's risk perceptions -- high nuclear, low global warming, & high GM food risk perceptions -- could be just as much a reflective nonconformist as a Ludwick, even though only one of them (at most) could be right. Of course, that mirror image of Ludwick wouldn't be a Ludwick -- he or she would be some other "type" (perhaps a Pdiff?)

April 3, 2014 | Unregistered Commenterdmk38

Mark Lynas has written a book about being climate-concerned and pro-nuclear. It's an entirely rational and logical position if you believe that carbon dioxide emissions are damaging and need to be drastically cut back. Similarly if you think that climate change will threaten food supply, as many do, it's entirely rational to be pro-GM. You can find plenty of articles by him on this.

On both of these issues he changed his mind, having initially gone along with the standard green culture of climate-terrified, anti-nuclear, anti-GM.

If you really want to understand these people, I suggest reading what they have written might be productive.

April 4, 2014 | Unregistered CommenterPaul Matthews

@Paul:

"Cultural anti-confoirmity" wi/ societies must be specified & measured in a manner that is sensitive to cross-cultural differences across them. The risk-perception packages will be nation specific, for sure. Consider, e.g., how different US & UK are on risks posed by badgers.

As you know, we've done a bit of work adapting the CCP worldview scales to UK.

We found that UK & US had similar "group-grid" profiles for global warming.

Nuclear seemed to be less divisive in UK than US. Interestingly, UK didn't seem particularly divided on GM foods either.

Does this sound plausible to you? It's tricky to know whether one has generated cross-culturally valid scales -- so if you told me, "that sounds wrong!," I'd find that significant. Of course, it's also tricky to know whether the view one gets of one's own society is accurate, given the inevitable selection biases involved in our personal observations!

On that score, while I find figures like Monbiot interesting (he in particular is a fascinating & brilliant fellow, whose reflections I very much enjoy considering!), I disagree that taking note of them helps to figure out who fears what & why. These are individuals, for one thing. For another, they are definitely not representative of ordinary people, who are too busy to sit around writing essays & books about these things!

Compiling lists of such people, moreover, doesn't show there are "plenty" of them or give you any reliable insight into their relative frequency. There are probably over 10 million Ludwicks in the US -- that would make for a great list. But they are < 3% of the population.

It's precisely to avoid the distortions that inevitably pervade casual observation that we use the disciplined forms of measurement that are the hallmark of empirical study!

BTW, would you say that Monbiot's view of geoengineering evinces a reflective, culturally non-conformist attitude? ... As it happens, too, I can tell you a little bit about what empirical investigation discloses about cross-cultural cognition, global warming, & geoengineering.

April 4, 2014 | Unregistered Commenterdmk38

@Jen:

(A) Do you think a Ludwick is a type or is a nonconforming member of a type? If the latter, which one? Which of her views is nonconforming? Or put differently, is your theory one about a group with outlooks that generate shared risk predispositions or about an individual characteristic that generates resistance to such predispositions?

(B) Do you think there are "types" that are disposed to GM food risk sensitivity & ones that are disposed to GM food risk skepticism (given that it seems pretty clear the "types" that feature in conflicts over global warming & nuclear aren't strongly divided on GM foods)? If so, what are they? And what is the relationship, if any, of a Ludwick to those?

April 4, 2014 | Registered CommenterDan Kahan

@Dan

Do you think a Ludwick is a type or is a nonconforming member of a type?

This is tough. As you can probably tell, over the course of writing that other comment and thinking more about it, I think I myself wasn't sure which way makes more sense. I might point the question at you too- in posing the question as you did, which did you imagine? There is probably some value in analyzing those blue dots both ways... but I think, if forced to pick, it makes more sense to me to consider them a nonconforming member of a type.

If the latter, which one? Which of her views is nonconforming?

This is why it's so tough for me to feel comfortable hypothesizing on this... I mean, the fast answer is: maybe she is an egalitarian communitarian who conforms on the climate change issue but resists on nuclear, or maybe she is a hierarchical individualist who resists on the climate change issue and conforms on nuclear power. Is there a way to know? I'd want to know more about how those blue dots perceive a lot of the other risk topics.

Maybe, if you mapped all those blue dots on the public safety/social deviancy grid and they all landed high on the public risk scale, then I'd conclude that Ludwicks are egalitarian communitarians who just are nonconformist on select public risk topics. If they all mapped low on the public risk scale, they'd likely be the opposite- H-I's who are non-conforming on the climate change matter but conformists otherwise.

BUT, I'm still not convinced we could draw good conclusions even if we saw something super crystal clear like that. It wouldn't answer the first question of whether Ludwicks are nonconforming members of X type or whether they represent a distinct group with "outlooks that generate shared risk predispositions."

Or put differently, is your theory one about a group with outlooks that generate shared risk predispositions or about an individual characteristic that generates resistance to such predispositions?

I think my theory is about the latter. But I admit I don't offer as much evidence for that specific interpretation as I'd like. What I observe myself doing in ruminating about this is generating all kinds of ideas for qualities, characteristics, etc., that would explain the phenomenon of the latter: what things could cause a person who otherwise belongs to a "group," and who, for the most part, shares risk perceptions with that group, to deviate on certain topics because some innate quality makes it less uncomfortable for them to question/reflect/whatever and occasionally break from the herd.

Then of course it gets me wondering if that's really the case: is said person still a member of that group? Is a non-conforming egalitarian communitarian still an egalitarian communitarian? How much non-conformism can one demonstrate before no longer qualifying as a member of that group?

I get that we can define an E-C with the industrial grade measure, but if someone falls into the E-C camp according to that instrument and then deviates on one or two topics, that's easier to model than a theoretical E-C who deviates on a lot of topics- which would make said person less likely to "score" as an E-C on that instrument, no? This is the thinking behind why I have a hard time drawing a really solid line between non-conforming member of a type vs type in its own right. But maybe that is evidence of my amateur status here- maybe you can shed some light on ideas that make it easier to delineate a boundary between the two concepts?

Do you think there are "types" that are disposed to GM food risk sensitivity & ones that are disposed to GM food risk skepticism (given that it seems pretty clear the "types" that feature in conflicts over global warming & nuclear aren't strongly divided on GM foods)? If so, what are they? And what is the relationship, if any, of a Ludwick to those?

Don't know. Would be interesting to see if that "type" would fall out of some data-crunching. But I'm not sure where to look. I don't know if anything logically falls out of any of my rambling ideas. I kind of doubt it, to be honest. But maybe we could find a small signal if we look at the handful of things I called out as suspects for Ludwicks- religiosity, science comprehension, etc. Maybe the "type" of person who is more sensitive to GMO risk is someone who is low science comprehension, high religiosity?

Since GMO risk sensitivity isn't a polarized issue (in the H-I vs E-C sense), we'd expect to see the "type" of person who worries about GMO risks within both ends of that spectrum. Maybe the GMO sensitive type is a subsection or subtype? Or perhaps multiple subtypes? E-C's who conform on other public risk topics would be expected to do so on GMOs when framed as a public safety issue. So high-religiosity, low science comprehension E-Cs seem like the prime suspects for one group of GMO sensitive types.

But, we know there are some H-Is in the mix and since there isn't much polarization, I have to ask, who are the subsection of H-I's that are sensitive to GMO risks? I would think for a H-I person to be sensitive to GMOs as a matter of public safety/risk, that would qualify as being a non-conforming H-I... (is that a Ludwick? or anti-Ludwick, like electron/positron?) and according to my other theories, a non-conforming H-I might be higher science comp and lower religiosity... BUT your studies have shown higher science comp causes H-I's to be less sensitive to GMO risks too.

So my other thought would be, maybe the subtype of H-Is who are sensitive to GMOs feel that way based on thinking about GMOs less from a community-safety angle and more from some other angle that is threatening to their hierarchical individualist values (what that is I'm not sure).

Maybe a GMO-fearful type is characterized by low-science comprehension and high-public safety concern, regardless of H-I or E-C (or maybe those characteristics combined with low-religiosity if H-I and high-religiosity if E-C)... Then a Ludwick would be their high-science comprehension counterpart/mirror?

I'm just bumbling around at this point.

One other thought, not really in specific reply to your points but just while I'm thinking about it: maybe Ludwicks are just egalitarian individualists? Heh.

April 4, 2014 | Unregistered CommenterJen

Another possibility is of a person who is not influenced by HI/EC issues at all, but simply finds out what the scientific consensus is on any given issue and believes that, just as a matter of principle. The scientific consensus is widely reported on both issues - that nuclear power and genetic engineering are safe and that climate change is a risk. Just take the experts' word for it. If so, then you might expect that pride in their own 'scientific rationality' was a big component in their self-identity, that overrode all the political elements of their self-identity.

That people could simply take the scientific position on everything, no matter whether it went against their politics or community, is widely seen as the ideal, giving the person both a positive self-image and public image. So it's not a big surprise to find people who try to follow that ideal. There are of course several versions of it: those who see scientific rationality as trusting the experts, and those who take the more Feynmanesque attitude of distrusting experts and finding things out for themselves - but again as a matter of personal pride insisting that evidence overrule their own preferences. Indeed, they may be more aware of the science of personal cognitive biases, and make a point of not falling into that trap.

It's much harder to use this model to try to explain the opposite corner, though. It doesn't really fit either science or politics. Possibly they are people motivated by suspicion of authority, and take the opposite position on everything to the government/establishment line?

Another option is that people may take the consensus position on most subjects, but take a contrary position on some particular topic that they have considerable personal exposure to or interest in. People join single-issue campaign groups for social reasons - their family or their friends are members - and as a result pick up a one-sided understanding of that issue, but don't go any further. I know a few people who were into animal rights and vegetarianism, and as a result picked up on the GM hysteria, but have no knowledge of or opinion on nuclear power or climate change. Other people may well have been involved socially with nuclear protesters, but not have any particular views on climate change. Single-issue campaigns can be very narrow.

As an extension to this idea, it could be that people can belong simultaneously to several different social networks, and can pick up conflicting opinions from the different groups. We may have a different social environment at work, or at the sports club, or at home, or on-line. The other members at the gym may have a different set of opinions to those at the golf club. Not all of the social groups we join are entirely by choice, and people subconsciously try to fit in. Some people will exist at the boundaries between networks.

Or it could just be random spread. The opinion one picks up may be influenced by our social ties or background politics, but they only change the odds - they're not determinative. So some fraction of people will turn up at the other end of the scale by chance. It depends on random events like a chance meeting, or picking up a newspaper to read on the train one day, or turning the TV on and channel-flipping because you're bored. A lot of the scientific facts people 'know' are because of some offhand reference in a book or show.

Or it may be all of the above, or none. I don't think there's enough information to tell.

April 5, 2014 | Unregistered CommenterNiV

@NiV:

Another possibility is of a person who is not influenced by HI/EC issues at all, but simply finds out what the scientific consensus is on any given issue and believes that, just as a matter of principle. The scientific consensus is widely reported on both issues - that nuclear power and genetic engineering are safe and that climate change is a risk. Just take the experts' word for it.

Actually, doesn't that describe the cultural styles that disagree with one another on climate, nuclear, guns etc?

...random spread.....there's [not] enough information to tell.

Yes -- & these are the same things, no? And is it not a presupposition of trying to attain knowledge that "random" and "not enough information" can always be dispelled through disciplined observation & inference?

April 5, 2014 | Registered CommenterDan Kahan

Selfishly, I'd like to think that there was something special about the "women in tech" category that made them especially thoughtful human beings, fully capable of responding intelligently to the available scientific information. Realistically, I think that whatever advantage may exist is a result of societal discriminatory patterns that leave women in tech a bit outside the common herd. Thus, any difference in views could be a result of that unique position, a bit out of the dust kicked up by the herd, and not being shoved along the path by the cohort.

What we want to foster, I think, is how to have thoughtful conversations about risk. Encouraging people not to allow themselves to be simply swept along by whatever herd surrounds them. I've had (online) a number of such conversations with the real Rachael Ludwick, and others. What makes Ludwicks interesting is that they are thoughtfully occupying a middle ground. In that middle ground, it is not that these people are "believers" or "disbelievers", but rather that they are able to separate the science from possible scientific applications and to attempt to evaluate the risks involved in each case. And then to look at the larger picture, and balance relative risks to society in proceeding in one direction or another, or how to go about cherry picking among the best of multiple approaches.

Then, I think it is necessary to differentiate among those who occupy the extremes. And I think that these come from two entirely different categories. Some people, because they are busy, and/or because they have trusted sources that they rely on, are willing to take, at face value, opinions coming from their favorite sources without further individual assessment. They may categorize these opinions in simplistic mental file folders, like GMO = bad. Others, I think, are disingenuous, and are using key memes as hooks, as methods to push a larger agenda. For society, in operating as a democracy, it matters who those "trusted sources" are that exert influence on key numbers of citizens and potential voters. And we need to figure out ways not only to make the best available science the criterion for trusted-source selection, but to expand the size and complexity of the mental filing systems commonly used by members of the public.

As a local example, on the west coast there is much concern, spilling over to what I would describe as hysteria, regarding Fukushima. Specifically, here on the Salish Sea (greater Puget Sound), environmental issues with not fully understood causes (for example, sea star wasting, coho salmon with frequently fatal heart enlargements) are frequently pointed to as examples of "it must be Fukushima". To address those concerns, our local marine resources commission held a public forum, where the invited guest speaker was Dr. Kim Martini, Joint Institute for the Study of the Atmosphere and Ocean, University of Washington, and blogger at Deep Sea News. To gather an audience of not already scientifically informed people, the announcement used did not give away what the speaker's conclusions might be: http://origin.library.constantcontact.com/download/get/file/1113757207484-97/FukushimaTalk.pdf. In my opinion, this is effective for including among those who show up the very people whom we would most like to inform, those who think that there may be a serious local risk involved with Fukushima. However, IMHO, this approach needs more thought as to what to do about those who merely heard about or read the announcement. Those non-attenders may only have their opinions re-enforced. ("Did you hear about the Fukushima presentation? Scary stuff, I'm really worried about that; unfortunately, I don't have time to go.")

For those that did attend, Kim Martini gave an excellent presentation, one that carefully balanced the real risks and long term consequences for persons very near the Fukushima site with the facts as to why this is not a worry here in Bellingham. People who showed up are, by the simple act of showing up and listening with an open mind, predisposed to potentially move themselves into the "Ludwick" category. This demonstrates that with appropriate outreach efforts, some societal opinions can be changed.

In the case of nuclear energy, there is plenty of room for scientifically informed discussions well beyond the bounds of "How would you like to have had Tokyo Electric Power Company (TEPCO) or a similar corporation build a nuclear reactor in your town?" and banishment of nuclear energy altogether. These would thoughtfully include balancing very different potential long term consequences: nuclear accidents such as Fukushima, long term nuclear waste disposal, and fossil fuel consumption driven climate change. As part of such conversations, we would need to evaluate total energy balance lifecycle considerations for different energy production methods, as frequently, the use of fossil fuels is a hidden component.

I think that a post elsewhere by Matt Nisbet, which I know that Dan Kahan has seen, can also inform this discussion. This has to do with a Gallup poll showing a linkage between economic stress and unwillingness to consider longer term environmental issues. In trying to develop our own "Ludwick" thought processes, I think it important to recognize within ourselves the times in which those might be derailed. Standard crowd psychology, the building of fear, can stampede a herd.

It is off topic to the issues presented here, but still relevant I think in terms of diminishing and minimizing the activation of non-Ludwick thought processes, to note how another local crisis is being handled. We have a local measles outbreak. This outbreak is primarily located north of the border, in British Columbia. But both that province and Washington State are handling this in a "just the facts" manner that does not involve jumping up and down and shouting "antivaxxer." In fact, the word "antivaxxer" seems to be studiously unmentioned. One of the side effects of this is that there is essentially no coverage of the rationale of the (religious) group involved in this. This avoids the creation of another Westboro Congregation or spokesperson along the lines of Jenny McCarthy. There seems to be no "2 sides" media coverage. What is being emphasized is that vaccination is standard, and effective:
From the WA health department: "A person who was confirmed with measles traveled to several western Washington public locations while contagious. Most people in our state are immune to measles, so the public risk is low except for people who are unvaccinated. People who haven’t been vaccinated or aren’t sure if they’re immune should ask a health care professional for advice. " http://www.doh.wa.gov/Newsroom/2014NewsReleases/14047MeaslesMultiCounty.aspx
A local newspaper: http://www.bellinghamherald.com/2013/06/27/3071351/whatcom-county-health-department.html
"Public health officials are urging people who may have been exposed and don't know whether they're immune to check with their doctor. Most people have been vaccinated against the disease."
A Canadian newspaper: http://www.vancouversun.com/health/Measles+outbreak+mostly+contained+Fraser+Valley+visitor+contracted+disease/9682631/story.html
"Mu said the containment of measles to the religious community and the increased rate of vaccination is shielding the community at large."
"Many people from the religious group have chosen to be vaccinated at clinics set up to increase vaccination rates, Mu said."

Ok, my own personal interpretation of how to model Ludwickian behavior is to never actually get around to directly answering the MAPKIA question as posed.

April 5, 2014 | Unregistered CommenterGaythia Weis

As tangential as it may be, I love the reference to Feynmanism. I would consider that the highest compliment. I love everything about how Feynman approached the world. (Except the part of him that probably assumed an information deficit model for modern science communication).

Maybe we add to the model- women in tech who consider themselves in the Feynman camp.

:)

April 5, 2014 | Unregistered CommenterJen

Dan,

"Actually, doesn't that describe the cultural styles that disagree with one another on climate, nuclear, guns etc?"

Not precisely. I think most people on the wrong side of the 'consensus' know it perfectly well - their view is generally that the valid science is on their side, and the consensus has got it wrong. Consensus is just opinion, not truth.

As I've said numerous times, the surveys show that around 80-85% of climate scientists believe the warming is mostly anthropogenic, which is a solid majority. (Quite a lot of sceptics think so, too.) I don't disagree that my view is in the minority, and that what are generally considered the top experts in the field are against me. But science isn't a democracy, and scientific truth is not determined by a vote. I still regard the sceptical position to be scientifically stronger.

"Yes -- & these are the same things, no?"

:-)
In a deep philosophical sense, possibly!

I'm making the distinction between a simple model in which the outcome is a deterministic consequence of just a handful of personality-type variables (we just don't happen to know yet what they are), and the situation where it is the outcome of billions of unknowable influences arising from the Brownian motion of everyday life.

Jen,

"I love everything about how Feynman approached the world. (Except the part of him that probably assumed an information deficit model for modern science communication)."

I think his view was that it wasn't a matter of information deficit - people are given plenty of information about science - but that people weren't given the right mental tools to be able to assess the information for themselves, and so they acquired a shallow and fragile understanding that wasn't much use.

For example, consider his story about when he went to Brazil ( http://v.cx/2010/04/feynman-brazil-education ) or the famous one on assessing school books ( http://www.textbookleague.org/103feyn.htm ). He's not saying you've got to give *more* information, he's saying you've got to give a different sort of information.

Also, I suspect that what the information deficit people are complaining about he wouldn't see as a problem. You're not *supposed* to have everyone agreeing in science. Science relies on dissent and scepticism and checking things out for yourself for it to work. Distrust of experts was what it was all about!

His view was also that if people are interested in science, then great! But if they're not, then that's fine too. I don't think he would have seen it as a problem for there to be many people who didn't understand science. As he put it: "Tell your son to stop trying to fill your head with science - for to fill your heart with love is enough."

April 5, 2014 | Unregistered CommenterNiV

(Small thing: really weird to read my surname over and over ... maybe we should call the group "Brands"? Surely Stewart Brand is used to seeing his surname a lot.) :)

I find Robinson's point about some "unexpected" beliefs being a marker of greater believability to be very appealing. Surely one would want to be believed, right?

But I have no hypotheses. I could guess using some stories about what cultural values led me to where I am, but are they true? Or would I just be picking convenient (and self-flattering) explanations?

April 5, 2014 | Unregistered CommenterRachael Ludwick

@RachaelLudwick:

Well, forget your cultural values, then!

*Why* do you have the views that you do?

Why aren't you more concerned about nuclear power? What would you say to someone who agreed with you about climate but not nuclear & who said: "Fukushima furnishes a nice example of the problems here: the technology is obviously highly dangerous; & even assuming 'acceptably safe' is possible in theory, it's not particularly realistic to assume that we'll see it in practice, given the incentives that big corporate firms have to put profits ahead of human wellbeing and the inevitable capture of govt regulators by the industries they are supposed to be regulating (cf. BP Oil spill!)."

Maybe this person would continue, too: "Isn't it best, for the same reasons, to worry about GM foods? And in any case, shouldn't we adopt a 'precautionary' stance? There is uncertainty here; in the face of that, we should resolve uncertainty in favor of safety. That's how you & I feel about climate, right?"

April 6, 2014 | Unregistered Commenterdmk38

First may I say how proud I am to be a region on a Kahan chart, and to share this zone with the Ludwicks of the world. Power to Independent Thinkers!!!!

As to the Chronicles of MAPKIA, here is just some of what would have to inform a model that could predict what kinds of folks would be most likely to see things like a Ludwick/Ropeik:

1. The FOFTFY (Find Out First and Think For Yourself) test
Since the brain prefers to laze away in System One ease and not think too hard unless it has to, an identifying characteristic of Ludwicks/Ropeiks would be a higher-than-normal willingness to invest the mental energy of thinking things through carefully and rigorously, rather than simply following the Cultural Cognition pack and defaulting to the tribal view as already prescribed by tribal thought leaders and/or friends. The Cognitive Reflection Test, which measures a willingness to put more mental effort into problem solving, might be one way to measure this propensity, but CRT alone seems too simplistic to capture this quality. Perhaps some sort of survey could be designed to gauge how much effort (time, other resources) people put into getting information, how diverse a range of information sources they use, or how quickly or slowly they make up their minds about things.
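Purely by way of illustration (this is not CCP data or a validated instrument), here is a toy Python sketch of what a composite FOFTFY score could look like. The three classic CRT answers (5 cents, 5 minutes, 47 days) are real, but the field names, the source-diversity measure, and the 50/50 weighting are all invented:

```python
# Toy composite FOFTFY score: CRT accuracy blended with a made-up
# "source diversity" measure. Keys, weights, and the diversity cap
# are illustrative assumptions only.
CRT_ANSWERS = {"bat_ball": 5, "widgets": 5, "lily_pads": 47}

def foftfy_score(crt_responses, sources_consulted, max_sources=10):
    """Blend CRT accuracy with breadth of information sources (0-1 scale)."""
    crt = sum(crt_responses.get(k) == v for k, v in CRT_ANSWERS.items()) / len(CRT_ANSWERS)
    diversity = min(len(set(sources_consulted)), max_sources) / max_sources
    return 0.5 * crt + 0.5 * diversity  # equal weights: pure assumption

# Example: 2 of 3 CRT items right, 3 distinct sources -> about 0.48
print(foftfy_score({"bat_ball": 5, "widgets": 100, "lily_pads": 47},
                   ["newspaper", "journal", "blog"]))
```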

2. The HSDYF (How Safe Do You Feel?) test
As I have suggested, Cultural Cognition seems to make sense in adaptive terms. Agreeing with the tribal view not only takes less mental effort but affirms us as a member of the tribe in good standing, which keeps us safe. And social cohesion strengthens the tribe in its combat with other tribes for setting society's operating rules, and when our tribe is dominant (Go Red Sox!) that's good for our feeling of safety too. So a Ludwick/Ropeik would have to feel generally safe and secure enough that risking being kicked out of the tribe wouldn't feel TOO threatening.
There are probably ways that personality tests like The Big Five Personality Test (various versions are online, here’s one http://personality-testing.info/tests/BIG5.php) could be modified to identify such characteristics.


3. The ITSI (Independent Thinker Self-Identity) test
A Ludwick/Ropeik would have to think of her/himself as an intellectually independent critical thinker, and this would have to matter enough to them that, to maintain this self-identity, they would be willing to accept the hurt and threat of being rejected by friends and colleagues for holding views contrary to theirs.
A simple set of questions could be devised to give a person an ITSI rating.
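For illustration only, a minimal sketch of such a scorer; the item wording and the 1-7 Likert scale are invented, not drawn from any existing survey:

```python
# Toy ITSI scorer: mean agreement (1-7 Likert) with invented statements.
# None of this is a validated instrument; it just shows the arithmetic.
ITSI_ITEMS = (
    "I make up my own mind even when my friends all disagree.",
    "Being an independent thinker is central to who I am.",
    "I would voice an unpopular view within my own community.",
)

def itsi_rating(likert_responses):
    """Average of 1-7 agreement ratings, one per ITSI item."""
    assert len(likert_responses) == len(ITSI_ITEMS)
    return sum(likert_responses) / len(likert_responses)

print(itsi_rating([6, 5, 7]))  # -> 6.0
```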

4. An RPFI (Risk Perception Factors Influence) rating.
Cultural Cognition is not the sole determinant of how we feel about risks. Psychometric Paradigm research on the psychology of risk perception has identified a wide range of affective characteristics - risk perception factors - that influence how worrisome a risk feels. Is the risk natural or human-made (this informs GMO opposition)? Is it imposed or do we engage in it voluntarily (this informs the demand for GMO labeling)? Do we trust the government agencies that are supposed to protect us from the risk (fear of nuclear power is associated with mistrust of government regulators)? Can it happen to ME (most people don't think climate change threatens them personally)? How do the harms compare to the benefits (GMO opponents see the environmental harms... supporters see the environmental and human health benefits)? Even the most Ludwickian/Ropeikian independent thinker subconsciously applies these factors to their judgments about what feels scary and what doesn't (in addition to the influence of Cultural Cognition, particularly on issues which for various reasons have been polarized).
So the Ludwick/Ropeik model would have to assess how much a person is susceptible to these psychological characteristics. Imagine a study of Risk X. It could identify the relevant RPFs and ask respondents about how much those characteristics influence their feelings about the risk. (Regression analysis could tease out how relevant each RPF was to the respondent’s overall perceptions.) That could establish an overall RPF rating, a measure of how much or how little the respondent’s views of risk are shaped by these emotional factors. A higher or lower RPF score would predict how likely it was that a person would be able to free themselves from these hidden emotional filters and truly think independently, and ‘objectively’, about the issue at hand.
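A minimal sketch of that regression idea, with fabricated data throughout; the factor names, the coefficients, and the use of R-squared (share of variance in overall worry explained by the factors) as the RPFI rating are all assumptions made for the example:

```python
# Sketch of the "Risk X" regression: regress each respondent's overall
# worry on their ratings of hypothetical risk-perception factors.
# All data below are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Columns: naturalness, voluntariness, trust in regulators (1-7 ratings)
factors = rng.uniform(1, 7, size=(n, 3))
worry = factors @ np.array([0.6, -0.3, -0.5]) + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), factors])        # add an intercept
beta, *_ = np.linalg.lstsq(X, worry, rcond=None)  # ordinary least squares
resid = worry - X @ beta
rpfi = 1 - resid.var() / worry.var()              # R^2 as the RPFI rating

print("factor weights:", beta[1:].round(2), "| RPFI:", round(rpfi, 2))
```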

5. The HAB (Heuristics and Biases) test.
Loss Aversion. Optimism Bias. Anchoring and Adjustment. Availability. Representativeness. Probability Neglect. The judgments and views of even the most Ludwickian/Ropeikian independent thinker are subconsciously influenced by all of these cognitive confounders.
Lord only knows how the hell to test this stuff and factor it into a predictive model.

So to sum up, my MAPKIA contest entry is the FOFTFY-HSDYF-ITSI-RPFI-HAB model. In other words, given the wide range of variables that research has identified as factors that shape how we see the world, perhaps the most honest answer to the search for a simplified MAPKIA model is to accept the advice found in the wonderful little poem by Dorothy Parker:
When I was young and bold and strong,
The right was right, the wrong was wrong.
With plume on high and flag unfurled,
I rode away to right the world.

But now I’m old - and good and bad,
Are woven in a crazy plaid.
I sit and say the world is so,
And wise is he who lets it go.

April 9, 2014 | Unregistered CommenterDavid Ropeik
