Monday, Jan 23, 2017

Presentation jeopardy: here's the answer; what's the question?

It's obviously a problem if one's research strategy involves aimlessly collecting a shitload of data and then fitting a story to whatever one finds.

But for a presentation, it can be a fun change of pace to start with the data and then ask the audience what the research question was. I'll call this the "Research Presentation 'Jeopardy' Opening."

I tried this strategy at the Society for Personality and Social Psychology meeting panel I was on last Saturday. If I hadn't been on a 15-min clock -- if, say, the talk had been a longer one for a paper workshop or seminar -- I'd have actually called on members of the audience to offer and explain their guesses. Instead I went forward indicating what questions I, as the Alex Trebek of the proceedings, would count as "correct."

But there's no such constraint here on the CCP Blog. So consider these slides & then tell me what question you think the data are the answer to! For my answers/questions, check out the entire slide show.

Slide 1: [image]

Slide 2: [image]

Slide 3: [image]

Slide 4: [image]

Slide 5: [image]

Slide 6: [image]


Reader Comments (15)

How well does political affiliation predict one's stance on global warming? I.e., how good a predictor is one's political outlook of one's views on an issue like global warming?

January 23, 2017 | Unregistered Commentermcouthon

@mcouthon-- very close.

January 23, 2017 | Registered CommenterDan Kahan

It's obviously a problem if one's research strategy involves aimlessly collecting a shitload of data and then fitting a story to whatever one finds.

Really? I thought that was the whole point of the big data revolution. It used to be wasteful to do that, but that's -not- a problem any more. Now that's just being honest - more honest than going into the data with any particular research question in mind at all.

January 23, 2017 | Unregistered Commenterdypoon

In an arguably "polluted" science communication environment such as global warming, does science comprehension mitigate perceptions of risk associated with political ideology?

January 23, 2017 | Unregistered CommenterLynn Davey

==> Really? I thought that was the whole point of the big data revolution. It used to be wasteful to do that, but that's -not- a problem any more. Now that's just being honest - more honest than going into the data with any particular research question in mind at all. ==>

Why would either scenario be inherently superior?

I don't see why there would be anything inherently less than 100% "honest" about coming up with a theory and then collecting data to test it, nor do I see why there would be anything inherently less than 100% scientific about researching and collecting data on a topic and then constructing theories to explain the data.

It seems to me that either way, what matters most is that you design an analytical paradigm that controls for variables and tests for alternative explanations, moderator/mediator effects, representativeness of sampling, etc. Lacking such processes, neither approach stands. Along with such processes, either approach can net valid results.

January 23, 2017 | Unregistered CommenterJoshua

It seems to me that either way, what matters most is that you design an analytical paradigm that controls for variables and tests for alternative explanations, moderator/mediator effects, representativeness of sampling, etc. Lacking such processes, neither approach stands. Along with such processes, either approach can net valid results.

Joshua, you state my point better than I did. In my view, the entire point of the scientific method is to draw meaningful conclusions while avoiding confirmation bias. Coming up with a theory, then testing it, is inherently motivated inquiry, and that's one tool in the toolbox. "Aimlessly" collecting data is less inherently motivated and is another tool in the toolbox. I think neither method is superior to the other.

It's for that reason that I was objecting to Dan's characterization of "aimless" data collection as "obviously a problem". It's not, and I think you agree with me.

January 23, 2017 | Unregistered Commenterdypoon

@Lynn-- also very close.

Hint for you & @mcouthon--

There's a psychometric issue here.

January 23, 2017 | Registered CommenterDan Kahan

Ahhh...I see what you're getting at. As in, does this measure what it seems to be measuring...or something else? Here's my final answer (oops. wrong show).

"What psychological construct does a study of global warming beliefs by political ideology (that controls for sci comprehension) actually measure?"

Answer - identity-protective cognition

??

January 23, 2017 | Unregistered CommenterLynn Davey

@Lynn--- you got it! Or 95% of it. Climate change beliefs are indicators of some form of identity -- political identity is good enough for this purpose. They behave that way in the same sense that answers to questions about liberal ideology & party id do. So rather than political outlooks being predictors of climate change beliefs, the beliefs are measuring the same thing as whatever we consult to measure political outlooks.

the other 5% has to do w/ knowledge, which you alluded to before. Measures of critical thinking make climate beliefs even more reliable indicators of what sort of person one is -- just as they would tend to do that w/ anything else that is a valid indicator of identity.

Actually there were a couple of additional questions that were "right" for the answers in the data too. You'll see them if you look at the slide show.

January 23, 2017 | Registered CommenterDan Kahan
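
The psychometric point in the comment above -- that climate-change beliefs behave like party ID and ideology as noisy indicators of one latent identity, and that among respondents who score higher on critical-thinking measures the beliefs become *more* reliable indicators -- can be illustrated with a toy simulation. This is purely a sketch on synthetic data; the latent-variable setup and the noise levels are illustrative assumptions, not results from any CCP dataset:

```python
import random
import statistics

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

N = 5000
# One latent trait ("political identity") drives all observed items.
identity = [random.gauss(0, 1) for _ in range(N)]

def item(noise_sd):
    # An observed indicator = latent identity + measurement noise.
    return [z + random.gauss(0, noise_sd) for z in identity]

party_id   = item(1.0)
ideology   = item(1.0)
climate_lo = item(1.0)   # climate belief, noisier indicator (low comprehension)
climate_hi = item(0.4)   # climate belief, cleaner indicator (high comprehension)

# Climate belief correlates with party ID about as strongly as ideology does --
# it behaves like just another item on the same identity scale.
print(round(pearson(party_id, ideology), 2))
print(round(pearson(party_id, climate_lo), 2))
# With less measurement noise, belief is an even more reliable identity indicator.
print(round(pearson(party_id, climate_hi), 2))
```

The point of the sketch: nothing about the *content* of the belief changes, only its noise level; yet the lower-noise version tracks party ID more tightly, which is what "more reliable indicator of what sort of person one is" means here.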

Dypoon -

==> Coming up with a theory, then testing it, is inherently motivated inquiry, and that's one tool in the toolbox. ==>

Agreed. And I didn't mean to diminish what I perceived your larger point to be: That perhaps the risk is greater for biases - such as confirmation bias - when the theory precedes the data collection.

But then again, it is important (IMO) to stress that, "motivated inquiry" does not necessarily imply "motivated reasoning." So often such an assumption is operative and (IMO) counterproductive - especially when we see that assumption underpinning identity-aggressive tribalism. That happens often - a good example being when some "skeptics" reinforce their ideological identity by leveraging an assumption that motivated inquiry into the impact of ACO2 equals motivated reasoning among researchers (of course, only if those researchers conclude that ACO2 has a significant impact on the climate).

January 23, 2017 | Unregistered CommenterJoshua

Dan -

==> Measures of critical thinking make climate beliefs even more reliable indicators of what sort of person one is... ==>

This, it seems to me, is flawed in that it assumes that how one reasons in one context generalizes across contexts to describe "what sort of person one is."

I think that context tends to be an important moderator of the relationship between belief and reasoning.

January 23, 2017 | Unregistered CommenterJoshua

Dan -

It occurs to me that maybe you meant "what sort of person you are" in the sense of "who you are" as in "beliefs about climate change tells you more about who someone is than about what that person knows."

In which case I wouldn't object to the phrasing of "what sort of person you are."

January 23, 2017 | Unregistered CommenterJoshua

Dan - I had not thought of the science comprehension piece in quite that way before. That is, "Measures of critical thinking make climate beliefs even more reliable indicators of what sort of person one is-- just as they would tend to do that w/ anything else that is a valid indicator of identity." Wow. Yeah.

Your work is so fascinating and important. I refer to it with some frequency when I train policy advocates in how to communicate more effectively about social problems (ok. well mostly i teach them about how people reason about social problems and say, "now what the hell are you going to do?").

Will check out the slide show. Thank you for sharing your work!

January 23, 2017 | Unregistered CommenterLynn Davey

@Joshua-- I did mean "sort of" in that sense.

Your & @dypoon's defenses of "big data" are more complicated & aren't what I meant to be alluding to in the 1st sentence (a "shitload" of data is a much smaller unit than that).

But consider Gary King's perspective.

January 24, 2017 | Registered CommenterDan Kahan

Not being somewhere I could play a noisy YouTube video, I accessed Gary King's Big Data analysis here:

http://gking.harvard.edu/files/gking/files/evbase-shanghai.pdf

Scroll down to the filled-out slides.

One of special interest to me is this one on China's 50 Cent Party (trolls hired by Chinese propaganda authorities in an attempt to manipulate public opinion to the benefit of the Chinese Communist Party), which I believe has a lot of relevance to our own political situation:

Reverse Engineering China’s “50c Party”

• Prevailing view of scholars, activists, journalists, social media participants: the 50c party argues against those who criticize the government, its leaders, and their policies

Wrong!

• Fabricates 450M social media posts a year!
• Does not argue; does not engage on controversial issues
• Distracts; redirects public attention from criticism and central issues to cheerleading and positive discussions of valence issues

Our own social media contains quite a bit of jousting, but that jousting is aimed at issues that are, IMHO, frequently peripheral to what ought to be the main focus of policy matters, and diversionary from what ought to be the main policy thrust. Also significant, in my opinion, is that the jousting tends to highlight the extremes of opposing sides and makes it harder to find common ground and constructive compromises leading to effective solutions. Keeping the public in turmoil also serves to divert attention away from the concentration of wealth and power in the hands of a small oligarchy.

IMHO, our recent national election needs to be evaluated in these terms.

January 25, 2017 | Unregistered CommenterGaythia Weis
