When arguments backfire: Climate change communication and persuasion

by Stephanie Chasteen on June 1, 2011

I wrote before about how skeptics of Obama’s birthplace aren’t convinced by a flimsy slip of paper documenting his citizenship: hitting a belief head-on with data generally doesn’t work. Instead, arguments are more persuasive when they fit with someone’s existing mental models of how the world works. Confronting a “birther” with Obama’s birth certificate just prompts new arguments against the validity of the birth certificate. But discussing instead how the U.S. needs to stand behind its leader in a time of war may appeal to their mental model of a cohesive nation under siege, and give them a chance to save face.

So, I’ve been thinking a lot about persuasion lately, in part because I’m working on climate change communication and coming up against the arguments of climate change deniers, and in part because I think that in education reform we have a lot to learn about persuasion. We tend to hit people’s ideas about education head-on with data, but all these other fields show that it doesn’t work. From mnn.com:

“It would seem,” Mother Jones’ Chris Mooney explains, “that expecting people to be convinced by the facts flies in the face of, you know, the facts.”

A recent blog post on mnn.com, “In the persuasion game, beware the backfire effect,” argues that not only does this approach not work, it can actually reinforce a skeptic’s point of view. This argument comes from neuroscience, interestingly enough. Take a look at this great article from Chris Mooney in Mother Jones: “The Science of Why We Don’t Believe Science.” He quotes the same guy I discussed in my post last week, Leon Festinger, who studied the science of persuasion for decades. When we think we’re being reasonable, he argued, we’re often rationalizing our existing point of view. After all, we’re invested in keeping our view of things: we want to feel that we’re smart, that the world is understandable, and that other people play by our rules. So it’s easier to rationalize the data to fit our point of view than to change our point of view.

From the Chris Mooney article:

Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment (PDF), pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”… In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held view… And that undercuts the standard notion that the way to persuade people is via evidence and argument.

Read the original article for more fascinating examples of such studies. You’ll feel like you went through the looking glass. And I bet that, even though we think we’re rational, we’d do the same thing as these study subjects! I’m so curious whether I would really rise above it and be as rational as I think I am. Of course, I have extensive training in rationality, and I critique studies and conclusions every day, so perhaps such training can override the emotional pull. I wonder.

If you want more on how people convince themselves that climate change doesn’t exist, check out Climate Change Denial, a website run by George Marshall, head of the U.K.-based Climate Outreach Information Network. He’s posted a series of videos from lectures in recent years.

So, Mooney goes on to explain that not only does argument fail to persuade, it can backfire: a skeptic may hold their original view even more tenaciously in the face of contradictory evidence. Don’t tell me you haven’t done this before. I’ve certainly argued my case more strongly when my fiancé shows me something that flies in the face of what I’m saying.

Mooney also brings up a compelling point: we’re increasingly likely to become polarized in our views because of the current prevalence of “narrowcasting,” the ability to restrict your media consumption to people who agree with you through custom RSS feeds, Twitter, podcasts, and other narrow media streams that cater to a particular ideological stance. I’ve certainly been exposed to fewer conservative viewpoints in the last five years.

And again, all this relates to education reform because there are some people who believe very strongly — and are very invested in the idea — that traditional modes of instruction are just fine, thank you.  And all our data isn’t doing much to convince them.  Why?  Seems obvious to me.

Stay tuned…. I’ll be writing more about this in weeks to come.

Image from Ildar Sagdejev on Wikimedia

1 comment

Dallas Raby June 4, 2011 at 5:42 pm

I will be very interested in your future posts on this fascinating topic. It should be obvious to everyone that rational thought does not reign supreme over our beliefs. But people who should be the most rational seem more prone to consider themselves the exception.
Of course, I could be wrong.
