How to debunk false beliefs without having it backfire

MONDAY, JANUARY 26, 2015


There's nothing worse than arguing with someone who simply refuses to listen to reason. You can throw all the facts you want at them, and they'll just dig in their heels deeper.

Over the past decade, psychologists have been studying why so many people do this. As it turns out, our brains have glitches that can make it difficult to remember that wrong facts are wrong. And trying to debunk misinformation can often backfire, entrenching that misinformation even more deeply. The problem is even worse for emotionally charged political topics -- like vaccines and global warming.

So how can you actually change someone's mind? I spoke to Stephan Lewandowsky, a psychologist at the University of Bristol and co-author of The Debunking Handbook, to find out:

Susannah Locke: There's evidence that when people stick with wrong facts, it isn't just stubbornness -- it's actually some sort of brain glitch. Why is it so difficult to change people's minds?

Stephan Lewandowsky: It's not an easy task to update people's memories. That's a very clear result that even happens with completely innocuous items. It's a fundamental problem for our cognitive apparatus to update what's in our head.

What people have suggested -- and what I think is going on -- is that what people remember is the information, and then they attach a tag, "Oh no it's not." And the problem is that often this tag can be forgotten. So you remember the misinformation, but not the fact that it's false.

Now, one of the ways to get around that is to tell people not just that something is false, but tell them what's true. Alternative information makes it much easier to update your memory.

"YOU REMEMBER THE MISINFORMATION, BUT NOT THE FACT THAT IT'S FALSE"

There's a classic study where people are told there's a fire in a warehouse, and that oil paints or flammable materials were found in the wiring cabinet. Then, later on, the account says that, by the way, the wiring cabinet was empty. Now, if that's all you do, people will still think there was oil paint in the wiring cabinet. Simply saying something isn't true doesn't do the trick.

But instead, if you say the wiring cabinet was empty, and we found some petrol-soaked rags [elsewhere] at the scene, then people forget about the wiring cabinet because they have an alternative explanation for the fire. You need an alternative to let people let go of the initial information.

Locke: What's the biggest thing people do wrong when trying to change other people's minds?

Lewandowsky: The moment you get into situations that are emotionally charged, that are political, that are things that affect people's fundamental beliefs -- then you've got a serious problem. Because what might happen is that they're going to dig in their heels and become more convinced of the information that is actually false. There are so-called backfire effects that can occur, and then the initial belief becomes more entrenched.

Locke: How can people prevent these backfire effects on political issues?

Lewandowsky: It's very difficult. A lot of this stuff is about cultural identity and people's worldviews. And you've got to take that into account and gently nudge people out of their beliefs. But it's a difficult process.

One [solution] is to give people an opportunity to self-affirm their beliefs ahead of time. Let's talk about weapons of mass destruction in Iraq. They didn't exist, right? After Iraq was invaded, they didn't show up. And yet, to this date, I think about 30 percent of the public believes in the existence of weapons of mass destruction, and that belief falls sharply along partisan lines. If you get Republicans into the laboratory and say, hey, there weren't any weapons of mass destruction, that may strengthen their incorrect belief. We've done exactly that study.

"YOU GET A LIBERAL TO TALK TO LIBERALS AND A CONSERVATIVE TO TALK TO CONSERVATIVES"

There's some evidence that you can avoid that if you ask people to tell you [about] an occasion when they felt really good about their fundamental beliefs in free enterprise (or whatever is important to the person in question). Then they become more receptive to a corrective message. And the reason is that the correction is less threatening in that context. Basically, I make myself feel good about the way I view the world, and then I can handle it because it's not threatening my basic worldview.

The other is that you can have a messenger whose views are consonant with your beliefs. You get a liberal to talk to liberals and a conservative to talk to conservatives.

Locke: Have psychologists completely thrown out the information-deficit model -- the idea that you can change people's understanding by giving them the correct information?

Lewandowsky: It's a nuanced issue. A couple of years ago, people basically said the information-deficit model is dead -- it's all basically about culture. Now I think that's an oversimplification. It's a combination of two factors. Culture is extremely important. But it's also true that in some circumstances providing people with information is beneficial. That is, more information does enable people to sort out what's going on.

"SUPERFICIALLY JUST THROWING INFORMATION AT PEOPLE PROBABLY WILL MAKE THEM TUNE OUT"

Now, the trick appears to be that you've got to give people the opportunity to deal with information in great depth. If you have a situation like a classroom, where people are forced to sit down and pay attention, that's when more information is helpful. There's a lot of evidence of this in educational psychology.

Now, the problem is that in a casual situation -- people listening to the radio or having a superficial conversation -- the information-deficit model doesn't apply. And superficially just throwing information at people probably will make them tune out. So you've got to be careful when you're talking about public discourse, TV, radio, media.

Locke: Let's say I'm going home for the holidays and have an uncle who doesn't believe in climate change. How can I change his mind?

Lewandowsky: It's difficult. There are a couple of things I can suggest. The first is to make people affirm their beliefs. Affirm that they're not idiots, that they're not dumb, that they're not crazy -- so that they don't feel attacked. And then try to present the information in a way that's less conflicting with [their] worldview.

One of the problems I've been working on is people's attitudes toward climate change. For a lot of people, the moment they hear the words "climate change," they just shut down. But there are ways to get around that. For example, it's been shown that if you show the health consequences of climate change, or if you present market-based solutions to the problem, that doesn't challenge their worldview too much.

"AFFIRM THAT THEY'RE NOT IDIOTS ... [SO] THAT THEY DON'T FEEL ATTACKED"

If you tell people that there is an overwhelming scientific consensus -- that 97 out of 100 climate scientists agree on the basic notion of global warming -- that seems to be a gateway belief that enables people to recognize the importance of the issue.

More often than not, that is effective with people who are ideologically disposed to reject global warming as a fact. In general, people are very sensitive to what they perceive to be the majority opinion around them.

Locke: If you throw too much information at people, are they more likely to reject your stance?

Lewandowsky: That's quite nuanced, and it depends on how much time people are willing to invest in processing the information. If people sit down with the intention of listening and trying to understand the problem, then we have no evidence for an overkill backfire effect.

However, there's plenty of evidence that in a casual context -- turning on the TV or whatever -- you can dilute the message by putting too much information in it. This whole information-overload issue is more critical in a more casual context. And that's always important.

Most of the research on misinformation has mimicked casual situations. People just sit there and read something like a newspaper article, and that's when you get backfire effects and people are very susceptible to misinformation.

Locke: What about the "familiarity effect," in which just mentioning the wrong information could make it stick even harder?

Lewandowsky: As recently as two or three years ago, I would have assumed that it exists. Now, it's beginning to look like that's not terribly robust. We've had a hard time trying to reproduce it. It sometimes occurs and sometimes doesn't. I'm inclined to think it will turn out to be quite infrequent.

Locke: What's your favorite experiment that shows the difficulty of debunking?

Lewandowsky: The study I like a lot is one I did about the Iraq war, published in 2005. What we did there was look at people's processing of information related to the Iraq war and the weapons of mass destruction. We ran the study in three different countries -- the US, Germany, and Australia -- at the same time.

And what we found is that Americans who knew something was false continued to believe it, which makes no sense. We said, here's this piece of information, and we asked them if they knew it had been retracted. And a minute later, we asked them whether they believed the information. And they continued to believe it. The Germans and Australians did not.

Now, at first glance, that makes it sound as though there's something weird about Americans compared to the other two nationalities. But what's really interesting is that's not the case at all. What drove this effect was the skepticism [of the reasons why the war was being fought in the first place]. It turns out that when we asked people if they thought the war at the time was fought over weapons of mass destruction, that item could predict whether people would continue to believe things that are false.

When you control for skepticism, all those differences between Americans and Germans and Australians disappear. There was an underlying cognitive variable that explains it. It just so happened that there were far more skeptics in Germany and Australia at the time.

Locke: How has psychology's understanding of debunking shifted since you first started studying it?

Lewandowsky: Over the past 10 years or so that I've been doing this, the role of cultural worldviews and people's identification with their own culture has been recognized more and more. And equally, we know that skepticism is extremely important. Whether people are skeptical about the motives of whoever is telling them something -- that's very important, and fairly new.

Another thing that's emerged more and more over time is the existence of backfire effects -- that if you tell people one thing, they'll believe the opposite. That finding seems to be pretty strong.

Locke: Have you seen people changing their messages in response to this new research?

Lewandowsky: The Debunking Handbook -- that's been downloaded at least half a million times. So that message is getting out, I think. I've seen a lot of reference to that handbook, and I think some people in the media are now aware of how difficult it is to remove information from public discourse.

I'm vaguely optimistic that this research is having an impact. And certainly when it comes to government and large organizations, I think they're beginning to be fairly savvy in what they say and how they do it, in part because of the research.

Locke: Is there anything else important that people should know?

Lewandowsky: One thing I would point out is that it's very important for people to be skeptical and to anticipate that others will try to mislead the public. Some of the misinformation out there is not accidental. I think quite a bit of it is put into public discourse in order to have a political effect. It's supposed to be wrong, but effective.

What our research shows is that if people are aware of the possibility that they might be misled ahead of time, then they're much better at recognizing corrections later on.

This interview has been lightly edited and condensed for length and clarity.

Further reading

How politics makes us stupid
