
A User's Guide to Rational Thinking


Cut through flawed assumptions and false beliefs — including your own — with these strategies.

By Christie Aschwanden | Thursday, May 28, 2015


http://discovermagazine.com/2015/july-aug/16-user-guide-rational-thinking

[Illustration: Pat Kinsella]

In the digital age, information is more plentiful than ever, but parsing truth from the abundance of competing claims can be daunting. Whether the subject is Ebola, vaccines or climate change, speculation and conspiracy theories compete with science for the public’s trust. Our guide to rational thinking is here to help. In the following pages, you’ll learn tools to identify the hallmarks of irrational thinking, evaluate evidence, recognize your own biases and develop strategies to transform shouting matches into meaningful discussions.

The Irrationalist in You

We’re programmed for irrational thought. 

Irrational thinking stems from cognitive biases that strike us all. “People don’t think like scientists; they think like lawyers. They hold the belief they want to believe and then they recruit anything they can to support it,” says Peter Ditto, a psychologist who studies judgment and decision-making at the University of California, Irvine. Motivated reasoning — our tendency to filter facts to support our pre-existing belief systems — is the standard way we process information, Ditto says. “We almost never think about things without some preference in mind or some emotional inclination to want one thing or another. That’s the norm.”

If you think you’re immune, you’re not alone. We’re very good at detecting motivated reasoning and biases in other people, Ditto says, but terrible at seeing it in ourselves. Spend a few minutes in honest reflection, and chances are you will find a few examples from your own life. Whether we’re telling ourselves that we’re better-than-average drivers, despite those traffic tickets, or insisting we’ll get through a 40-hour to-do list in a single day, we’re all prone to demonstrably false beliefs.

Much of our thinking on contentious issues is influenced by our pre-established social or cultural groups, says Dan Kahan, a law professor and science communication researcher at Yale Law School. Kahan studies cultural cognition — the idea that the way people process information is heavily determined by their deep-seated values and cultural identities. We don’t have time to evaluate every piece of evidence on every issue, so we look to people we trust in our in-groups to help us make judgments, Kahan says. Once a certain idea or stance becomes associated with a group we belong to (part of what Kahan calls a cultural identity), we become more inclined to adopt that position; it’s a way to show that we belong.

 
[Illustration: Pat Kinsella]

If you consider yourself an environmentalist, for instance, you’re primed to adopt the view that hydraulic fracturing, or “fracking” — the controversial method of oil and gas extraction that involves cracking rock with pressurized fluids — poses a threat to the environment and human health. On the other hand, if you’re a conservative, you’re more apt to believe that fracking is harmless, since this is the stance taken by others in that group.

Being science-literate won’t protect you from such biases, Kahan says. His research has found that people who score high on measures of science comprehension tend to be more polarized than others on contentious issues. In one such study, Kahan and his research team surveyed a diverse sample of about 1,500 American adults regarding their political views. The team asked them to do a calculation designed to test their ability to slow down and do the math, rather than taking gut-reaction shortcuts that can lead to the wrong answer. The researchers presented the same math problem framed two different ways: as a nonpolitical question, and as a question looking into a politically charged issue, such as gun control. They found that the people who scored the best on the nonpolitical math problem fared the worst when the same problem was presented as a politically charged issue. The better your knowledge of science and the stronger your ability to understand numbers and make sense of data, the more adept you are at fitting the evidence to the position held by your group identity, Kahan says.

Is it possible to overcome these internal biases that sidetrack our thinking? The Center for Applied Rationality (CFAR) thinks so. This nonprofit group, based in Berkeley, Calif., holds workshops and seminars aimed at helping people develop habits of thought that break through biases.

The first step toward overcoming bias is to recognize and accept your fallibility, says Julia Galef, president and co-founder of CFAR. “We tend to feel bad about ourselves when we notice we’ve been wrong,” she says, but if you punish yourself, you create a disincentive for searching for truth. “We try to encourage people to congratulate themselves when they notice a flaw in their belief or realize that the argument someone else is making has some basis,” Galef says.

Another trick Galef recommends is the flip — turn your belief around. Ask yourself, “What are some reasons I might be wrong?” This strategy forces you to turn your attention to contrary evidence, which you might be motivated to overlook if you simply listed reasons for your views. Consider what it would look like for you to be wrong on this issue. Is any of the evidence compatible with this opposite view? Would you be inclined to believe this opposite argument if it were being promoted by someone from your own political party or social group? The answers can help you determine the strength of your position, Galef says, and whether it’s time to reconsider it.

A Field Guide to Irrational Arguments

Scientific explanations are based on evidence and subject to change when new facts come to light. Irrational ones rely on assumptions and involve only the facts that support a chosen side. Here are five hallmarks of irrational arguments.

The science is nitpicked to fan doubt: Rather than considering the totality of the evidence, unscientific arguments cherry-pick data, mischaracterize research methods or results, or even make outright false claims. For instance, people who insist that vaccines cause autism may point to the known dangers of mercury as evidence, even though mercury is no longer a component of most vaccines, and studies have found no link between vaccines and autism. When a proponent’s account of the research contradicts what the scientists themselves report, that’s a good sign they’re peddling this type of false doubt.

The science is rejected based on implications, not data: Instead of taking issue with the evidence itself, these types of arguments focus on the perceived implications, says Josh Rosenau, programs and policy director at the National Center for Science Education. “People will say, ‘Well, if evolution is true, then we don’t have souls, or we should all behave like animals.’ ” Never mind that the science doesn’t actually say anything about how people should behave. If the science can be taken to repudiate beliefs that people hold dear, it creates a huge incentive to engage in motivated reasoning, lest one’s worldview come crashing down.

Scientists’ motives and reasons are attacked: Critics often turn to personal attacks on scientists to cast doubt on their findings. Instead of criticizing the science itself, these lines of argument suppose that scientists have rigged their research to support the scientific consensus. Paul Offit, director of the Vaccine Education Center at the Children’s Hospital of Philadelphia, co-invented a life-saving rotavirus vaccine. Anti-vaccine crusaders seized on his association with it to imply that his advocacy stems from his financial interest in vaccines and ties to pharmaceutical companies. Offit has concluded that some of these people cannot be convinced. “It doesn’t matter how much data you show that person. If in their hearts they’re conspiracy theorists, you can’t convince them.”

Legitimate disagreements among scientists are amplified to dismiss the science: Evolution is one of the foundations of modern biology, but biologists are still discovering details of how evolution works. When geneticists offer contrary ideas about how speciation occurs, they’re debating the nuts and bolts of how evolution works, not arguing over whether it happens. Yet people fighting the teaching of evolution in schools may seize on legitimate scientific disputes as reasons to dismiss the scientific theory altogether, Rosenau says. When they present gadflies or scientists whose views are out of step with the majority of the field as the most trustworthy experts on an issue, that’s another red flag.

Appeals are made in the name of “fairness”: People touting this argument say, “We should just teach kids both sides because there are exactly two sides, in equal proportions,” even though there aren’t, Rosenau says. In most cases, this appeal is invoked to give false equivalence to a concept like intelligent design, which lacks evidence. If not counteracted, this approach can lend legitimacy to debates without scientific merit, Rosenau says.

 
6 Strategies for Conversing With Someone Who Has Irrational Ideas

 

When you encounter, say, some neighbors who refuse to vaccinate their children because of long-debunked fears of autism and mercury poisoning, it’s tempting to throw facts at them. But — as you know if you’ve ever tried this approach — bombarding people with evidence is doomed to fail. If you want any chance of engaging in a meaningful conversation, you’ll need better tactics. Here are six worth trying. We can’t promise they’ll work, but they’ll give you a fighting chance. 

Be a good listener and make a connection: As much as we’d like to think otherwise, most human judgments aren’t based on reason, but on emotion, says Ditto, the UC Irvine psychologist. Aim to forge a personal connection that makes the other person inclined to see you as “one of us.” Research by Yale’s Kahan has shown that people tend to adopt beliefs associated with their cultural groups. So look for common ground.

That means listening with respect, says Randy Olson, a scientist-turned-filmmaker and author of Don’t Be Such a Scientist. “Do not lecture. Nobody wants to hear that,” he says. Instead of throwing out a bunch of facts, ask questions. Show that you’re open to what the other person has to say. “Don’t rise above them; approach them at their level,” Olson says. The moment you create a divide (by implying that you’re smart and they’re not, for instance) you’ve lost the debate. Ultimately, it makes no difference how much evidence you’ve got. If you want your message to register, you have to speak it in a voice that’s trusted and likable, Olson says.

Figure out where they’re coming from and devise a frame that speaks to that: When people cling to irrational beliefs, it’s often because they’re somehow tied to their identity or social group. Whenever possible, present your argument in a way that fits, rather than challenges, the other person’s self-identity, says Julia Galef, co-founder of CFAR. For example, imagine you are trying to convince a friend who thinks of herself as bold and decisive that it’s OK to change her mind about an issue on which she’d taken a public stand.

One way to do this, Galef says, would be to frame an about-face as a gutsy and strong move.

Usually it’s not an aversion to science that motivates people to tout unscientific ideas, but some underlying cultural, social or personal issue, says Rosenau, of the National Center for Science Education. For instance, he says that many evangelicals he’s encountered see evolution as a repudiation of their religious beliefs. As long as they view it that way, they can’t endorse evolution without giving up their identity — and they’re unlikely to do that, no matter how compelling your facts, Rosenau says. The solution? Find a way to talk about evolution that doesn’t force them to abandon their group identity or belief system. “I might say, ‘Did you know that [National Institutes of Health Director] Francis Collins is an evangelical Christian?’ Then we might have a real conversation and talk about what it means to be a Christian who accepts that evolution is true.”

Affirm their self-worth before knocking down their erroneous beliefs: When your facts challenge people’s self-identities, their immediate impulse will be to reject them — that’s human nature, says Brendan Nyhan, a political scientist at Dartmouth College. When the thought of giving up a tightly held belief feels like a threat to our identity or world view, we’re prone to reject it out of hand.

One way to circumvent this problem is to make the person feel positive about themselves before presenting evidence that might topple their self-image. In one study, Nyhan and his colleagues had volunteers participate in an exercise designed to bolster their feelings of self-worth, such as remembering a time they felt good about themselves or recalling a value that was important to them, before presenting them with information that contradicted their beliefs about political events. The results showed that the self-affirming drills increased participants’ willingness to accept uncomfortable facts.

In real life, this might look more like an exchange that happened between my husband and me while we were backpacking. Coming to a fork in the trail, I insisted that we needed to go one way, while Dave was sure the other way was correct. It turned out that he was right, and I knew he was right, but I didn’t like what that said about me — that I have a poor sense of direction. This notion contradicts the vision I have of myself as a competent person. But when Dave laughed about it, and told me how funny it was that a smart person like me could get lost, I was suddenly able to accept his facts because they no longer challenged my beliefs about myself. By telling me that I can be a smart person and also get lost, he gave me a way to accept his directions and still feel good about myself.

Focus on the facts, not the misconceptions: When trying to counteract a myth, a natural response is to present and then debunk it. But tread carefully, says Nyhan. Studies show that repeating a misconception in order to disprove it often ends up reinforcing the erroneous idea in people’s minds.

In one study, volunteers viewed a pamphlet debunking myths about flu vaccines. Immediately afterward, people correctly sorted myths from facts, but just a half-hour later, they performed worse on this sorting task than they had before reading the flier. Reading the myths connected them to flu shots in participants’ minds, Nyhan says. People remembered reading those things about the flu shot, but over time they forgot which were true and which were false.

Instead of reiterating myths, Nyhan advises finding a simple, truthful message to present. If you overwhelm the person with a long list of complex explanations, you risk invoking the so-called overkill backfire effect, which drives your target back to the simpler explanation that’s more appealing.

“A simple myth is more cognitively attractive than an overcomplicated correction,” write researchers John Cook and Stephan Lewandowsky in The Debunking Handbook.

Ask the person to explain what they know: People who feel sure of their position set a high bar for contrary evidence, Ditto says, but often such confidence stems from a misperception that they know more than they actually do, a phenomenon researchers call the illusion of explanatory depth. Break this illusion, and the person may become more open to your position, Ditto says.

A study published in the journal Psychological Science in 2013 found that when people were asked to explain the details of how a political policy they supported would work, their beliefs on the issue became more moderate. Asking people to explain what’s behind their beliefs seems to make them scrutinize what they know, which in turn can force them to recognize the gaps in their knowledge. As a result, they become less sure of their position and possibly more open to what you have to say.

I recently tried this approach with an acquaintance who expressed concern that vaccines would harm her baby. What, exactly, was she worried might happen? Halfway through her attempt at an explanation, she admitted that she wasn’t really sure how immunizations would hurt him, but it scared her to give such a young child so many shots at once. She didn’t change her mind then and there, but she did agree to read some information I gave her to ease her fears.

Engage in person, not in writing: It’s no secret that people can behave poorly online. When you’re having a discussion in the abstract, it’s easy to set people off without being conscious of it, since you miss the body language and other social cues that would normally inform your behavior, says Chris Mooney, co-author of Unscientific America. “Once the emotions are working, responses are hot rather than cold, and pretty soon everybody’s circling wagons,” Mooney says. It’s more difficult to spiral into mindless rants and name-calling when you’re engaging someone face-to-face than when you’re arguing with their avatar. If you want to have a real debate, Mooney says, “go have a beer. Don’t argue on Facebook.”

  •  

    I loved the article; it resonates with what I already believe and have spent a good amount of time thinking about (and actually I intend to write a book on bias).

    A few comments. "People don’t think like scientists; they think like lawyers" - Actually scientists think like lawyers too. You can find areas where scientists do better than the average Joe but you can also find areas where scientists are more biased than a mother who is convinced that her son is the best and should be on the team. As it says later in the article "Being science-literate won’t protect you from such biases." For an example of laymen doing better than scientists here's a quote from npr.org "But they didn't: 83 percent of the radiologists missed it, Drew says. This wasn't because the eyes of the radiologists didn't happen to fall on the large, angry gorilla. Instead, the problem was in the way their brains had framed what they were doing. They were looking for cancer nodules, not gorillas. “They look right at it, but because they're not looking for a gorilla, they don't see that it's a gorilla,” Drew says. In other words, what we're thinking about — what we're focused on — filters the world around us so aggressively that it literally shapes what we see."

    Even the term "confirmation bias" is mainly applicable to scientists.

    "Scientists’ motives and reasons are attacked" - If you can criticize the science then yes, don't criticize the scientists. But when studies are not easily replicated, one can only trust the scientist behind the science or question motives and reasons but more importantly "bias" (because they do have bias, right?). This is especially relevant in the light of recent scandals regarding ethics, fraud and retractions in science.

    "Scientists’ motives and reasons are attacked" - If you can criticise the science then yes, don't criticise the scientists. But when studies are not easily replicated, one can only trust the scientist behind the science or question motives and resons but more importantly "bias" (because they do have bias, right?). This is especially relevant in the light of recent scandals regarding ethics, fraul and retractions in science.

    "The science is rejected based on implications not data" - I agree but there is one exception though. If the implication of science completely undecuts science then it's a good time to reject the science. For example, if the implication of materialist science is that there is no free will, we are programmed by genes and determined by environment to act and think the way we do; that objectively there is nothing better than anything else (which is pretty easy to prove it's a necessary conclusion of materialism) then everything, including that materialist science is pointless. Such an implication should cause one to reject the science behind it (and as a matter of fact, no person can accept such a conclusion).

    "Appeals are made in the name of “fairness”" - This, as well as a few other points, was obviously framed against creationists and ID proponents. But ignoring that, I don't necessarily agree or disagree with it. The question what's more important, truth or fairness is not so easily answered and it depends on a lot of factors. Politicians, and being politically correct, on one hand put a lot of emphasis on "tolerance."

    "Ask yourself, “What are some reasons I might be wrong?”" - Excellent question! I wonder if the author of the article and the studies practice what they are preaching and can provide in a comment here some reasons why macro-evolution and materialism might be wrong. Since we are talking about science I'll give you a hint why, bias-wise, this may be the case. Not only that science is very biased (which is why it requires "scientific revolutions"; physicist Viktor Komarov has a whole excellent book on it, not translated in English unfortunately) but among science branches one cannot find more bias than when it comes to evolution. There is a very good reason why this is so. As arch-evolutionist and atheist Michael Ruse says, evolutionism is like a religion as it answers people's big questions: who am I, why am I here, etc. He gives an illustration saying that it's like someone at the bar getting into a fight becuase one said something about his mother. That's why there are so many militant evolutionists and atheists. Unlike other areas of science, evolution is not only about some knowledge. Evolution is a way of life - it determines how one thinks about life and its meaning - some of the things that one holds most dear (just like religion). Evolutionism is unavoidably emotionally charged. Just stop and think how you feel right now for example. As the article ends “Once the emotions are working, responses are hot rather than cold, and pretty soon everybody’s circling wagons”. This emotional charge makes it much more likely that blinding bias will manifest in evolutionism than in other areas of science. Not only it's likely but I have a lot of examples of such blindness (such as, not realizing obviouis contradictions) in evolutionists.

    I would add one more criterion for detecting bias. When one claims "the other's view just doesn't make sense," that just confirms one's own bias. No, any view makes sense if you give it the "right" assumptions. It does make sense to the one who holds it. If it doesn't make sense to you, it's only because you are unable to, temporarily, set aside your own set of assumptions and take on your opponent's assumptions in order to properly assess your opponent's view. This is the very point at which your assumptions become bias. This is a crucial step in rational thinking that the article should have included.

  •  

    Lots of good stuff here. But lots that's not.

    One of the most important bits of quality information here is that the way people reason in an exchange - by accepting, defending, or exploring positions, by being open to admitting mistakes, or instead digging in, etc. - is often based not on the factual content of the dispute, but on how they think the outcome will affect their self-image, such as the honorability of groups they affiliate with. Importantly, if people feel their honor is being threatened, they'll tend to dig in and defend.

    With that said, some of the advice isn't so good, and much of what is reported as fact isn't. For example:

    '“Don’t rise above them; approach them at their level,” Olson says. The moment you create a divide (by implying that you’re smart and they’re not, for instance) you’ve lost the debate.'

    This is just false. If it were true we wouldn't feel cowed and heavily persuaded by a doctor or a lawyer, we wouldn't allow ourselves to be corrected by teachers, and more. The counterexamples are too many to mention. The point of establishing a difference of expertise (being smarter regarding some issue) is often essential. What is true in the article's advice is that, if you really do have expertise and the other person doesn't, then any time your disagreement becomes framed as a war of honor, the reasoning process will tend to get short-circuited - by stimulating this war framing you'll stimulate defensive reactions. On the other hand, if you stimulate a path-seeking frame - one that doesn't pit the two of you against each other - and present yourself as a well-heeled denizen of the territory, not a newcomer or casual visitor, but an informed guide who knows useful information the other person is unaware of, then you are less likely to stimulate defense mechanisms, and you'll emphasize non-antagonistic reasons for trust instead.

    Another example:

    'Usually it’s not an aversion to science that motivates people to tout unscientific ideas, but some underlying cultural, social or personal issue, says Rosenau, of the National Center for Science Education.'

    This is an appeal to authority, and just an opinion of an authority. It is not an evidence-warranted opinion, as the article implies. No study or serious empirical attempt has been made to substantiate this claim (specifically, that "usually" some such personal issue is behind the resistance). It is, instead, Rosenau's speculation about the underlying causes. As a corrective, any cognitive scientist (and any Bayesian) will also point out that most people assess newly encountered claims based on what they think they already know. If the claims are inconsistent with what they think they know, they'll tend not to accept such claims, and will tend to retain that judgment until a kind of tipping point requiring a revision of their past belief set is reached. This basis is also quite frequent, but has nothing to do with the kinds of motivated reasoning depicted by Rosenau as the main culprit. Similarly, all sorts of heuristically useful pattern-recognition-like shortcuts are frequently in play, few of which have anything to do with the motivated-reasoning-style causes of resistance that Rosenau claims are "usually" the problem. And ironically, assuming that motivated reasoning is the basis of another person's resistance is a great way to encourage it, turning the lesson of this article into a self-fulfilling prophecy, and a self-defeating one at that.

  •  

    I just read this article. I really bought the magazine to read the Powering the Future article. Yeah, I know it's a long time ago; I sometimes flip through old magazines for the fun of it. But it is still a very relevant topic, of course. It is a very useful article, and the author has given some very applicable tips for critical thinking about issues. But it seems like the '5 irrational argument identifiers' and the '6 strategies for conversing' are skewed toward a consensus view of science. Is science always free from subjectivity and bias? What do you do when the generally accepted science is in fact wrong? Using one of the tips, for example, can the writer explain how evolution factually works to cause transmutation? There is a reason why a significant percentage of the population is suspicious of science and scientists. Some common-sense positions are denied by much of modern science, and scientists do succumb to groupthink, especially with evolutionary 'theory'. Minority positions in science have been vindicated in the past, so scientists need to be a lot more humble.

  •  

    Very interesting article, I really enjoyed reading it!

  •  

    Very good article!!! Thank you very much. It will surely help me in my debates.

  •  
     

    An alternative analysis might include a comparison of the relative weight of personal WANTS and NEEDS. This differential sometimes helps to define our behavior -- both rational and irrational -- and what makes us tick. We are apt, for example, to engender expectations based more on wants than needs when realism is factored into the equation. At other times, the two are intertwined, even reversed in priority, at least relative to the issue at hand. Human behavior is complex and individualistic, with endless complicating and interesting competing interactions. It is what makes life special and worthwhile.

  •  

    The answer is that most issues are complicated and not black and white like most people want them to be.

  •  

    Another approach to engaging evangelicals in a discussion about evolution is to make the suggestion that God made the earth revolve much more slowly in the distant (4.5 billion years ago) past than it does today, and that the 7 days of creation lasted for 2 billion years by today's clock.

  •  

    Interesting, but so much lesser than just an application of the Ten Transformers for self-understanding.

  •

    Excellent article, but I have to object to a comment made by Peter Ditto: attorneys may argue a case with a slant, but in order to do that effectively for their client, they must consider all the evidence.

  •

    The point was that attorneys always start with a position, and then defend it to the end. You never have one attorney accept the overwhelming evidence of the other mid-case and then start arguing against their client.

    The attorneys don't start neutral and then seek the truth. They start biased and stick with their biases, leaving the jury to seek truth.
