Are you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are.
It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason.
In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.)
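One way to make that parenthetical precise is to write the reasoning out as a line of probability. Using T for "Linda is a bank teller" and F for "Linda is active in the feminist movement" (labels chosen here purely for illustration), the conjunction rule says that a joint event can never be more probable than either of its parts:

\[
P(T \cap F) \;=\; P(T)\,P(F \mid T) \;\le\; P(T), \qquad \text{since } P(F \mid T) \le 1 .
\]

Option B could only tie option A in the extreme case where every bank teller fitting Linda's description is certain to be a feminist; in any other case it must be strictly less probable.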
In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of either event alone. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational.
But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right?
Wrong. In a series of studies, Professor Stanovich and colleagues had large samples of subjects (usually several hundred) complete judgment tests like the Linda problem, as well as an I.Q. test. The major finding was that irrationality — or what Professor Stanovich called “dysrationalia” — correlates relatively weakly with I.Q. A person with a high I.Q. is about as likely to suffer from dysrationalia as a person with a low I.Q. In a 2008 study, Professor Stanovich and colleagues gave subjects the Linda problem and found that those with a high I.Q. were, if anything, more prone to the conjunction fallacy.
Based on this evidence, Professor Stanovich and colleagues have introduced the concept of the rationality quotient, or R.Q. If an I.Q. test measures something like raw intellectual horsepower (abstract reasoning and verbal ability), a test of R.Q. would measure the propensity for reflective thought — stepping back from your own thinking and correcting its faulty tendencies.
There is also now evidence that rationality, unlike intelligence, can be improved through training. In a pair of studies published last year in Policy Insights From the Behavioral and Brain Sciences, the psychologist Carey Morewedge and colleagues had subjects (more than 200 in each study) complete a test to assess their susceptibility to various decision-making biases. Then, some of the subjects watched a video about decision-making bias, while others played an interactive computer game designed to decrease bias via simulations of real-world decision making.
In the interactive game, each simulation was followed by a review that gave the subjects instruction on specific decision-making biases and individualized feedback on their performance. Immediately after watching the video or receiving the computer training, and then again after two months, the subjects took a different version of the decision-making test.
Professor Morewedge and colleagues found that the computer training led to large and enduring decreases in decision-making bias. In other words, the subjects were considerably less biased after training, even two months later. The decreases were larger for the subjects who received the computer training than for those who received the video training (though decreases were also sizable for the latter group). While there is scant evidence that any sort of "brain training" has any real-world impact on intelligence, it may well be possible to train people to be more rational in their decision making.
It is, of course, unrealistic to think that we will ever live in a world where everyone is completely rational. But by developing tests to identify the most rational among us, and by offering training programs to decrease irrationality in the rest of us, scientific researchers can nudge society in that direction.