The Illusion of Explanatory Depth
If you asked one hundred people on the street whether they understand how a refrigerator works, most would say that yes, they do. But ask them to produce a detailed, step-by-step explanation of exactly how a refrigerator works, and you would likely hear silence or stammering. This powerful but inaccurate feeling of knowing is what Leonid Rozenblit and Frank Keil in 2002 termed the illusion of explanatory depth (IOED), stating, “Most people feel they understand the world with far greater detail, coherence, and depth than they really do.”
Rozenblit and Keil initially demonstrated the IOED through multi-phase studies. In a first phase, they asked participants to rate how well they understood artifacts such as a sewing machine, crossbow, or cell phone. In a second phase, they asked participants to write a detailed explanation of how each artifact works, and afterwards asked them to re-rate how well they understood each one. Study after study showed that ratings of self-knowledge dropped dramatically from phase one to phase two, after participants were confronted with their inability to explain how the artifact in question operates. Of course, the IOED extends well beyond artifacts, to how we think about scientific fields, mental illnesses, economic markets, and virtually anything we are capable of (mis)understanding.
At present, the IOED is profoundly pervasive because we have nearly unlimited access to information but consume it in a largely superficial fashion. A 2014 survey found that approximately six in ten Americans read news headlines and nothing more. Major geopolitical issues, from civil wars in the Middle East to the latest climate change research, are distilled into tweets, viral videos, memes, “explainer” websites, soundbites on comedy news shows, and daily e-newsletters that get inadvertently re-routed to the spam folder. We consume knowledge widely, but not deeply.
Understanding the IOED allows us to combat political extremism. In 2013, Philip Fernbach and colleagues demonstrated that the IOED underlies people’s policy positions on issues like single-payer health care, a national flat tax, and a cap-and-trade system for carbon emissions. As in Rozenblit and Keil’s studies, Fernbach and colleagues first asked people to rate how well they understood these issues, then asked them to explain how each issue works and subsequently re-rate their understanding. Participants also rated the extremity of their attitudes on these issues both before and after offering an explanation. Both self-reported understanding and attitude extremity dropped significantly after explaining the issue—people who strongly supported or opposed an issue became more moderate. What is more, reduced extremity also reduced willingness to donate money to a group advocating for the issue. These studies suggest the IOED is a powerful tool for cooling off heated political disagreements.
The IOED provides us with much-needed humility. In any domain of knowledge, the most ignorant are often the most overconfident in their understanding of that domain. Justin Kruger and David Dunning famously showed that the lowest performers on tests of logical reasoning, grammar, and humor are the most likely to overestimate their test scores. Only through gaining expertise in a topic do people recognize its complexity and calibrate their confidence accordingly. Having to explain a phenomenon forces us to confront this complexity and realize our ignorance. At a time when political polarization, income inequality, and urban-rural separation have deeply fractured us over social and economic issues, recognizing our merely modest understanding of those issues is a first step to bridging the divides.
Our Tribal Intelligence
Here’s the second in my series of sketchbook videos exploring the theme of “critical thinking and tribalism”.
In this episode I focus on the epistemic dimension of our tribal nature. I talk about how social survival strategies played an important role in the evolution of our distinctively human intellectual capacities, by increasing our storehouse of knowledge and skills via the development of CULTURE.
I also show how one of the consequences of our socially distributed knowledge is that we’re prone to OVERCONFIDENCE about our own personal knowledge.
Check out the references below to learn more about this perspective on human evolutionary biology.
“The Secret of Our Success”
Fans of human evolutionary biologist Joe Henrich’s work will recognize his influence on the story I tell in the video.
- Joseph Henrich – academic homepage at Harvard
- Henrich’s book: The Secret of Our Success: How Culture Is Driving Human Evolution, Domesticating Our Species, and Making Us Smarter.
“The Knowledge Illusion” (or, “The Illusion of Explanatory Depth”)
In their book The Knowledge Illusion: Why We Never Think Alone, cognitive scientists Steven Sloman and Philip Fernbach explore connections between the knowledge illusion and the evolution of human sociality, the division of cognitive labor, and other questions relating to the interaction between what I called “personal knowledge” in the video and the broader knowledge that is stored in the community.
- The Knowledge Illusion (book link)
- The Illusion of Explanatory Depth (Edge article)
- Sloman Lab – research links
- Philip Fernbach – links