Five Lessons from Julia Galef’s ‘The Scout Mindset’
Respect for reason has waxed and waned throughout history. Today, its tide is receding. University professors resign in frustration from what were once our bastions of rationality. Increasingly, the barbarians are not merely at the gates, but running the show in a vast swathe of humanities departments. After decades of decay in our academic training grounds, radical identitarianism and other irrationalities are spreading with accelerating speed, and we are woefully short of thinkers capable of fighting them.
Those few who are working to diagnose and cure our culture deserve our attention and appreciation. Among them are Stephen Hicks and Helen Pluckrose, for mucking around in the sewers of postmodernism and cataloging the causes of our predicament; Virginia Postrel, Johan Norberg, Matt Ridley, and Steven Pinker, for reminding us of the importance of dynamism, the open society, progress, and enlightenment; and more recently, social scientists and practitioners such as Adam Grant and Julia Galef for their tips and techniques on cultivating and preserving objectivity.
1. It’s easier than you think to fall into motivated reasoning
It’s a tale as old as time: A man thinks he’s being evenhanded in assessing something, only to later learn that bias had disfigured his evaluation. Some prior conviction concealed a blind spot and kept him from asking pertinent questions.
Alfred Dreyfus lost five years of his life to such a mistake. As the only high-ranking Jewish officer in the French military during the early 1890s, a particularly antisemitic period in France’s history, suspicion immediately fell on him when it was learned that a traitor had been leaking military secrets to the Germans. Witnesses came forward to report that they had heard Dreyfus praise the German Empire, that they’d seen him gambling, and that he kept mistresses despite being married. A handwriting expert matched Dreyfus’s scrawl to a damning piece of evidence.
When a second expert contradicted the first, investigators found that the second had connections to Jewish bankers and decided he could not be trusted. Dreyfus was publicly shamed, stripped of his rank, and sent to a penal colony on the aptly named Devil’s Island for a life of solitary confinement. When the treasonous letters leaking secrets to the Germans resumed, Dreyfus’s prosecutors reasoned that a new traitor had appeared who happened to have similar handwriting.
This, Galef points out, was a classic case of “motivated reasoning” employed to defend pre-existing beliefs and conclusions. When the investigators found evidence that appeared to confirm Dreyfus’s guilt, they sought grounds for believing it. But when confronted by contrary evidence, they sought excuses to dismiss or discount it. “It might not look this way,” writes Galef, “but the officers who arrested Dreyfus had not set out to frame an innocent man. From their perspective, they were conducting an objective investigation of the evidence, and the evidence pointed to Dreyfus.”
The man who exonerated Dreyfus, Colonel Georges Picquart, approached the investigation with an entirely different modus operandi. “In contrast to directionally motivated reasoning,” Galef remarks, he was led by “accuracy motivated reasoning,” which “evaluates ideas through the lens of ‘Is it true?’” This focus enabled Picquart—a man of heroic intellectual honesty, who endured years of personal suffering in the process—to reverse Dreyfus’s conviction and obtain his freedom.
2. The solution is not mere knowledge, but mindset cultivated into habit
Galef is the host of the podcast “Rationally Speaking” and a co-founder of the Center for Applied Rationality, which holds workshops on rational thinking and avoiding cognitive biases. But after a few years teaching these workshops, Galef concluded that knowledge of common logical fallacies and biases, while helpful, is not enough. We need a whole new mindset.
Many people treat “reasoning as defensive combat,” the hallmark of what Galef calls “soldier mindset.” Its converse is “scout mindset.” The soldier seeks to defend his position at all costs, while the scout surveys and reports what he sees. As much as he might wish there were a bridge in a strategic spot, he knows that wishing cannot make it so. With time and practice, we can make the scout mindset a habit, just as Charles Darwin did.
In 1860, a year after On the Origin of Species was published, Darwin was an embattled man, mocked and attacked even by some who had once been his friends and mentors. Much of the flak was ad hominem, but he recognized a particularly potent objection in, of all places, the peacock’s tail. This heavy and ostentatious appendage could surely only help peacocks become lunch, so according to Darwin’s theory, it ought to have been weeded out in the scramble for survival. Splendid as the sight of it was, Darwin admitted that looking at it made him physically ill, a colorful rebuke to the mountains of evidence he’d collected over decades. But he had resolved that:
...whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favourable ones.
Darwin didn’t ignore or excuse the objection. He spent two years investigating it, discovering a new level of complexity with which to strengthen his theories of speciation. This was his identification of sexual selection: a trait disadvantageous to survival might still proliferate in a species if it provided a correspondingly greater reproductive advantage, enabling the animal to attract more mates.
Over decades, Darwin had made it a habit to leverage well-founded criticism in leveling up his own ideas, refining his theories—or updating them, in scout speak—instead of digging in his heels. Scout mindset became second nature.
3. When it’s important, conduct a thought experiment
Many thought experiments are suitable only for confounding freshman philosophy students. Galef’s collection, by contrast, is genuinely useful. Here are six tests she recommends:
The Double Standard Test: An obvious but underused test is to ask: “Am I judging other people’s behavior by a standard I wouldn’t apply to myself?” For instance, a Democrat, commenting in 2009 on his party’s plan to abolish the filibuster, wrote: “I’m just imagining how I would have reacted if I’d have heard that a similar tactic was used by [Republican president George W. Bush] about a war budget or something of that nature. I wouldn’t like it one bit.” It’s a simple test, but many fail it.
The Selective Skeptic Test: Ask yourself if you’d find a certain type of evidence or rhetorical strategy persuasive were it used to support the opposite view. How credible is the reasoning or methodology? Are you holding it to the same standards you would if it supported a contrary conclusion? Galef offers a memorable example. After encountering a study that purported to show that the soldier mindset makes people more successful, she checked its methodology for flaws and found plenty. “Then, somewhat grudgingly,” she writes, “I did a thought experiment: What if this study had claimed that soldier mindset makes people unsuccessful in life?” Excellent, she probably would have thought, “I’ll have to find a place for this study in my book!” But this was “a wake-up call for me, a warning that I needed to be a little less credulous of evidence that happened to support my side.” She reconsidered the studies she planned to include in her book and decided to omit most of them.
The Outsider Test: If you’ve ever asked, “What would X do?” in a given set of circumstances, you’ve used the outsider test. “Imagine someone else stepped into your shoes—what do you expect they would do in your situation?” That someone could be a hero, friend, family member, or even a rival.
The Conformity Test: “If I find myself agreeing with someone else’s viewpoint, I do a conformity test,” writes Galef. “Imagine this person told me that they no longer held this view. Would I still hold it? Would I feel comfortable defending it to them?”
The Status Quo Bias Test: Suppose you’re considering whether to change careers, living situations, romantic partners, or whatever. Now imagine you’ve already made the change; you’re in the new job, apartment, country, or relationship. If that were your new status quo, would you then actively choose your actual current situation? “If not, that’s a sign that your preference for your situation [supposing you prefer your current situation] is less about its particular merits and more about a preference for the status quo.”
The Ideological Turing Test: “The ideological Turing test, suggested by economist Bryan Caplan, is ... a way to determine if you really understand an ideology: Can you explain it as a believer would, convincingly enough that other people couldn’t tell the difference between you and a genuine believer?”
4. Enjoy the emotional rewards of scout mindset
A leitmotif of our age is that a clear-eyed reckoning of the facts is often too demoralizing; we must at least mildly distort our view of the world to stay sane. In their book, Mistakes Were Made (But Not by Me), psychologists Carol Tavris and Elliot Aronson explore self-justification, a type of motivated reasoning. Although they focus on the many downsides of self-justification, they ultimately conclude that we all need some of it because “Without it, we would prolong the awful pangs of embarrassment. We would torture ourselves with regret over the road not taken or how badly we navigated the road we did take.” Nobel-winning psychologist Daniel Kahneman likewise argues that motivated reasoning increases our resilience.
It’s true that we often dislike learning that we are wrong. But, Galef argues, we need not lie to ourselves to buoy our emotions. There are good reasons to get over that initial shock and embrace what Adam Grant calls “the joy of being wrong.” There is beauty in encountering well-founded opposition to our ideas. Each honest objection is an opportunity to go deeper into the nuances of the truth, to spot new causal relations, caveats, and complexities. Whatever the short-term consequences, the long-term value of accurate thinking is greater.
Galef identifies useful tools for overcoming the urge to dig in our heels when we might actually be in the wrong. For instance, it’s easier for a person to seriously consider that he might be wrong about something if he has a plan for what he’d do if he were suddenly certain he was wrong. “It’s striking how much the urge to conclude ‘That’s not true’ diminishes once you feel like you have a concrete plan for what you would do if the thing were true,” she writes. “Your ability to see clearly is precious, and you should be reluctant to sacrifice it in exchange for emotional comfort. The good news is that you don’t have to.”
5. Rethink identity
It was an essay by tech entrepreneur Paul Graham, titled “Keep Your Identity Small,” that course-corrected Galef out of a PhD program in economics and onto her current path. Graham argued that “people can never have a fruitful argument about something that’s part of their identity” because “by definition they’re partisan.” Further, “When people say a discussion has degenerated into a religious war, what they really mean is that it has started to be driven mostly by people’s identities,” not their reason. The implication, Graham argued, is clear: “If people can’t think clearly about anything that has become part of their identity, then all other things being equal, the best plan is to let as few things into your identity as possible.”
Indeed, putting more stock in one’s “side” than in the truth seems to be another hallmark of the times. Consider former Trump press secretary Kayleigh McEnany’s motto for her office: “Offense only.”
Here, Galef raises a deeper philosophical question that lies beyond the scope of The Scout Mindset. According to Galef and Graham, if something is personal—relating to the core of who we are—we can’t be objective about it. But what if we are personally invested in being scouts? This conundrum is leveraged by advocates of identity politics and other forms of collectivism to dismiss such things as scout mindset, objectivity, and rationality.
Recall, for instance, the chart released in 2020 by the National Museum of African American History & Culture, titled “Aspects and Assumptions of Whiteness and White Culture in the United States”:
“White dominant culture, or whiteness,” it tells us, “refers to the ways white people and their traditions, attitudes, and ways of life have been normalized over time and are now considered standard practice in the United States.” Among these “traditions,” the chart lists “Emphasis on Scientific Method: Objective, rational linear thinking; Cause and effect relationships; Quantitative emphasis.”
If practitioners and advocates of scout mindset or “objective, rational linear thinking” don’t want their views dismissed as mere prejudice or the cultural debris of their race or income level, they need answers to more fundamental questions about existence, human nature, and truth. For instance, should they buy into the supposed conflict between personal values and objectivity? “Most people ... think that abstract thinking must be ‘impersonal,’” wrote Ayn Rand:
—which means that ideas must hold no personal meaning, value or importance to the thinker. This notion rests on the premise that a personal interest is an agent of distortion. But “personal” does not mean “nonobjective”; it depends on the kind of person you are. If your thinking is determined by your emotions, then you will not be able to judge anything, personally or impersonally. But if you are the kind of person who knows that reality is not your enemy, that truth and knowledge are of crucial, personal, selfish importance to you and to your own life—then, the more passionately personal the thinking, the clearer and truer.
In other words, “truth isn’t in conflict with your other goals,” as Galef puts it, and the more tightly you’ve embraced that principle, the more you’ll feel the personal importance of accuracy. Darwin’s thinking encapsulated his life’s work and certainly had great personal importance to him. That’s what drove him to get things right. Likewise, it’s what drove Galef to spend five years writing her book, and to rewrite much of it after throwing out supporting studies that failed to meet her standards for methodological quality. On the other hand, when a person has little personal stake in accuracy, he’s unwilling to expend the effort required to attain it (a key factor in the phenomenon that Bryan Caplan dubbed “rational irrationality”).
More deeply, is truth relative to one’s race, gender, sexual orientation, and waist size? Or is “truth” merely a “narrative” we use to gain “power”? Galef offers powerful ideas for keeping one’s thinking grounded in reality. They rest on common-sense assumptions about the nature of knowledge and our means of gaining it. But not only is common sense not common, it has long been under attack. Alone, The Scout Mindset could do wonders for people’s lives. But placed upon a philosophical foundation—one that defends and celebrates objectivity—it might be powerful enough to rehabilitate our entire culture.