Step 3: Look Inside the Mind

“People generally see what they look for, and hear what they listen for.”

–Harper Lee, To Kill a Mockingbird

The quick and easy part of the journey is done. It’s not hard to understand why viewpoint diversity is good for you, and you have already heard many times that we all need to be more humble and less judgmental. But why is it so difficult for most people to act on these insights? Why is it so hard to apply them during an argument with friends and family—to say nothing of conflicts with people who are more distant from us?

A little bit of psychology can explain why. It flows from the insight that the mind is divided into parts or systems that can work independently and that sometimes conflict. In his book Thinking, Fast and Slow, Daniel Kahneman calls them “System 1” (fast, automatic, intuitive thinking) and “System 2” (slow, language-based reasoning). We’ll also refer to these systems using the metaphor that the mind is like a small rider (System 2) perched on the back of a very large elephant (System 1). The rider may think he is in control, but the elephant is much more powerful, and sometimes smarter too.

This is easy to see when you engage in a moral or political argument with someone else. You can see that the other person is emotionally committed to a certain conclusion (that’s the elephant choosing a side instantly), and you can see that the person’s arguments are mostly just post-hoc rationalization (that’s the rider, who is good with words). This is why such arguments are usually so frustrating—no matter how good your reasons, your opponent will not change her mind. And, of course, she thinks the same about you. Both of you are engaging in “motivated reasoning.” Both of you are searching for evidence to confirm what you already believe (as you’ll see in the videos below).

It gets worse. We have an unfortunate tendency to be “Manichaeans,” which means that in moral and political disputes we see the world as a battle between good (our side) and evil (the other side). We fall prey to “naïve realism”—we think the moral truth is as obvious, as “real” as whether it is sunny or raining. Anyone who claims not to see the same moral facts as we do must be either stupid or dishonest—which justifies our hostility and rejection of them. And because everyone else on each side is just as Manichaean and just as judgmental, there is a great deal of social benefit to be had from “virtue signaling.” Much of what people say during a moral argument is not really intended to change the other person’s mind; rather, we’re trying to reassure our teammates that we are good team members.

The reading below the videos will explain how naïve realism works, and how to escape from it to understand the other side.

Our Picks
Recommended Videos:

  • The first video explores the two systems in the brain and the difference between fast and slow thinking (4:40).
  • The second video describes the psychological phenomenon of “confirmation bias” and how it affects our thinking (1:34).

Recommended Essay:

  • 7 psychological concepts that explain the Trump era of politics (2017) by Brian Resnick. Resnick explores seven key psychological lessons that help make sense of the unusual nature of our current political environment, covering topics including motivated reasoning, tribalism, and moral foundations theory.
Essays
  • The Faults of Others, adapted from The Happiness Hypothesis (2006) by Jonathan Haidt. Weaving together insights from ancient wisdom and modern psychological research, moral psychologist Jonathan Haidt explores why we’re so good at identifying the faults of others but so bad at seeing the faults within ourselves. Haidt explains that there are specific cognitive processes that “predispose us to hypocrisy, self-righteousness, and moralistic conflict.” He concludes that “sometimes, by knowing the mind’s structures and strategies, we can step out of the ancient game of social manipulation, and enter into a game of our choosing.”
  • The certainty epidemic (2008) by Robert Burton. Neurologist Robert Burton explores the notion of “certainty” and explains that modern biology indicates that “Feeling correct or certain isn’t a deliberate conclusion or conscious choice.” Instead, “feeling certain” is “a mental sensation that happens to us.”
  • I’m Right! (For Some Reason) (2012) by Steven Sloman and Philip M. Fernbach. Professors Sloman and Fernbach discuss psychological evidence that human beings suffer from the “illusion of explanatory depth.” They argue that “We have a problem in American politics: an illusion of knowledge that leads to extremism. We can start to fix it by acknowledging that we know a lot less than we think.”
  • WIRED will now predict your political views (you naïve thing) (2016) by Lee Ross and Thomas Gilovich. Psychologists Ross and Gilovich describe the concept of “naïve realism,” the mistaken sense that we see the world as it objectively is, rather than as a subjective interpretation based on our own experiences and perspectives.
  • I’m O.K., You’re Biased (2006) by Daniel Gilbert. Psychologist Daniel Gilbert presents research suggesting that decision-makers are often unaware of their vulnerability to bias. Gilbert explains that “The human brain knows many tricks that allow it to consider evidence, weigh facts and still reach precisely the conclusion it favors…. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn’t, we subtly tip the scales in our favor.”
  • Confirmation Bias (2010) by David McRaney. McRaney describes the psychological research on “confirmation bias,” the human tendency to seek out information that confirms the beliefs we already hold and avoid opinions or evidence that contradicts our beliefs.
  • Reason Seen More as Weapon Than Path to Truth (2011) by Patricia Cohen. The article describes Hugo Mercier and Dan Sperber’s groundbreaking “argumentative theory of reasoning,” which challenges the long-held belief that human beings evolved the capacity to reason in order to arrive at truth. Instead, the theory suggests that reason evolved for the social purpose of helping individuals win arguments. The theory helps account for the irrationalities and biases of the human mind.
  • Inside the Political Brain (2012) by Chris Mooney. The author traces research developments on the human mind that reveal the flaws in our reasoning abilities. Mooney discusses concepts such as “confirmation bias” and “motivated reasoning” that explain our tendency to notice facts that support our beliefs and overlook information that contradicts them. Similarly, research reveals that we are quick to see the mistakes of others but blind to our own biases and lapses in rational judgment. Researchers argue that these “quirks” in reasoning suggest that our minds evolved to provide excellent justifications for our pre-existing positions, rather than to identify objective truth.
  • Tribalism, Groupism, Globalism (2013) by E.O. Wilson. Biologist E.O. Wilson examines our tribal nature, citing social psychology research that “reveals how swiftly and decisively people divide into groups, and then discriminate in favor of the one to which they belong.”
Videos
  • Dan Ariely, Are we in control of our own decisions?, TED Talk (17:14). Acclaimed behavioral economist Dan Ariely presents a series of visual illusions and surprising experimental research to demonstrate that we are not as rational as we think.

  • Julia Galef, Why you think you’re right — even if you’re wrong, TED Talk (11:37). Writer Julia Galef presents a metaphor for two different types of mindsets that people can have when evaluating information. Galef explores how these differences in mindset can influence our conclusions.

If you have ideas for how to use the Viewpoint Diversity Experience in your school or company, or if you want to suggest additional resources, email us at: viewpointdiversity@heterodoxacademy.org.