I sometimes see other people make mistakes in their thinking. They misinterpret evidence, draw mistaken conclusions, ignore key facts. They do this entirely unknowingly. They are not aware of their mistakes until they've stepped on the rake of reality and it has hit them in the face. It does not take a large leap of imagination, however, to believe that if other people make unconscious mistakes, so do I.
Daniel Kahneman, who won the Nobel Prize in Economic Sciences for his work in psychology, wrote Thinking, Fast and Slow to alert readers to the challenge of thinking clearly. For example: "You see a person reading The New York Times on the New York subway. Which of the following is a better bet about the reading stranger?
"She has a PhD.
"She does not have a college degree."
Kahneman argues convincingly that we humans have two systems of thinking. One is quick, intuitive, and emotional; he calls this System 1. The other is slower, more deliberative, and more logical: System 2. Most of us have to use System 2 to compute the product of 17 times 24. Because System 1 is fast and intuitive, it is prone to mistakes. In the above example, everyone who intuitively guesses that the Times reader has a PhD is mistaken. When you (reflectively) consider how many people have PhDs and how many people ride the New York subways, you realize the better bet is that she does not have a college degree.
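The base-rate argument can be made concrete with a little Bayesian arithmetic. This sketch uses purely illustrative numbers (none come from the book): assumed base rates for PhD holders and non-degree holders, and an assumed likelihood that each group reads the Times on the subway.

```python
# Rough Bayesian sketch of the subway-reader bet.
# All numbers are illustrative assumptions, not figures from the book.

base_rate_phd = 0.02        # assume ~2% of riders hold a PhD
base_rate_no_degree = 0.55  # assume ~55% lack a college degree

# Even if a PhD holder is 10x more likely to be reading the Times...
p_reads_given_phd = 0.10
p_reads_given_no_degree = 0.01

# Unnormalized posterior weights (Bayes' rule; the shared
# normalizing constant cancels when comparing the two bets).
weight_phd = base_rate_phd * p_reads_given_phd                    # 0.002
weight_no_degree = base_rate_no_degree * p_reads_given_no_degree  # 0.0055

better_bet = "no degree" if weight_no_degree > weight_phd else "PhD"
print(better_bet)  # the base rate dominates: "no degree"
```

Under these assumed numbers, the sheer number of non-degree riders outweighs the stronger per-person likelihood for PhD holders, which is exactly the base-rate point System 1 tends to miss.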
Thinking, Fast and Slow is filled with examples like this to illustrate how the way our minds work can trip us up. The book reports on dozens of experiments that test these effects. Kahneman and his friend and collaborator Amos Tversky once rigged a wheel of fortune so that it would stop only at 10 and 65. They would spin it and ask unsuspecting subjects to write down the number at which it stopped. They then asked: "Is the percentage of African nations among UN members larger or smaller than the number you just wrote? What is your best guess of the percentage of African nations in the UN?"
A wheel of fortune has nothing to do with anything. The participants should have ignored it. But they didn't. "The average estimates of those who saw 10 and 65 were 25% and 45% respectively." Why? Because of an "anchoring effect," which Kahneman explains clearly, with more examples from daily life.
For a book filled with the results of psychological experiments and some fairly sophisticated statistical theory, it is remarkably clear. Kahneman suggests where we can and cannot trust our intuitions (pace Malcolm Gladwell) and how we can tap into the benefits of slow thinking. We can never avoid mental glitches entirely, but with work we can reduce them and, perhaps in situations where it really counts, take the time to let the lazy System 2 do its work.