Summary: Thinking, Fast and Slow by Daniel Kahneman

Quick summary

Thinking, Fast and Slow is a detailed look at how humans act irrationally, how to recognize where and when we are doing so, and some basic steps we can take to overcome it.

The book is written by Daniel Kahneman, an Israeli-American psychologist who has won the Nobel Prize in economics.

Thinking, Fast and Slow is not an easy read and is dense with theory. However, it tackles some genuinely complex ideas with mind-blowing implications and presents them as simply as they can be presented without talking down to the reader or diluting the meaning.

This is a very interesting read even if it is longer than it needs to be at almost 500 pages.

Extended summary

Thinking, Fast and Slow is a book by Israeli-American psychologist Daniel Kahneman, published in 2011.

Despite being a psychologist, Kahneman won the Nobel Prize in economics. The prize was for his work on Prospect Theory with his long-time collaborator, Amos Tversky. The story of how the pair collaborated, and the impact of their work in changing how we view the human mind's capacity for judgment and decision making, is covered in a Michael Lewis book, The Undoing Project.

Most of Thinking, Fast and Slow covers the work the pair did together but Kahneman also goes on to discuss some research he did on his own that looks at happiness.

The book is split into five sections covering an overview of the two selves, heuristics and biases, overconfidence, choices, and a further exploration of the two selves in conjunction with Kahneman's work on happiness.

But the overarching theme running through the book is human irrationality. This is particularly interesting because Kahneman does not go as far as to say that humans are irrational; he says that humans do things and make choices that are irrational. The entire concept of the book is to look at these situations where humans act irrationally, to try to identify them, and, where possible, to provide solutions that help reduce this irrational behavior.

In section 1, Kahneman introduces the concept of the two selves. Kahneman calls them System 1 and System 2.

System 1 is unconscious. It is the part of the brain that identifies where a sound comes from or how far away an object is. 

System 2 is slow and methodical. It is conscious. For example, when you are in a social situation and you are trying to decide the appropriate behavior – that is System 2. You may be evaluating if it is appropriate to use a curse word, or if you should call the neighbor Bob or Mr. Jones.

System 2 is also used when you are trying to complete a difficult task such as parking a car in a tight space in a multi-story car park or when you are taking down someone’s phone number.

Essentially, System 1 is fast. It uses instinct and metaphors to cut through all the noise and provide System 2 with a set of information that it can use to make decisions.

System 2 draws on this information, weighs it up, and makes a decision based upon it. 

In section 2, Kahneman discusses heuristics and biases. Heuristics are mental shortcuts that the brain uses to make decisions more quickly and our biases occur because these shortcuts stop us from seeing the full picture.

Kahneman argues that heuristics and biases are the reason that human beings struggle to think statistically. He goes on to lay out a number of biases that he has discovered during his time working with Amos Tversky.

The anchoring effect is where our decisions are impacted by numbers that have nothing to do with the problem that we are trying to solve.

For example, in one experiment people were asked how old they thought Gandhi was when he died. However, the question was posed in two different ways. People who were first asked whether Gandhi was 114 years or older when he died tended to guess a higher number than the group that was first asked whether he was 35 years old or younger when he died.

Similarly, there was an experiment in which German judges were told to roll loaded dice before passing sentence on a criminal. The judges whose dice were loaded to give a high number tended to pass longer sentences than the judges whose dice were loaded to give a low number. A completely irrelevant number had infiltrated their thinking while they were making a crucial and entirely unrelated decision.

The availability heuristic is another mental shortcut. It states that the more easily examples or consequences of something come to mind, the more likely or important we judge it to be. The most common example is that of flying versus driving. A person is far more likely to die in a car accident than in a plane accident.

However, whenever a plane goes down the consequences are obvious: everyone on board tends to die. As such, more people are scared of flying than of driving. Looked at statistically, this is irrational.

The conjunction fallacy is where we judge a specific combination of conditions to be more probable than a single, more general condition on its own. It is illustrated by Kahneman and Tversky's most famous experiment, known as "The Linda Problem."

In the Linda Problem, subjects are given a description of Linda, a student who is very socially aware, attends protests, and so on along these lines.

The subjects are then asked what is more likely:

  • Linda is a bank teller, or
  • Linda is a bank teller who is a feminist

Because of all the additional information, it was clear to everyone that Linda was probably a feminist if she attended protests and believed in social justice, so most people chose the second option.

The problem is that this is wrong. Bank tellers who are feminists are a subset of all bank tellers, so the second option can never be more probable than the first, no matter how well the description fits.
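The subset argument can be checked with a quick sketch. This is not from the book; the population counts below are invented purely for illustration, and only the subset relation matters:

```python
# Toy illustration of the conjunction rule behind the Linda Problem.
# The counts are hypothetical; the point is that the feminist bank
# tellers are counted inside the bank tellers, so the conjunction
# can never be more probable than the single event.
population = 1000
bank_tellers = 50              # hypothetical count of bank tellers
feminist_bank_tellers = 40     # a subset of the bank tellers above

p_teller = bank_tellers / population                         # 0.05
p_teller_and_feminist = feminist_bank_tellers / population   # 0.04

# P(teller AND feminist) <= P(teller), always.
assert p_teller_and_feminist <= p_teller
```

However the invented numbers are varied, the subset can never outgrow the set that contains it, which is exactly why the second answer in the Linda Problem is wrong.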

This additional information had acted as a red herring and caused the subjects – many of whom were well versed in statistics – to make a choice that was statistically illiterate. 

Kahneman mentioned that he believed optimistic bias may be the “most significant” of all of the biases that he has outlined in Thinking, Fast and Slow.

This is essentially where we fail to plan for unforeseen risks or circumstances. When planning, Kahneman discusses a framework of what we know and don’t know. 

There are the known knowns. This is what we know that we know. 

There are known unknowns. These are risks that we are aware of and can plan for accordingly. For example, if you are going to the airport to catch a flight, you may get an earlier bus than the one you need, just in case there is a delay.

Then there are unknown unknowns. These are things that are completely outside of our realm of knowledge or understanding.

Optimistic bias is best shown by the planning fallacy, where we tend to overestimate the benefits of a project and underestimate its costs. The most common example of this is Americans redesigning their kitchens.

Americans planning to redesign their kitchens budgeted around $18,000 to $19,000 on average. The actual average cost of a kitchen redesign at the time of the research was between $38,000 and $39,000.

The framing effect is where given the same information, we make different decisions based on how the question or problem is presented. For example, take an operation in a hospital where the patient has a 10% risk of death and a 90% chance of survival.

When patients were told they had a 90% chance of survival from the operation, they were far more likely to go ahead and get the operation than they were if they were told there was a 10% chance of death.

These two options mean the exact same thing. However, the framing of the question changes the decision we make.

The sunk cost fallacy is where people throw good money after bad. Even after a project has proved to be a failure, someone who has already invested significant resources into it will tend to keep investing rather than call it quits. It is thought that this is to avoid regret.

In part 3 of the book, Kahneman goes on to expand upon overconfidence. The main gist of his teaching is that people overestimate what they know about the world and underestimate the role of chance.

In part 4, Kahneman discusses choices. This part expands upon the work for which he won the Nobel Prize – Prospect Theory.

In section 5, Kahneman goes into more detail about the two selves. He calls them the remembering self and the experiencing self. 

This section is Kahneman’s own work that he went on to do after the death of his long-term academic partner, Amos Tversky. It looks at happiness and the difference between the happiness we experience during an event compared to the way that we remember the happiness of the event.

Kahneman’s main point is that we do not remember an event how we experienced it. We remember the highest or lowest point of the experience and how it ended. Kahneman states:

“Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.”

Tone of the book

Thinking, Fast and Slow is a complex book that is full of deeply researched content developed over the lifetime of two of the world’s great minds.

It has been distilled into a book that is full of self-help value.

Although it appears to be completely centered on the science, the book is entertaining and Kahneman manages to tug on the heartstrings when discussing certain topics – specifically his relationship with Amos Tversky.

What do readers say about Thinking, Fast and Slow?

Far-reaching implications

Readers mentioned that the book explains simple concepts in great detail and that these concepts have extremely wide-reaching implications for how we think and how aware we are of the way we make decisions.

Despite being aware of the problems, specifically the biases, Kahneman has stated that he himself has not made a great deal of progress in overcoming them. Some people have therefore challenged the practical application of this section of the book.

Shady experiments that cannot be replicated

A number of readers discussed the replication crisis in psychology and pointed out that several of the earlier experiments Kahneman and Tversky undertook have failed to replicate in recent years.

Others mentioned an experiment where a group of patients were given a colonoscopy that lasted longer than usual and was designed to create additional discomfort to test Kahneman’s theory of the experiencing self versus the remembering self. 

Users mentioned this experiment was unethical. 

They stated that the combination of the replication failures and the ethical concerns undermined the rest of the book.

Audiobook review

The audiobook for Thinking, Fast and Slow is narrated by Jonathan Todd Ross. It is 13 hours and 28 minutes long.

Jonathan Todd Ross was credited with doing a good job with the narration. He managed to keep listeners engaged, even when describing detailed experiments, while remaining clear throughout, which was appreciated.

However, some listeners did mention that some of the book would have been easier to follow in book form than audiobook form and that they had to rewind and relisten to entire sections at a time.

Should I read Thinking, Fast and Slow?

Pros

  • Changes how you take in information
  • Forces you to think more deeply before making decisions

Cons

  • Some experiments undermine the rest of the content

Thinking, Fast and Slow is not an easy read but it is a worthwhile read.

At almost 500 pages long, it can start to get slightly repetitive towards the end. But Kahneman manages to distill a lifetime of work into a single book that has influenced the way the world now uses data, statistics, and algorithms to try to remove human bias from decision making, especially in the field of talent identification.