Thinking, Fast and Slow - Daniel Kahneman
3-Sentence Summary:
We think in two systems: System 1 is fast, automatic, and intuitive; System 2 is slower, thoughtful, logical, and effortful; the two work together to make our decisions, hopefully while avoiding the biases, fallacies, and excessive narratives we often fall prey to.
Humans are not “Econs” (rational, selfish and unchanging economic decision-makers focussed only on the outcome): expected utility, regret, possibility effects, and certainty effects also factor into our decisions.
Our “two selves”, the experiencing self (you in the moment) and the remembering self (you when you think about your life as a story), work together to determine the quality of your life, with “experienced well-being” and your “life evaluation” being two separate but connected elements of it.
Notes:
We think in 2 systems
System 1: Thinking fast: Automatic, immediate, intuitive, often effortless, impulsive, usually our first reaction.
System 2: Thinking slow: Thoughtful, slow, often effortful, somewhat logical, usually interjects when system 1 can’t do that job and then usually has the final say, even though it is heavily influenced by system 1.
System 2 requires effort and discipline and has limited bandwidth; beyond that limit, ego depletion sets in (a decrease in our ability to maintain discipline). Tiredness and task difficulty speed this up, and performance on subsequent tasks can suffer as a result.
For this reason, the ability to use system 2 is often related to our ability to delay gratification.
Practice and increased skill can make things more automatic (delegating to system 1) decreasing the demand on system 2.
There are times when System 2 tasks don’t require so much effort, such as when we are in “flow” - a state of effortless concentration in which we lose our sense of time, self, and problems.
Priming - Words, images, sounds, etc. influence how we think and act. What are we (and society) exposing ourselves to? E.g. Hearing words associated with old people can make us walk more slowly. E.g. Smiling can make us answer questions more positively.
Cognitive ease - System 1 tends to take over when we’re happy, at ease, in a familiar environment, and clear on what’s happening. Even something as simple as presenting a message in an easy-to-read font makes it more believable. The opposite brings System 2 into play, and things get analysed more.
Narrative fallacy - We create narratives to represent the world more clearly and make it possible for us to operate (system 1), even if we are reducing logic (system 2). System 2 can be trained to recognise this and intervene when needed. Confidence in prediction is likely to simply be a result of a well-constructed narrative.
WYSIATI - “What you see is all there is” - We tend to forget about the information we don’t have and emphasise the importance of the information we do have.
Small samples tend to have more variability/unreliability, since larger samples even out the anomalies, but we tend to create a narrative around the anomaly and prioritise it.
Affect heuristic - When we don’t know how to answer a question or approach a situation, we substitute an easier question, usually one about how we feel. E.g. “Will X company survive?” becomes “Do I like the product/owner?”.
Anchoring - The first number we hear affects the answer we give to a subsequent question, or where we start with negotiation.
Availability bias - We search our minds for recent occurrences to inform our opinion on a matter. Recency, emotional impact, and a higher number of examples all lead to more buy-in (an availability cascade). Two plane crashes on the news increase our predictions of subsequent plane crashes, regardless of the overall statistics.
Adding detail to a scenario can make it seem more plausible while actually decreasing its probability (the conjunction fallacy). E.g. On its own, a man (of any age) dying of a heart attack will be rated as less likely than a man over 50 dying of a heart attack, even though the former includes the latter.
Striking visual examples are far more convincing than statistics (System 1 vs. system 2).
Regression to the mean - The further a result is from the mean, the closer the next one is likely to be to it. E.g. The biggest pup of a litter is likely to have offspring smaller than itself. E.g. After an unusually good sales month, expect the next month to be closer to your long-run average. We often assume the opposite.
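Regression to the mean can be sketched with a toy simulation (all numbers here are invented for illustration): monthly sales are a fixed “skill” level plus random luck, so the best month of a year overstates skill, and the month after it falls back toward the long-run average.

```python
import random

random.seed(0)  # deterministic run, for illustration only

def month_sales(skill, luck_sd=30):
    """One month's result = stable skill + random luck."""
    return skill + random.gauss(0, luck_sd)

skill = 100  # long-run mean
trials = 10_000
best_months, following_months = [], []
for _ in range(trials):
    year = [month_sales(skill) for _ in range(12)]
    best_months.append(max(year))                 # selected partly for luck
    following_months.append(month_sales(skill))   # the month after the best

print(sum(best_months) / trials)       # well above 100
print(sum(following_months) / trials)  # back near 100: regression to the mean
```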
Because system 1 is easily fooled by these biases and heuristics, we should look at the likelihood of something being true (base rates) (system 2), rather than trusting our feelings (representativeness) (system 1).
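The base-rate point can be made concrete with Bayes’ rule, using the book’s well-known cab problem (85% of the city’s cabs are Green and 15% are Blue; a witness who is right 80% of the time says the cab was Blue):

```python
def posterior(base_rate, hit_rate, false_alarm_rate):
    """P(cab is Blue | witness says Blue), via Bayes' rule."""
    true_positive = base_rate * hit_rate                  # Blue, and called Blue
    false_positive = (1 - base_rate) * false_alarm_rate   # Green, but called Blue
    return true_positive / (true_positive + false_positive)

p = posterior(base_rate=0.15, hit_rate=0.80, false_alarm_rate=0.20)
print(round(p, 2))  # 0.41: far below the 80% that representativeness suggests
```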
Where we can replace human judgements with a logical formula for decision making, strongly consider it.
Human expert intuition can be trusted in cases where the area is sufficiently regular/replicative, there is a feedback loop, and there has been enough time to learn from it.
We should take an “outside view” using previous cases/data rather than falling to a “planning fallacy” or inside view that is usually a best-case scenario.
Optimism is needed for entrepreneurs etc., but ventures fail more often than not, which suggests the optimism is often over-confidence. That doesn’t mean you shouldn’t act, but you should be aware of the risks.
Humans are not Econs (rational, selfish, consistent decision-makers, as they’re often thought to be by economists). Our decisions are also majorly affected by emotions.
Expected utility theory - We don’t see absolute risks/payoffs; we see the extra potential utility to us. E.g. £100 means something different to someone who has £5 than to someone who has £500. Useful, but not sufficient.
Prospect theory - A pure expected-value calculation (90% chance of winning £100 = £90 on average; 100% chance of winning £90 = £90) treats those two gambles as equivalent, but they feel very different, because the calculation ignores human emotion (disappointment, regret, certainty, etc.). Prospect theory corrects for this: we evaluate outcomes as gains and losses from a reference point and weight probabilities the way we feel them, not as they are.
Also, the prospect of a loss affects us more negatively than the same gain does positively, leading to loss aversion.
The further a gain/loss is from our baseline, the less extra utility each additional pound brings (diminishing sensitivity). E.g. For most people, the difference between £1,000 and £2,000 feels much bigger than the difference between £100,000 and £101,000.
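A concave value function captures this diminishing sensitivity. The log curve below is only an illustrative stand-in, not Kahneman and Tversky’s actual value function (theirs is also steeper for losses than for gains):

```python
import math

def value(wealth):
    """Illustrative concave value function: equal ratios feel equal."""
    return math.log(wealth)

gain_low = value(2_000) - value(1_000)       # a £1,000 gain from a £1,000 base
gain_high = value(101_000) - value(100_000)  # the same £1,000 gain, higher base
print(round(gain_low, 3), round(gain_high, 3))  # 0.693 vs 0.01
```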
Decision weights add another layer; their effects become visible at the extremes of probability:
Due to the “Certainty Effect”, we are:
Risk-averse in potential gains when the probability is high (moving from 95% chance to 100% chance) - willing to forego some potential extra pay-off to go from a maybe to a definite (take lower settlement than expected rather than going to court, even if likely to win).
Risk-averse in potential losses when the probability is low (moving from 5% chance to 0% chance) - willing to pay extra to give certainty of no loss (buying insurance).
Due to the “Possibility Effect”, we are:
Risk-seeking in potential gains when the probability is low (moving from 0% chance to 5% chance) - willing to pay extra to go from no chance to some chance (buying a lottery ticket).
Risk-seeking in potential losses when the probability is high (moving from 100% chance to 95% chance) - willing to pay extra to go from a definite loss to some chance of avoiding it (rejecting a good settlement in court in a last-ditch attempt to avoid paying, even when you’re likely to pay a lot whether you’re found guilty or take the settlement).
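Both effects fall out of a single inverse-S-shaped probability-weighting curve. The sketch below uses the weighting function from Tversky and Kahneman’s 1992 cumulative prospect theory paper, with γ = 0.61 (their estimated parameter for gains):

```python
def decision_weight(p, gamma=0.61):
    """Inverse-S probability weighting (Tversky & Kahneman, 1992)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Small chances get over-weighted (possibility effect),
# near-certainties get under-weighted (certainty effect).
print(round(decision_weight(0.05), 2))  # 0.13, well above 0.05
print(round(decision_weight(0.95), 2))  # 0.79, well below 0.95
```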
If these scenarios can be repeated, we should take an outside view across the repetitions, to avoid foregoing more money than we should have, or missing out on potential gains.
The possibility of regret is weighted more than the possibility of gain.
For this reason, we over-weight the probability of negative rare events, because we don’t want to regret having under-reacted.
We also tend not to veer from our usual actions: a bad outcome following a deliberate change causes more regret than the same bad outcome after sticking to our normal routine.
We shouldn’t fall into the “sunk-cost fallacy”, basing future decisions on trying to correct the regret of previous poor decisions. We should ask what the right decision is going forward.
If we’re able to remind ourselves of the fact that we knew there were risks before going in, regret can be lessened.
When we see individual cases, we’re more likely to use System 1 (emotional); when we compare cases, we engage System 2 (more rational). E.g. Donate to dolphins vs. donate to farmers: asked individually, people donate more to dolphins; asked together, people donate more to farmers. We should aim not to see situations in isolation, but to compare them to the bigger picture.
The exact same final outcome/risk will elicit a different response if it is framed as a win/keep/lose, opt-in/opt-out etc. We can use this to our advantage in framing our decisions, but should be wary of it being used against us.
There are two selves:
The experiencing self - you in the moment, who you are as you are doing things.
The remembering self - you when you think about your life as a story.
We rationally say we want pleasure to be long-lasting and pain to be short-lived, but our decisions are dominated by the most intense moment (the peak) and the feeling as the experience was ending (the peak-end rule), while duration barely affects them (duration neglect). E.g. A cold-water challenge that lasts longer but ends with warm water will be chosen over a shorter challenge with no warm water at the end.
In this way, we prioritise our remembering self, allowing ourselves to experience more total pain, as long as the peak is lower, and the ending is pleasant.
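The cold-water result can be sketched numerically (the pain scores are invented for illustration): memory rates an episode by the average of its peak and its end, largely ignoring duration.

```python
def remembered_pain(pain_by_second):
    """Peak-end rule: average of the worst moment and the final moment."""
    return (max(pain_by_second) + pain_by_second[-1]) / 2

def total_pain(pain_by_second):
    """What the experiencing self actually went through."""
    return sum(pain_by_second)

short_trial = [8] * 60            # 60s of cold water at pain level 8
long_trial = [8] * 60 + [5] * 30  # same, plus 30s of slightly warmer water

print(total_pain(long_trial) > total_pain(short_trial))            # True: more total pain
print(remembered_pain(long_trial) < remembered_pain(short_trial))  # True: remembered as less bad
```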
In the same way, a higher life evaluation is not simply the sum of individual moments of experienced well-being (i.e. it is not about “feeling happy all the time”). E.g. Recent positive events, having kids, more money, or a better education can all raise your evaluation of your life (remembering self) even if your day-to-day experienced well-being (experiencing self) isn’t always positive. There is still a balance between the two selves, and happiness isn’t simple; we need to look after both, because each affects the other.