Notes from Thinking Fast and Slow

This post is a collection of insightful concepts and statements I found in Daniel Kahneman’s Thinking Fast and Slow. Apologies for the lack of coherence towards the end – the sheer volume of the book meant I ended up just jotting down the meat of the matter without any context or detail.

  • Fundamental premise of the author’s work: Economics says humans are mostly rational and their thinking is sound. The departures from rationality only occur due to emotions such as fear, affection and hatred. The author documents systematic errors in the thinking of humans – these are errors due to how our cognitive machinery is designed, not a corruption of our thoughts by emotions.
  • Expert intuition: Thousands of hours of practice in anything (mostly your work) in a controlled environment with a good feedback loop can set you up to make good decisions just by “blinking” instead of having to “think”. A good example is the firefighter anecdote, where an experienced firefighter ordered his team out of a house with no visible signs of danger; only in hindsight did he realize that his subconscious had been processing subtle danger signs related to smoke and fire. Note that this only works in a controlled environment with no underlying randomness (every fire is different, but there is an underlying pattern to the kinds of houses and fires they deal with, and to how materials react to fire), as opposed to environments with fundamental randomness, such as stock markets. A stock trader doesn’t have a reliable, non-random feedback loop – so he cannot develop such “blink”-style judgement expertise.
  • Notion of two Systems: A human mind has two systems – System 1 is intuitive, guided by associative memory (and hence affected by feelings) and FAST. System 2 is deliberate, responsible for complex logic, reasoning and calculations, and any kind of decision-making that needs effort, but it is also SLOW and LAZY. Much of the book focuses on understanding these two systems, how most humans rely on System 1 more than they should, and the errors arising from biases wired into System 1. System 1 is flawed, yet it is also very good at constructing coherent stories.
  • Notion of two selves: The “experiencing self” and the “remembering self”. Human memories are not perfect reconstructions of reality. Humans are guided by the remembering self and this makes them expose themselves to unnecessary pain.
  • Limited attention power of humans: As demonstrated in the selective attention experiment.
  • “We can be blind to the obvious, and we are also blind to our blindness.”
  • Illusions: The Müller-Lyer illusion is a famous example of how our cognitive machinery is flawed (even after you know the two lines are equal, you still see one as longer). Not all illusions are visual – many are cognitive, and those are the costly ones.
  • Example of a cognitive illusion: Psychotherapists often have a strong attraction for a patient with a repeated history of failed treatments, thinking that they may be the ones to succeed in curing him. (Hint: the patient is a psychopath!).
  • Flaws of System 2: System 2 is lazy and reluctant to invest more effort than is strictly necessary. It believes it has chosen its thoughts and actions, but these choices can, in reality, be guided heavily by System 1. The concept of priming is an example of this.
  • On pupils and psychology: The pupil of the eye is a window into one’s soul – if you perform slightly non-trivial calculations at a regular rhythm with a camera focused on your eye, it will record very regular pupil dilation and contraction events. Pupil dilation is an indication of mental effort – pupils contract immediately when a person gives up or finds the solution.
  • On why training matters for System 2: Unlike a house’s circuit breaker, which cuts off everything in case of overload, System 2 will focus its attention on the most important activities and let go of the rest. The selective attention test mentioned above is proof of that. With training and over time, as you get more and more familiar with a task, fewer brain regions are involved.
  • State of flow: Flow is a state of effortless concentration, so deeply focused on whatever one is doing at the moment that one loses the sense of self and all one’s problems. Flow leads to optimal experiences. All variants of voluntary effort – cognitive, emotional or physical – draw at least partly from a shared pool of mental energy.
  • “Ego depletion”: This demonstrates the “shared pool of mental energy” concept. People instructed to stifle their emotional reaction to an emotionally charged film will later underperform on a test of physical stamina. Another example – studies show that the proportion of favorable parole rulings from judges rises in the short period after lunch. Tired and hungry judges tend to fall back on the easier default of denying requests for parole.
  • Over-riding intuition takes hard work: The insistent idea that “it’s true, it’s true” coming from System 1 makes it difficult to check the logic. Lots of mental effort goes into training your System 2 to kick in more often, whenever needed, but it’s worthwhile. Some lucky people are governed more by their System 2; most of us are governed more by our System 1.
  • Strong link between self-control and cognitive aptitude: The Stanford marshmallow experiment is a good example of this.
  • Concept of Priming: If you saw the word EAT and then saw SO_P, you are more likely to complete it as SOUP. If you saw WASH and then saw SO_P, you are more likely to complete it as SOAP. The experience you are subjected to (reading a word) triggers a portion of your associative memory, and so your retrieval is now biased.
  • Priming as applied to experiences: Priming is not just about words, ideas and thoughts. In an experiment, young people who were asked to assemble sentences from scrambled words related to old age walked more slowly than the others when they had to walk down a hallway in part 2 of the experiment.
  • Influencing of an action by an idea – the ideomotor effect. Clasp your hands together with the index fingers of both hands pointing at each other, now think of the line joining the tips of the two fingers and watch your fingers twitch!
  • Reciprocal priming: Thoughts/ideas/words influence actions – that is priming. Reciprocal priming is the inverse: actions influence thoughts and emotions. Smiling naturally invokes positive, optimistic emotions and thoughts.
  • “Act calm and kind regardless of how you feel” – you are likely to be rewarded by actually feeling calm and kind.
  • Useful resources on priming: understanding how you are less in control of your actions than you think, and how priming affects your performance.
  • Money-priming leads to increased self-reliance – money-primed people (exposed to thoughts of money) persevered almost twice as long in trying to solve a very difficult problem. Money-primed people are also more selfish.
  • “They were primed to find flaws and this is exactly what they found.”
  • Cognitive ease – when things are going normally. Cognitive strain – when you need more help from System 2.
  • The cognitive ease inflow-outflow machine: Things that lead to cognitive ease are repeated experience, favorable conditions such as a clear display and a good font, a primed idea (e.g., being introduced to the concept through a subtle watermark in the background) and a good mood. What comes out of cognitive ease are the feelings “feels true”, “feels good”, “feels familiar” and “feels effortless”.
  • “Cognitive ease is good and recommended in some cases, but dangerous in other cases (where it prevents System 2 from kicking in and makes the wrong judgements via System 1).” As an example, the performance of Princeton grads on puzzles went up a notch when the font was worsened, causing cognitive strain and making System 2 kick in.
  • Cognitive Ease: Pros: More creative. Cons: Makes judgement errors due to familiarity.
  • Cognitive Strain: Pros: More vigilant, suspicious, invests more efforts. Cons: Feels less comfortable, leads to less intuitiveness/creativeness.
  • “Illusions of remembering”: You think David Steinbill (made-up name) is a celebrity just because the experimenters exposed you to his name in a casual setting before the experiment!
  • “Illusions of truth”: If you are put at sufficient cognitive ease by the mood and environment around you and by previous similar questions, you may agree with the statement “a chicken has 4 legs”. It takes a while for System 2 to kick in and tell you this is not true.
  • “Anything that makes the cognitive machine run smoothly will also bias beliefs”
  • “Familiarity is not easily distinguishable from truth”
  • “If a statement is strongly linked by logic or association to other beliefs or preferences you hold, or comes from a source that you trust or like, you will feel a sense of cognitive ease.” This is because the lazy System 2 will just accept the suggestions of System 1 and march on.
  • Awesome example of cognitive ease effects: Stocks with pronounceable tickers perform better than the tongue-twisting ones! (like PGX or RDO). 
  • Mind is biased toward “causal thinking” and doesn’t easily grasp statistics or regression to the mean. Good example of regression to the mean – ask a bunch of people to take 2 attempts at darts. The ones who did best in the first attempt will, on average, do worse (relative to themselves) in the second attempt. This causes instructors to (faultily) conclude that admonishing leads to better performance on the next attempt and praise leads to worse performance (a simulation sketch follows this bullet).
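A minimal simulation of the darts example (my own illustration, not from the book; the skill and luck parameters are arbitrary): each score is a fixed skill plus independent luck, so the best scorers on attempt 1 were partly lucky and, on average, fall back toward their true skill on attempt 2 – no praise or admonishment involved.

```python
import random

# Regression to the mean, simulated: score = stable skill + independent luck per attempt.
random.seed(42)
N = 10_000
skill = [random.gauss(50, 10) for _ in range(N)]
attempt1 = [s + random.gauss(0, 10) for s in skill]
attempt2 = [s + random.gauss(0, 10) for s in skill]

# Take the top 10% on attempt 1 and compare their averages across the two attempts.
top = sorted(range(N), key=lambda i: attempt1[i], reverse=True)[: N // 10]
avg1 = sum(attempt1[i] for i in top) / len(top)
avg2 = sum(attempt2[i] for i in top) / len(top)
print(f"Top performers, attempt 1 average: {avg1:.1f}")
print(f"Same people,    attempt 2 average: {avg2:.1f}")  # lower, purely because luck averages out
```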
  • Intensity Matching: Humans unknowingly transfer an evaluation from one situation to another (which merits a different way of thinking/evaluating) by applying “intensity matching”. Trivial example – give a long description of a school girl who is really sharp at reading and then ask the audience to predict her college GPA. Reading skill at age 7–10 is only weakly predictive of college GPA, yet the instinctive answers from the audience show “intensity matching” – an exceptional percentile in reading is translated into an equally exceptional GPA.
  • Common human fallacies while making predictions (or forming “intuitions” about situations): neglect of base rates, and insensitivity to the quality of information.
  • Intuitive predictions tend to be overconfident and overly extreme.
  • Moderating the extremeness of intuitive predictions is not always a good thing – example, venture capitalists need to call extreme cases like Google correctly, even at the cost of overestimating the prospects of many other ventures.
  • Consider the range of uncertainty around a most likely outcome. (KR note: This is reminiscent of Aswath Damodaran’s recent lecture at Google about valuation – where he defines a range of variation on every single parameter in his valuation, so his end prediction for the stock price of Apple is a large histogram)
  • The most valuable contribution of the corrective procedures that the author proposes to fix “wrong” intuitions is that they force one to think about how much one actually knows (a sketch of the procedure follows this bullet).
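The corrective procedure itself, as I understand it from the book, boils down to starting from the base rate and moving toward the intuitive, intensity-matched guess only in proportion to how predictive the evidence really is. A tiny sketch with made-up numbers (the average GPA, the intuitive guess and the correlation are all assumptions for illustration):

```python
# Correcting an intuitive prediction: regress it toward the mean.
average_gpa = 3.0      # base rate for the population (assumed)
intuitive_guess = 3.8  # intensity-matched guess from "she reads really well" (assumed)
correlation = 0.3      # assumed predictive power of early reading skill for college GPA

corrected = average_gpa + correlation * (intuitive_guess - average_gpa)
print(f"Corrected prediction: {corrected:.2f}")  # 3.24 – much closer to the average than 3.8
```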
  • WYSIATI – a fundamental fallacy of the human mind – what you see is all there is.
  • Asked to reconstruct our former beliefs, we often bring up our current ones instead and many cannot believe they ever felt differently.
  • Hindsight bias – people cannot reconstruct their past beliefs accurately. This makes it difficult to evaluate a decision properly – in terms of the beliefs that were reasonable when the decision was made.
  • The worse the consequence, the greater the hindsight bias (KR note: reminiscent of parents saying “I told you so” when things go south in a marriage they were arm-twisted into agreeing to).
  • Hindsight and outcome biases foster risk aversion.
  • System 1 makes us see the world as more tidy, simple, predictable and coherent than it really is.
  • Halo effect: Depending on whether the company has been doing well or not recently, the same CEO will be called “flexible, methodical and decisive” or “confused, rigid and authoritarian”.
  • Therefore, you can’t do pattern mining to identify what works for successful companies. Examples of this kind of failure are “Built to Last” and “In Search of Excellence” (companies/theses mentioned in both books melted down in a short period after each book was published).
  • For some of our most important beliefs, we have no evidence at all, except that people we love and trust hold these beliefs!
  • Amazing experiment the author does with 25 top financial advisors – he gets a spreadsheet that ranks them (with data on how much return each generated), on an annual basis, for 8 consecutive years. He studied correlations between year 1 and 2, year 1 and 3… and so on until year 7 and 8 (28 correlation coefficients, one for each pair of years). The average of the 28 correlation coefficients was 0.01!! (A toy reconstruction is sketched below.)
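A toy reconstruction of that calculation (with made-up returns, since the real spreadsheet isn’t available): if year-to-year results are driven by luck, the 28 pairwise correlations should hover near zero.

```python
import random
from itertools import combinations
from statistics import correlation  # Python 3.10+

# 25 advisors, 8 years of simulated annual returns (pure luck by construction).
random.seed(0)
returns = [[random.gauss(0.07, 0.15) for _ in range(8)] for _ in range(25)]

def year(y):
    """Returns of all 25 advisors in year y."""
    return [returns[a][y] for a in range(25)]

# 8 years -> C(8, 2) = 28 pairs of years, one correlation coefficient per pair.
coeffs = [correlation(year(y1), year(y2)) for y1, y2 in combinations(range(8), 2)]
print(f"{len(coeffs)} pairwise correlations, average = {sum(coeffs) / len(coeffs):.3f}")
```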
  • People with the most knowledge are poorer at forecasting than people with some knowledge of the field/domain/situation. With knowledge, the person develops an enhanced illusion of skill and becomes unrealistically overconfident.
  • Simplicity is often way better than complexity. Paul Meehl’s “little book” – simple, statistical rules are superior to intuitive ‘clinical’ judgements. Book by Gary Klein – “Sources of Power” – Analyzes how experienced professionals develop intuitive skills. Malcolm Gladwell’s book “Blink” is along the same lines.
  • Emotional learning is similar to Pavlov’s experiment – the dog had “learned hopes”. Learned fears are even more easily acquired (KR note: perhaps more applicable to women, being the more emotionally susceptible gender).
  • Lesson learnt: Train yourself hard, in a regular environment with a good feedback loop so that the “expert intuition” that you are developing, actually holds true.
  • When you see data that seems to define a BASE RATE, ACKNOWLEDGE IT, LET IT CHANGE YOU.
  • Planning fallacy – overestimate benefits, underestimate costs.
  • How to overcome the planning fallacy – Develop an “outside view” by involving a ton of “reference class data” about similar projects.
  • Sunk-cost fallacy – You didn’t have a reasonable baseline prediction when you started out, and when you get the baseline you ignore it because it’s too late in the game and you’re already invested.
  • More optimistic people are the ones who become inventors, politicians, military leaders etc. They take more risks than they realize.
  • Entrepreneurs are inherently more optimistic. The chance that a small business in the USA survives for more than 5 years is 35%. 60% of new restaurants are out of business after 3 years… yet people still open new ones and are optimistic about them.
  • Study shows CFOs are grossly over-confident about their ability to forecast the market.
  • Pre-mortem: Just before an important decision is finalized but not yet committed to – imagine you are in the future, the decision has failed, and look at what could have gone wrong. The main virtue of the premortem is that it legitimizes doubts.
  • “What rules govern people’s choices between simple gambles and between gambles and sure things?” Amos and Kahneman set out to understand how humans make choices, without assuming anything about their rationality.
    Econs (perfectly rational, selfish, stable tastes) & Humans (WYSIATI, tastes change, not fully rational or fully selfish).
    Bernoulli’s experiments/conclusions: “A decision-maker with diminishing marginal utility for wealth (which is most of us) will be risk-averse.” (A worked sketch follows this bullet.)
    – Flaw in Bernoulli’s theory: You need to know the reference point before you can predict the utility of a given amount of wealth.
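A tiny worked example of Bernoulli’s conclusion, using log utility as a stand-in for diminishing marginal utility (the figures are mine, not the book’s): a sure gain is preferred to a gamble with the same expected value.

```python
import math

# Log utility: each extra dollar adds less utility the wealthier you already are.
wealth = 1_000_000
gain = 100_000

# Option A: a sure gain of $50,000. Option B: a 50/50 gamble on $0 or $100,000 (same expected value).
u_sure = math.log(wealth + gain / 2)
u_gamble = 0.5 * math.log(wealth) + 0.5 * math.log(wealth + gain)

print(f"Utility of the sure thing:  {u_sure:.6f}")
print(f"Expected utility of gamble: {u_gamble:.6f}")  # lower, so the log-utility agent is risk-averse
```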
  • “People become risk-seeking when all their options are bad.” 🙂 This is insightful. Simple example – which would you pick: lose $100 for sure, or lose $200 with 50% probability?
  • Loss aversion ratio for most people is in the range of 1.5–2.5 (potential gains have to be higher than potential losses by that factor before a 50/50 gamble becomes acceptable).
  • Brains of humans and other animals contain a mechanism that is designed to give priority to bad news.
  • Goals are reference points. Avoiding the failure to meet a goal is a stronger motivator than the desire to exceed it.
  • Perceptions of fairness are based on our reference points.
  • “Altruistic punishment” (punishing a stranger for behaving unfairly towards another stranger)
  • Smart negotiation tactic: falsely hold on to something as if it were very precious or important to you, thereby signaling that giving it away will pain you a lot, when in reality you were prepared to give it away all along.
  • Consistent overweighting of improbable outcomes – a feature of intuitive thinking, leads to inferior outcomes.
    – Opportunities to frame a fact differently, such that one way of framing evokes a different mental/emotional response (e.g. the probability of a DNA-testing error – the defendant will say “1 in 1,000”, the accuser will say “0.1%” – because the accuser wants to show DNA testing works for certain, while the defendant wants to create doubt in the jury’s head).
  • Human nature tends to be risk-averse for gains and risk-seeking for losses – and it is COSTLY to be so! You should favor taking risks in gain scenarios and curb risk-seeking when it comes to losses.
  • Countering the “loss aversion” mentality that you are wired with – take a gamble with a 50% chance of a $100 loss and a 50% chance of a $200 gain. Offered the gamble once, you will probably pass on it (a $100 loss feels more painful than a $200 gain). However, if offered 100 of these, no fool should reject the bundle – compute the expected value: the chance of losing money overall is a tiny fraction of a percent (the sketch below works it out).
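A quick sketch of the arithmetic behind that claim (my own computation, not a figure from the book): the expected value of the 100-gamble bundle, and the exact probability that it ends in a net loss.

```python
from math import comb

# Each gamble: 50% chance to lose $100, 50% chance to win $200; play it 100 times independently.
n = 100
ev_single = 0.5 * 200 + 0.5 * (-100)   # expected value of one gamble: +$50
ev_bundle = n * ev_single              # expected value of the bundle: +$5,000

# Net result with k wins out of 100 is 200*k - 100*(100 - k); it is negative only when k <= 33.
p_net_loss = sum(comb(n, k) for k in range(0, 34)) / 2 ** n

print(f"EV of one gamble: ${ev_single:.0f}, EV of the bundle: ${ev_bundle:.0f}")
print(f"Probability the bundle ends in a net loss: {p_net_loss:.5f}")
```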
  • So, the next time you think of loss aversion, think of life as a bunch of these small gambles – you win a few, you lose a few, but the chances of you losing overall in the long run are slim.
  • Be rational enough to avoid your loss aversion.
  • “Combination of loss aversion and narrow framing is a costly curse.”
  • Mitigating loss aversion: evaluate your portfolio only once a quarter, else loss aversion will make you overly sensitive to minor fluctuations and make you react to daily lows.
  • Investors sell more losing stocks in December, when taxes are on their mind. The tax-loss harvesting advantage is available all year, but mental accounting prevails during the other 11 months (the “disposition effect” – you feel you did well by selling a gainer instead of a loser).
  • The “sunk cost” fallacy is both identified and taught as a mistake in business and economics courses, and there is evidence that graduate students in these fields are more willing than others to walk away from a failing project.
  • “Losses evoke stronger negative feelings than costs.” e.g. “Would you accept a gamble that offers a 10% chance to win $95 and a 90% chance to lose $5?” vs. “Would you pay $5 for a lottery ticket that offers a 10% chance to win $100 and a 90% chance to win nothing?” The two problems are identical, but people like the second one way more.
  • Interesting example: the credit card industry lobbied hard to have the price difference framed as a “cash discount”, not a “credit surcharge” (when you pay different amounts for cash vs. card). People will much more readily forego a discount than accept a surcharge.