


 Thinking, Fast and Slow



Daniel Kahneman




Essentials


Publication year: 2011


Publisher: Allen Lane (paperback)


Pages: 481


Who should read?


Everyone. It's a monumental book. Not to be missed.


Book Review: Thinking, Fast and Slow

Unputdownable


Wednesday, 29 July 2015 2:17:07 PM

Archived under Behavioral Economics


A tour de force!

 

Daniel Kahneman's 'Thinking, Fast and Slow' is a rarefied gem, an exceptional book distinguished by its clarity of thought and attention to detail. It's a quintessential compendium that spans its author's entire glorious career.


Kahneman's daunting reputation precedes him by several thousand miles. A Nobel laureate in economics (wait, isn't he a cognitive psychologist?), Kahneman was born in Tel Aviv and did most of his early research at the Hebrew University of Jerusalem. Together with his long-standing intellectual partner Amos Tversky, who passed away in 1996, Kahneman became an inspiring beacon for the dissident voices arising from the world of traditional economics.


Kahneman spent a good many years of his research building evidence to challenge rational-agent theory. The theory assumes that all humans are rational, self-interested agents who unfailingly gravitate towards decisions that maximize their utility. Think of Mr Spock from Star Trek. 'Thinking, Fast and Slow' reinforces the view that humans bear little resemblance to this rational model and, more often than not, make irrational choices under the influence of subconscious biases.


At the center of the book is the recurring theme that we know far less about ourselves than we think we do. Our subconscious biases often catch us unawares and veer us off the path of reason into fallacy. One such cognitive quirk is the Sunk-Cost Fallacy: our propensity to stay invested in losing propositions. Kahneman explains that the prospect of suffering a loss on an investment can be traumatic. Let's say your spouse books two tickets for the late evening show of the latest blockbuster. Unluckily, torrential rain and thunderstorms lash your neighborhood that same evening, and the local Met department issues a warning that venturing out could be risky. How would you react? Many people, having already paid for the tickets, would still prefer to go out and watch the movie. Ideally, you should choose the best (and safest) option available - staying indoors in this case - especially since the ticket money has left your bank account and is not coming back. All this is easier said than done, as the regret of a potential loss and the annoyance of a ruined evening are bound to interfere with your rationality. Such biases often fly under our conscious radar. Unless we are consciously alert to them - something Kahneman endeavors to instill in the reader - we run the risk of falling victim to these fallacies.


To put things into perspective, Kahneman deploys two fictitious characters in the book - System 1 and System 2 - as metaphors for the fast and slow thinking of our brains. System 1 is the quick, intuitive brain that jumps to the sum of 9 + 7, and System 2 is the effortful brain that kicks in when you are asked to work out the average of 15 and 19 - a relatively demanding cognitive operation. Both systems are active whenever we are awake; it's just that System 2 is more often asleep at the wheel, whereas System 1 is always on autopilot.


Fast-thinking System 1 exhibits two fundamental characteristics. First, it focuses on existing evidence and ignores absent evidence (WYSIATI - What You See Is All There Is). Let's say you are an active mutual fund investor. A high-decibel ad blitz on TV highlighting a fund's recent stupendous performance could tempt you to invest in it. Chances are that your skeptical self (System 2) may not even switch on to investigate the fund house's poor record from the years before. This is because System 1 is biased to believe, while self-criticism and doubt are the domain of System 2 - and, unless hard-pressed, System 2 tends to remain dormant. Second, System 1 is a master at creating patterns and inventing causes, seeing causal connections where none exist. Again, lazy System 2 finds itself in full acquiescence to System 1!


Another interesting feature of System 1 is its tendency to overweight low-probability events. We end up overestimating the risks associated with rare but striking events such as air crashes, terror attacks and natural disasters. Watching TV scenes of a hurricane blowing away people's houses could prompt you to check your house insurance. There is a basic limitation in the ability of our minds to deal with small risks, says Kahneman: either we shrug them off altogether or we give them far too much weight.


Kahneman shines a spotlight on our distinctive approach to the possibilities of gains and losses. Imagine that you can choose between two options: Option A - a guaranteed sum of $50, or Option B - a gamble with an 80% chance of winning $70 and a 20% chance of winning nothing. Most people would choose Option A - the sure thing. Now imagine that you can choose between Option A - losing $50 for sure - or Option B - a gamble with an 80% chance of losing $70 and a 20% chance of losing nothing. Most people in this case would go for Option B. Rationally, you would be better off taking the gamble in the first case (its expected value is $56, against a sure $50) and the sure loss in the second (a certain $50 loss beats an expected loss of $56), but it's hard to be logical in such cases. Kahneman concludes that "losses loom larger than gains" in our minds (Loss Aversion). This leads us to become risk-averse when outcomes involve gains and risk-seeking when outcomes involve losses.
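
To make the arithmetic concrete, here is a minimal Python sketch of the expected values behind those two choices (the payoffs and probabilities are the ones above; the expected_value helper is purely illustrative):

```python
# Expected-value check of the gain-frame and loss-frame choices.

def expected_value(outcomes):
    """outcomes: iterable of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

# Gain frame: a sure $50 vs an 80% chance of +$70 (else nothing)
sure_gain   = expected_value([(1.0, 50)])
gamble_gain = expected_value([(0.8, 70), (0.2, 0)])

# Loss frame: a sure -$50 vs an 80% chance of -$70 (else nothing)
sure_loss   = expected_value([(1.0, -50)])
gamble_loss = expected_value([(0.8, -70), (0.2, 0)])

print(f"Gain frame: sure {sure_gain:+.0f} vs gamble EV {gamble_gain:+.0f}")  # +50 vs +56
print(f"Loss frame: sure {sure_loss:+.0f} vs gamble EV {gamble_loss:+.0f}")  # -50 vs -56

# The gamble is the rational pick over gains, the sure thing over losses -
# yet, as Kahneman shows, most people do exactly the opposite.
```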


The precarious point is that the expanse of human ignorance is not limited to the details of the dual-system metaphor, but extends well beyond it. In an alarming example, Kahneman reveals how the demanding routine of Israeli judges affected prisoners' chances of parole. The likelihood of a prisoner being granted parole hovered around 65 per cent right after the judges had eaten a meal, but dwindled towards zero by the time the next meal was due. The explanation was not that hunger makes judges vindictive, but that reasoning demands energy.


Kahneman reveals that frequent repetition of a half-truth or a plain lie can induce such familiarity in listeners that they take it for truth. I am sure marketers and dictators have long known this. Throughout the book, Kahneman catalogues numerous such behavioral fallacies that lay waste to the idea of rationality being at the center of our decision-making. A few more notable biases that hinder our judgment include the Anchoring Effect - which occurs when people consider a particular value for an unknown quantity before estimating that quantity - and the Availability Effect - whereby recent events and the current context carry the most weight in determining an interpretation. For example, the harrowing visuals of an air crash on the morning news could give you the jitters if you have a flight to catch soon. Even the effect of 'facial competence' on voting is about three times larger for information-poor and TV-prone voters than for those who are better informed and watch less TV - a simple judgment heuristic, as per Daniel Kahneman.


While the book brims with myriad insights, the ones that will definitely stay with me are the importance of using large samples in any research and the importance of 'base rates' when existing evidence is weak. I was quite cognizant of the fact that small samples often lead researchers astray; however, I thought they still showed a slight tendency towards accurate results. I was wrong about the last part! Kahneman reports that researchers who pick too small a sample leave themselves at the mercy of sampling luck: small samples are more likely to yield extreme outcomes (both high and low) than large ones. I cite Kahneman's example from the book to drive the point home: From the same urn, two very patient marble counters take turns. Jack draws 4 marbles on each trial, Jill draws 7. They both record each time they observe a homogeneous sample - all white or all red. If they go on long enough, Jack will observe such extreme outcomes more often than Jill - by a factor of 8 (the expected percentages are 12.5% and 1.56%). Again, no hammer, no causation, but a mathematical fact: samples of 4 marbles yield extreme results more often than samples of 7 marbles do.
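
The factor of 8 is easy to verify. A short Python sketch, assuming the urn is large and evenly mixed so each draw is effectively an independent 50/50 event (the p_homogeneous name is my own):

```python
# Kahneman's marble example: chance of drawing an all-white or all-red
# sample from a 50/50 urn, for Jack's samples of 4 and Jill's of 7.

def p_homogeneous(n):
    """Probability that n independent 50/50 draws are all the same colour."""
    return 2 * 0.5 ** n   # all white + all red

jack = p_homogeneous(4)   # Jack draws 4 marbles per trial
jill = p_homogeneous(7)   # Jill draws 7 marbles per trial

print(f"Jack (n=4): {jack:.2%}")       # 12.50%
print(f"Jill (n=7): {jill:.4%}")       # 1.5625%
print(f"Factor: {jack / jill:.0f}x")   # 8x, exactly as Kahneman states
```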

Kahneman's stated intention is to make discussions about our behavioral quirks mainstream, and it shows in the simple, ready-to-use vocabulary offered at the end of each chapter. This raises the question: will those who read this book become consciously alert to their biases and consequently become better decision-makers, too? We can only hope, since, despite all the advice, Kahneman remains skeptical about our ability to change - and about his own improvement, too. He quips, "I have made much more progress in recognizing the errors of others than my own." In the end, how much you learn and improve is entirely up to you. Gaining cognizance of your biases is only the first step, and this book does a solid job of that. Reading 'Thinking, Fast and Slow' takes you on an engrossing journey of unending cognitive revelations. It is an intellectually satisfying and mentally stimulating read; a monumental achievement in the history of thought.