I consider myself more rational than average. I’m an even-tempered CS major, and a decade working in product and growth trained me to scrutinize intuition with objective data. When my wife and I argue, I try so hard to be rational that she pushes me to think less and emote more. Before reading Thinking, Fast and Slow by Daniel Kahneman, I knew that I still make irrational judgments, but I assumed that when making important decisions I could deliberately put myself into a rational state of mind. Reading the book changed my thinking.
The book walks through Kahneman’s research into how humans rely on heuristics to function more efficiently, and how those heuristics often lead to irrational decisions. More importantly, it shows that everyone exhibits these failures of reason regardless of education, expertise, or training. It even gives examples of how Kahneman himself, one of the founders of behavioral economics, consistently falls prey to irrational thinking. The book is full of interesting insights, but three in particular captured my attention.
Loss Aversion
Loss aversion is a well-known concept that I was already familiar with — you feel more dissatisfaction from losing something (e.g., money) than satisfaction from gaining something of equal magnitude[1]. However, I never considered the powerful second-order effects of loss aversion that Kahneman explores.
Loss aversion creates a tendency to maintain the status quo. For someone to choose to break away from the norm, the expected value of change must be disproportionately positive. Similarly, it leads to the “endowment effect,” where we place excessively high value on an item simply because we possess it. Suddenly a whole swath of common behaviors starts to make sense: hoarding, remaining in bad jobs and relationships, the curiously strong pain of unmet expectations.
For me, the most surprising consequence of loss aversion is that when presented with a set of bad choices, people become more risk-seeking. That has disturbing implications from a societal perspective. Consider a person who is struggling financially and faced with an array of undesirable options (filing for bankruptcy, taking an ultra-high-interest loan, etc.). They are likely to choose the option with the lowest expected value if it has a minute chance of yielding the best outcome. In other words, human nature may cause people in bad situations to end up in worse situations more often than not. After finishing the book, I constantly find myself considering the role of loss aversion in the world around me — a startup’s sudden pivot when cash is running low, annoyance when a loved one fails to intuit your feelings, and more.
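To make the risk-seeking effect concrete, here’s a minimal Python sketch of the value function from Kahneman and Tversky’s prospect theory. The parameter estimates (α ≈ 0.88, λ ≈ 2.25) come from Tversky and Kahneman’s 1992 paper rather than the book itself, and the sketch ignores probability weighting entirely, but it shows why a gamble can feel less painful than a sure loss with the same expected value.

```python
# Prospect theory value function: concave for gains, convex for
# losses, and steeper for losses than for gains (loss aversion).
ALPHA = 0.88   # diminishing sensitivity (Tversky & Kahneman 1992 estimate)
LAMBDA = 2.25  # loss-aversion coefficient (same source)

def value(x: float) -> float:
    """Subjective value of gaining (or losing) x dollars."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

# Two options with identical expected value (-$900):
sure_loss = value(-900)                       # lose $900 for certain
gamble = 0.9 * value(-1000) + 0.1 * value(0)  # 90% chance to lose $1000

print(f"sure loss: {sure_loss:.1f}")  # ≈ -895.2
print(f"gamble:    {gamble:.1f}")     # ≈ -884.0 — feels less bad, so we gamble
```

The convexity in the loss domain is what drives the preference: once you’re already deep in losses, an extra dollar lost hurts less than the last one, so long shots start to look attractive.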
What You See Is All There Is
We make decisions based on the information immediately available to us, and as far as our subconscious decision-making is concerned, all other data is outside the scope of reality. Again, confirmation bias, recency bias, etc. are all familiar topics, but reading about how central WYSIATI is to our thinking made me consider the ramifications more deeply.
If WYSIATI is true, rational dialogue might always be doomed to fail. Even people who sincerely agree to have a logical discussion will be subject to a stream of unconscious judgments driven by whatever information is most easily accessible. In this scenario, facts become irrelevant, and maximizing the mental accessibility of your desired message (e.g., by continuously repeating simple, memorable phrases) is the optimal strategy for winning followers. The fact that I’m not immune to these strategies makes me extremely uncomfortable, and you can extrapolate how they may have played a role in recent US politics.
Regression to the Mean
I like this theory because even though I understand it, I feel my mind rejecting it in practice. According to Kahneman, humans are particularly terrible at recognizing randomness and regression to the mean. As a result, we make up completely false but highly convincing stories to explain data that is better explained by random events. Consider the following statement from the book:
Highly intelligent women tend to marry men who are less intelligent than they are.
When I read it, I immediately started coming up with explanations for why it’s true, but I did not consider why it’s trivially obvious mathematically. If the distribution of intelligence for men and women is roughly equivalent, and the correlation of intelligence between spouses is imperfect (i.e., your spouse’s intelligence cannot be perfectly predicted from your own), then of course a woman who skews to the high end of the curve is more likely to marry a man who is lower on the curve.
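If the statistics still feel slippery, a quick simulation makes the point hard to deny. The numbers below (a spousal IQ correlation of 0.5, a cutoff of 130) are my own illustrative assumptions, not figures from the book; any correlation below 1 produces the same qualitative result.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: spouses' IQs are jointly normal (mean 100, sd 15) with an
# imperfect correlation. rho = 0.5 is an illustrative assumption; the
# effect shows up for any rho < 1.
rho, n = 0.5, 1_000_000
cov = (15 ** 2) * np.array([[1.0, rho], [rho, 1.0]])
wives, husbands = rng.multivariate_normal([100.0, 100.0], cov, size=n).T

# Condition on highly intelligent women (IQ >= 130, roughly the top 2%).
smart = wives >= 130
frac = (husbands[smart] < wives[smart]).mean()
print(f"{frac:.0%} of their husbands score lower")  # roughly 90% with rho = 0.5
```

No causal story about marriage is required; selecting extreme values on one variable guarantees regression toward the mean on any imperfectly correlated variable.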
This concept sticks with me because even after accepting the statistical explanation, I still feel compelled to attribute it to other, much less likely reasons. Anyone who works in product or growth should keep this in mind when trying to interpret stats. It’s far too easy to get carried away with your own story when you’re really just looking at random data.
Mitigating our irrational selves
I’ve become more concerned about living in a world where we all constantly make irrational decisions. It helps to remind myself that relying on heuristics isn’t all bad. Most of the time they help us efficiently make rational (or sufficiently rational) choices. If I had to stop and consider every decision I make, I’d never accomplish anything.
I’m also starting to craft environments and principles that will help me guard against my blind spots. I know I will demonstrate unconscious bias when making decisions. However, by establishing a rational decision-making algorithm beforehand and adhering to it even when it deviates from my in-the-moment intuition, I can reduce irrational errors. I found Principles by Ray Dalio useful for thinking through my personal set of principles. Unfortunately, just outlining principles isn’t enough. The next step is to have some set of objective third parties (friends? coworkers? AI assistants like Siri and Alexa?) actively holding me accountable. Figuring that out is still very much a work in progress.
A new favorite
I let Thinking, Fast and Slow linger on my to-read list for far too long. I assumed I knew the content based on all the books/posts/podcasts that reference it, and I had recently refreshed my understanding of cognitive biases by reading The Art of Thinking Clearly by Rolf Dobelli. I was finally motivated to pick it up after reading The Undoing Project by Michael Lewis, which details the friendship between Daniel Kahneman and Amos Tversky that generated the research behind Thinking, Fast and Slow. As it turns out, you really need to read the book. Pieces that reference it only give a small peek into the robust mental model that Kahneman presents. In an ideal world, I would suggest first reading The Undoing Project, then reading Thinking, Fast and Slow, then skimming The Art of Thinking Clearly and keeping it handy for quick reference. Whether or not you read the others, don’t neglect Thinking, Fast and Slow. It’s a fundamental book that everyone should read. It changed how I view myself and the world, it has practical applications, and it raises tricky moral questions (is it wrong to leverage these theories to manipulate others?). It took me a while to get to it, but it ended up being the best book I read in 2017[2].
Notes
[1] This is an oversimplification that doesn’t account for factors like framing and reference points.
[2] Sapiens by Yuval Noah Harari comes in a very close second.