
Cognitive biases and heuristics

Kahneman and Tversky - Pt.2

No. 39 — read time 10 minutes

Welcome to The Soloist, a weekly newsletter where I share timeless ideas and insights about life, business, and creativity.

Today at a glance

  • How our minds play tricks on us

I started writing about Daniel Kahneman and Amos Tversky a few weeks back, before the brutal attack by Hamas on Israel on October 7th and the subsequent Israel-Gaza War. At the time, I was fascinated by the story of these two very different people who became inseparable during their professional lives and together changed the course of behavioral economics and cognitive psychology.

As a dual citizen born in Israel, with family still living there, the recent turmoil has shaken my world and caused me to pause my writing temporarily. Now, as I get back to the thing I love most, telling stories and drawing lessons from history’s greatest minds, I return to the story of Amos and Danny.

I hope you enjoy this week’s newsletter and look out for more in the coming weeks and months. Appreciate every one of you!

The Rules of Regret

In the 1600s, French noblemen sought the counsel of mathematicians to help them make better decisions at the gambling table.

Thus the idea of expected value was born. If someone offers you a coin flip where you gain $100 if it lands on heads and lose $50 if it lands on tails, the expected value is $25 (i.e. 0.5*$100 + 0.5*(-$50)). The idea is that if someone offers you a bet with a positive expected value, you should always take it.
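Just to make that arithmetic concrete, here's a minimal sketch in Python using the same payoffs as the example above:

```python
# A minimal sketch: expected value of the coin-flip bet described above.
# Each outcome is a (probability, payoff) pair.
coin_flip = [(0.5, 100), (0.5, -50)]

expected_value = sum(p * payoff for p, payoff in coin_flip)
print(expected_value)  # 0.5*100 + 0.5*(-50) = 25.0
```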

But we know that’s a huge oversimplification of how humans behave. We take negative expected value bets all the time. If we didn’t, Las Vegas and the insurance industry wouldn’t exist. We are constantly failing to maximize expected value.

The reason why is what intrigued Danny and Amos.

But before we get to that, let’s talk about someone who modified the expected value theory to add a bit more nuance.

About a century after those French noblemen sought the advice of mathematicians, a Swiss mathematician named Daniel Bernoulli refined the idea with his work on expected utility, which described how humans actually behave far better than simple expected value calculations did.

Let us suppose a pauper happens to acquire a lottery ticket by which he may, with equal probability, win either nothing or 20,000 ducats. Will he have to evaluate the worth of the ticket as 10,000 ducats, or would he be acting foolishly if he sold it for 9,000 ducats?

Daniel Bernoulli

Bernoulli’s groundbreaking idea, which became the mainstay of decision analysis for a long time, was that people did not seek to maximize expected value but rather expected utility.

In simpler terms, utility is the value someone ascribes to money.

You and I might have very different ideas about what one incremental dollar means to us.

According to Bernoulli, people were “risk averse”.

And when Amos wrote his Foundations of Measurement book, he described risk aversion as “the more money one has, the less he values each additional increment, or equivalently, that the utility of any additional dollar diminishes with an increase in capital”.

The point wasn’t that people behaved this way because of Bernoulli’s principle, only that it seemed to describe what some people did in the real world.
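To see why the pauper might rationally sell, here's a minimal sketch using Bernoulli's own suggestion of logarithmic utility. The pauper's starting wealth of 100 ducats is an assumption added purely for illustration; Bernoulli doesn't fix a number.

```python
import math

# Bernoulli's pauper, sketched with logarithmic utility.
# NOTE: the starting wealth of 100 ducats is an illustrative assumption.
wealth = 100

def log_utility(ducats):
    return math.log(ducats)

# Option 1: sell the ticket for a sure 9,000 ducats.
sell = log_utility(wealth + 9_000)

# Option 2: keep the ticket: 50% chance of 20,000 ducats, 50% chance of nothing.
keep = 0.5 * log_utility(wealth + 20_000) + 0.5 * log_utility(wealth)

print(sell > keep)  # True: for a pauper, the sure 9,000 beats the gamble

# Certainty equivalent: the sure amount that gives the same utility as the gamble.
certainty_equivalent = math.exp(keep) - wealth
print(round(certainty_equivalent))  # ~1,300 ducats, far below the 10,000 expected value
```

For someone with very little wealth, the first few thousand ducats carry enormous utility, which is exactly the diminishing-returns curve Amos described.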

But Amos, ever looking to pick apart arguments, was dissatisfied with expected utility theory as an explanation of how humans actually behaved.

One person who shared that dissatisfaction was the French economist Maurice Allais. In 1953, Allais asked an audience of economists to imagine two scenarios:

Scenario 1:

You must choose between having:

  1. $5 million for sure

  2. A gamble with an 89% chance of winning $5 million, a 10% chance of winning $25 million, and a 1% chance of winning nothing.

Most of the audience chose the guaranteed $5 million. No surprise there.

Then he moved on to the second scenario.

Scenario 2:

You must choose between having:

  1. An 11% chance of winning $5 million and an 89% chance of winning nothing

  2. A 10% chance of winning $25 million and a 90% chance of winning nothing

Most of the audience chose the second option. In other words, they chose slightly lower odds of winning a lot more money.

What’s interesting about this problem, which became known as The Allais Paradox, is that not only do the prevailing preferences of the participants contradict each other, the whole exercise undermines expected utility theory. (TK link to Allais problem showing math).
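Here's a sketch of that math. The trick is that the two scenarios differ only by a common 89% slice ($5 million in Scenario 1, nothing in Scenario 2), so for any utility function the gap between the two options is identical in both scenarios. Preferring the sure thing in Scenario 1 while preferring the long shot in Scenario 2 therefore can't both be utility-maximizing. The snippet below just checks that identity for a few example utility functions.

```python
import math

# The Allais choices, with amounts in millions of dollars.
# For ANY utility function u, the expected-utility gap between the two options
# is the same in both scenarios, because the 89% "common consequence" cancels.

def gap_scenario_1(u):
    sure_thing = u(5)                                   # $5M for sure
    gamble = 0.89 * u(5) + 0.10 * u(25) + 0.01 * u(0)   # the mixed gamble
    return sure_thing - gamble

def gap_scenario_2(u):
    option_1 = 0.11 * u(5) + 0.89 * u(0)                # 11% at $5M
    option_2 = 0.10 * u(25) + 0.90 * u(0)               # 10% at $25M
    return option_1 - option_2

# Try a few illustrative utility functions.
for u in (lambda x: x, lambda x: math.sqrt(x), lambda x: math.log(x + 1)):
    print(round(gap_scenario_1(u), 6), round(gap_scenario_2(u), 6))
# The two gaps always match, so the popular pair of choices can't be squared
# with any single utility function.
```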

While Allais, Amos, and other economists tried to puzzle through this seeming contradiction as a logic problem, Danny saw something else.

He sensed this was less a logic problem and more a quirk of human behavior: regret. And he didn’t see any contradiction.

In Scenario 1, anyone who turns down a guaranteed $5 million and walks away with nothing would experience far more regret than someone who, in Scenario 2, turns down a mere chance at $5 million and wins nothing.

Avoiding that pain became a line item on the inner calculation of their expected utility theory.

Michael Lewis, The Undoing Project

The nuance here is important. It’s not regret itself that drives a decision, but rather the anticipation of regret.

People dwell on “what might have been”, an acute ingredient of misery, though rarely does anyone dwell on “how much worse things could have been” as a part of their happiness.

We simply do not seek to avoid other emotions the way we seek to avoid regret.

That was the discovery Danny made: people “did not seek to maximize utility. They sought to minimize regret”.

And by formulating this theory, the duo set out to create some rules around regret.

The first rule is that regret is associated with feelings of “coming close”. The second rule is that regret is closely linked to feelings of responsibility.

In the Allais Paradox, people anticipated regret not from failing to win but from the decision to walk away from certainty.

While this may sound academic, it had real-world consequences. A “sure thing” is equivalent to the status quo, and humans default to the status quo when they fail to act, in part because inaction shields them from regret. We can’t be certain we’d have been happier had we taken a different job or married a different spouse, and that absence of knowledge protects us from knowing the truth about the quality of our decisions.

And so, by going down this road of dismantling expected utility theory, they uncovered the underlying principle behind “risk aversion”. They defined it as “the fee that people paid, willingly, to avoid a regret: a regret premium”.

Strong Ideas, Loosely Held

If you’ve ever heard the phrase “strong ideas, loosely held”, that is Daniel Kahneman and Amos Tversky personified.

At one point, Danny said to his colleagues “I get a sense of movement and discovery when I find a flaw in my thinking”.

While reviewing their notes from the survey participants towards the end of 1974, they had a thought: “What if we flipped the signs?”

Instead of focusing on gains, the duo would shift their attention to losses.

Scenario: Which of the following do you prefer?

  • Gift A: A lottery ticket that offers a 50 percent chance of losing $1,000

  • Gift B: A certain loss of $500

When the certainty was flipped from a sure gain to a sure loss, people suddenly avoided certainty. Our unhappiness at losing an object is greater than our happiness at receiving the same desirable object. This is the crux of Loss Aversion.
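To make the flip concrete, here's a sketch using the kind of value function Danny and Amos later formalized in prospect theory. The curvature and loss-aversion parameters (0.88 and 2.25) are commonly cited estimates from their later work and are assumptions here, not numbers from this story.

```python
# A prospect-theory-style value function: losses loom larger than gains,
# and sensitivity diminishes as amounts grow.
# ASSUMPTION: alpha=0.88 and lam=2.25 are commonly cited later estimates.

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss relative to the status quo."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gift_a = 0.5 * value(-1_000) + 0.5 * value(0)   # 50% chance of losing $1,000
gift_b = value(-500)                            # certain loss of $500

print(round(gift_a), round(gift_b))  # roughly -491 vs -534
print(gift_a > gift_b)               # True: the gamble stings less than the sure loss
```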

That meant their theory of regret was flawed. It failed to explain why, in scenarios involving potential loss, people flip from risk averse to risk seeking.

The hinge on which this seesaw of emotions rests is what we now call framing: the idea that you can shift a person’s attitude toward risk seeking or risk avoiding simply by how a scenario is presented, either as a potential loss or as a potential gain.

This removal of context in the human brain they dubbed The Isolation Effect.

The Fourth Heuristic

You sleep through your alarm and wake up 15 minutes late. You quickly pack your bags, throw on some clothes, and head for the airport. Your driver takes a wrong turn and gets stuck in a traffic jam. You make it to the airport, hop out of the Uber, and race to the check-in desks. You find out you’ve missed your flight…

Which would be more upsetting: missing the flight by 5 minutes or 30 minutes?

In 1974, under the direction and funding of the United States Department of Defense’s Advanced Research Projects Agency (DARPA), Danny and Amos wrote a groundbreaking paper titled Judgment Under Uncertainty: Heuristics and Biases.

In it, they outlined several heuristics humans rely on to make sense of the world.

In the previous piece on Daniel Kahneman and Amos Tversky, we looked at the Anchoring, Availability, and Representativeness heuristics.

But by the end of the ’70s, when Danny went back over his notes from studying regret, something else became clear to him.

Regret was a counterfactual emotion.

So too were frustration and envy.

The mind begins to imagine different scenarios to “undo” a negative state, and even in this state of imagination the mind follows certain rules around unrealized possibilities.

Danny and Amos committed to studying the power of unrealized possibilities and how they contaminated people’s minds.

They called it “The Simulation Heuristic”.

Some of the hypothetical disaster vignettes Danny proposed:

  • A shopkeeper was robbed at night. He resisted. Was beaten in the head. Was left alone. Eventually died before the robbery was noticed.

  • A head-on collision between two cars, each attempting to overtake under conditions of restricted visibility.

  • A man had a heart attack, tried in vain to reach the phone.

  • Someone is killed by a stray shot in a hunting accident.

His list went on for 8 pages.

In trying to understand how to “undo” these events in the mind, he arrived at several rules.

One rule was that the more items there were to undo in order to create some alternative reality, the less likely the mind was to undo them.

The Undoing Project, Michael Lewis

Another rule, dubbed The Focus Rule by Danny, was that when we think about events happening to others, we keep the situation the same in our minds but move the actor. The exception is when the event happens to us; then we do the inverse.

And the most important rule revolved around the unexpected.

A middle-aged banker takes the same route to work every day. One day he takes a different route and is killed when a drugged-out kid in a pickup truck runs a red light and sideswipes his car. Ask people to undo the tragedy, and their minds drift to the route the banker took that day. If only he had gone the usual way!

But put that same man back on his normal route, and let him be killed by the same drugged out boy in the same truck, running a different stoplight, and no one thought: If only he had taken a different route that day! The distance the mind needed to travel from the usual way of doing things to some less ordinary way of doing things felt further than the trip made from the other direction.

The Undoing Project, Michael Lewis

In the scenario above, rarely does the mind reach for the far more likely alternative, that either the drugged-out boy or the banker was off by a few seconds, thereby avoiding the collision entirely.

The fourth heuristic became part of Danny’s larger body of work, loosely titled The Undoing Project, and marked his official split from Amos. The two grew apart, and Danny, for the first time, ventured out on his own to put forth a theory of how the human mind behaves.

Both The Undoing Project, by Michael Lewis, and Danny’s seminal book Thinking, Fast and Slow are must-reads that help us understand not only how the human brain works, but also the path these two academics took over 40 years to decipher it.

P.S. Whenever you're ready, there are 3 ways I can help you:

  1. If you save a lot of bookmarks on Twitter (like me), try dewey, the easiest way to organize Twitter bookmarks (I'm one of the makers).

  2. If you're looking for 1:1 coaching on audience or business growth, book a slot here.

This week’s newsletter is brought to you by Beehiiv.

Beehiiv is the only email service provider with a built-in referral program. Explode the growth of your newsletter by using the most powerful persuasion tool out there, word of mouth.

Start growing your newsletter faster here.

If you enjoyed today's newsletter, consider sharing it with friends and family. If they don’t hate you, they might thank you.

If this email was forwarded to you, consider subscribing to receive future issues.