Apotheosis episode 45: Avoiding stupid decisions. [Image: hand-drawn three monkeys, one covering its eyes, one its ears, and one its mouth, captioned “see no reason, hear no reason, speak no reason”]

Episode 45: Avoiding stupid decisions

Ryan D Thompson | Decision-making, Perspectives, Psychology, Skills

Key ideas

No doubt this is unwelcome news, but your brain is riddled with biases that can lead to unclear thinking and poor decisions. Intentional, systematic decision-making processes can reduce their influence.

  • We are faced with countless decisions every day. To help us make sense of the world and make rapid decisions, our brains evolved a wide range of biases and mental shortcuts called heuristics.
  • Most of the time, these shortcuts serve us well. Other times, they can lead us to make some really dumb decisions. No one is immune to their effects, however smart or accomplished we might be.
  • Our biases operate at a speed faster than thought. Their stealth can cause us to be tricked in all manner of ways, such as failing to see our mistakes or being subtly influenced to spend more money.
  • While we can’t eliminate these biases – and we probably wouldn’t want to, anyway – we can reduce their potential negative impacts by using a systematic method for slowing down and making informed decisions.


Consider for a moment these everyday situations and the human behaviors that often follow. 

Two planes crash within weeks of each other. News reports question whether air travel is becoming less safe, pointing to flaws that contributed to the crashes. For weeks afterward, the number of people flying decreases by a statistically significant amount.

One of your collaborators on a project makes a mistake. The error seems obvious and, fortunately, is easily fixable. Yet when asked about it, instead of fixing the mistake, they refuse to acknowledge it and double down on the erroneous behavior.

A group of magazine editors for a local publication are producing an edition featuring highlights of their town. They compile photos of local people, showing the “faces of fill-in-the-blank town.” Somehow, no one notices that all of the images are of white people, even though the town is far more diverse.

A group of judges is asked to decide the sentence in a case involving a hypothetical thief. Before sentencing, each judge rolls a pair of dice. Those who roll higher numbers hand down harsher – that is, longer – sentences; those who roll lower numbers hand down shorter ones.

Each of these cases demonstrates a different, well-documented bias or heuristic — a mental shortcut our brains use to make the hundreds of decisions we make every day. More accurately, these cases show how these biases can go wrong, leading us to make bad decisions. Because of their automatic and subconscious nature, they can create major blind spots in our thinking.

You might think you’re not susceptible to these biases and blind spots. If so, I’m genuinely sorry to be the bearer of bad news, but you most certainly are. Your brain uses these mental shortcuts all day, every day, to make decisions.

But don’t worry, it’s not just you. It’s me, too. And basically everyone on earth. Ok, maybe we should worry, because while many of the shortcuts our brains use are harmless, many are not. Multiply the biased, bad decision-making of 8 billion people, and the bad news compounds quickly. Yep, it’s easy to see how many of our social, environmental, economic, and political problems arise.

This episode is part of a series on decision-making. I’ll explore some ideas from psychology on various biases, heuristics, and blind spots that guide our thinking. These biases operate very quickly and below the level of our conscious thought. Ideally, we can reduce some of the power of these biases by increasing our awareness of them. However, as we’ll see, the truth is that it isn’t so easy to counter our biases.

Before we go any further, I’d also like to clarify that biases and heuristics aren’t always harmful. In fact, more often than not, they serve us well, helping us to make decisions quickly enough to respond to threats and opportunities in the world. They have helped to keep us alive for hundreds of thousands of years, at the least.

However, unlike our ancient ancestors, we face bewildering complexity and far more information coming at us than ever in human history. We have to make a lot more decisions than ever before.

At the very least, being aware of the influence of biases on our thinking can help us avoid making really bad decisions.

One can’t talk about biases and heuristics without bringing Daniel Kahneman into the discussion. For over 50 years, Kahneman has studied how these mental shortcuts operate and influence our thinking. With his longtime research partner, Amos Tversky, he coined the terms for many of the biases and heuristics that are now part of the common language of psychology, behavioral economics, and other fields.

Kahneman’s book, Thinking, Fast and Slow, is like the Bible of how biases and heuristics affect our thinking. In it, he covers each of the biases from the scenarios at the top of this episode – and many, many more. There are so many mental shortcuts at work in our brains that it can feel hopeless to think we have any chance of breaking free from their influence.

In fact, Kahneman himself — the world’s leading expert on the topic — has mentioned in interviews that after studying these biases for over 50 years, he still finds himself falling into their traps. They operate so quickly and below conscious awareness that it’s nearly impossible to stop them from doing their thing.

However, that doesn’t mean we shouldn’t at least try to reduce their influence and increase the quality of our decisions.

One critical step for countering the detrimental effects of these biases is to acknowledge that they exist. We’ve all got them, and pretending otherwise invites poor decision-making. It doesn’t matter how smart or well-educated you are; you have the same brain structure and wiring as the rest of us.

As the late Charlie Munger said, “It is remarkable how much long-term advantage people like us have gotten by trying to be consistently not stupid instead of trying to be very intelligent.” Munger spent many years identifying how otherwise smart people make dumb decisions — frequently due to these unconscious mental shortcuts and blind spots.

Trying to avoid stupid decisions is an excellent motivation to understand and mitigate our biases.

Next, it helps to familiarize ourselves with the array of biases we’re up against. At the top of the episode, I presented four brief examples. The reaction to the plane crashes depicts the availability heuristic, where information that is vivid, recent, or easily recalled plays an outsized role in how we assess a situation. In this case, sensational media coverage following the plane crashes leads many people to question the safety of air travel.

The example about the colleague digging in their heels after making a mistake demonstrates the commitment bias. Often, when we have committed to a course of action, especially when there is some public form of commitment, we will stick to our thinking even in the face of strong contrary evidence. We refuse to change our stance, growing even more resolute when people question or criticize us.

The example of the magazine editors failing to notice the lack of diversity in their photos shows the in-group bias, a contributor to the ongoing issues of racial injustice here in America and no doubt elsewhere in the world. This particular example comes from my own hometown. The in-group bias causes us to see people within our group — those from the same race, gender, belief system, etc. — as better than those of other groups. It leads to preferential treatment for our in-group and often poor treatment of other groups.

The fourth example, with the judges, shows the anchoring effect. This one is strange and hard to believe, but it is well documented. When making a decision involving numbers – such as prison sentencing or product pricing – the introduction of a baseline number, or reference point, can strongly influence the outcome. This is one reason big sales can be so effective. When we see a discount, a markdown from a previous, higher price, we think we’re getting a phenomenal deal. Sometimes we really are, but the same tactic can trick us into buying something we never planned to buy. “Hey, it’s a steal!”

These biases are merely a tiny sample of the many ways our brains can deceive us – thus producing low-quality decisions. Each bias has its own baggage, so to speak, and distinct way of working. An organization called The Decision Lab also offers a wealth of resources for understanding these biases and strategies for mitigating their adverse effects on decision-making.

As a caveat, there are a LOT of these biases, and as Kahneman describes, simply knowing about them won’t necessarily remove their influence.

Recognizing that these biases are slippery and stealthy, we need intentional, systematic practices to guide our decision-making. At least as far back as Aristotle, great thinkers have employed various tactics to improve their decisions. On that note, I will cover Aristotle’s art of deliberation in the next episode. Following a systematic method like this slows the process down, leaving more room to assess the quality of the inputs to our decisions and to uncover potential biases and blind spots.

Perhaps most importantly in this process, we must learn to question everything, even our own knowledge and understanding. We must seek to uncover the hidden motives and biased views influencing our thinking. Uncomfortable, yes – but vital.

That’s all for this episode. Join me again next time for another episode on decision-making, as I dig into Aristotle’s ancient but still practical method for deliberating on difficult decisions. Until the next time, be well!

Recommended reading

Podcast soundtrack credit:

Our Story Begins Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License