Episode 14: A machine for jumping to conclusions

Ryan D. Thompson · Complexity, Modern, Psychology, Skills

Key ideas

“The world makes much less sense than you think. The coherence comes mostly from the way your mind works.”
~ Daniel Kahneman

  • Humans pride themselves on their ability to reason; it’s what sets us apart from the rest of the animal kingdom. But despite our great brainpower, we can also make some foolish mistakes. We easily fall prey to cognitive biases and make faulty decisions.
  • Nobel Prize-winning psychologist Daniel Kahneman calls this the “illusion of understanding.” As he describes, “The world makes much less sense than you think. The coherence comes mostly from the way your mind works.”
  • We have two modes of thinking, which Kahneman describes as Systems 1 and 2. System 1 is “fast” thinking, related to intuition, and happens automatically – in fact, we can’t shut it off. And then System 2 is “slow” thinking, the source of reason and rationality; it takes effort to engage and gets tired quickly. 
  • Kahneman describes System 1 as “a machine for jumping to conclusions.” When lacking complete information (which is arguably most of the time), System 1 quickly fills in the blanks with the limited information it does have. Which can get us into trouble.
  • Our minds fall prey to a host of biases, including the narrative fallacy, “what you see is all there is,” and many others. Even Kahneman himself says he still gets tricked by his biases after 50 years of studying them. One key takeaway: Don’t believe everything you think.


To make plans for the future, we often study the past. When designing policies or programs related to poverty, nutrition, deforestation, or whatever it might be, we look at what worked, what didn’t, and why. We seek to understand the context and conditions that support or hinder success. And for good reason: we want to invest our time and resources into programs that are likely to work. What better way to do this than to study the past? We analyze past performance to predict the potential success of similar programs in the future. 

But what if our understanding of the past is flawed or missing crucial details? 

Imagine that you’re researching past programs for improving access to clean water for rural communities. You find an extremely successful program involving a cheap and highly portable new water filtration technology. The team that produced the technology was brilliant and talented. They engaged in a collaborative process with the local communities. They made a series of decisions that paid off well — and best of all, they carefully documented their approach to support scaling and replicating their success. Outside evaluators assessed the program and found their methods sound and the outcomes to be verifiably beneficial to the community. 

Will their methods yield success again? Let’s say you could even bring in the same team and follow their process to a tee. Based on what we know so far, it sounds like a safe bet that this technology, this process, and the team behind it would be successful again. So should we give it a shot?

Suppose we were to have an opportunity to ask Nobel Prize-winning psychologist Daniel Kahneman for advice. I’d wager he’d tell us “not so fast.” This is the last episode in a series looking at complexity. This week, I’ll explore some insights from the realm of psychology on making sense of complexity, focusing in particular on what Kahneman calls “the illusion of understanding.” As he describes, “The world makes much less sense than you think. The coherence comes mostly from the way your mind works.”

And the way our minds work is that we have two modes of thinking, which he describes as Systems 1 and 2. System 1 is “fast” thinking, related to intuition, and happens automatically – in fact, we can’t shut it off. And then System 2 is “slow” thinking, the source of reason and rationality; it takes effort to engage and gets tired quickly. We tend to identify ourselves with System 2, seeing ourselves as the rational decision-maker calling all the shots in how we go about our day. But much to the chagrin of our rational selves, System 1 is the real driver of most of our daily decisions.

There is nothing inherently “right” or “wrong” about these systems. They are simply the ways our brains solve problems and make sense of a complex world. One of the lessons from Kahneman’s work is that System 1 relies on heuristics, or mental shortcuts, to assess situations and make judgments. These shortcuts save a lot of computing power and time — it’s far better to determine as quickly as possible that those eyes in the bushes belong to a lion rather than risk becoming lunch.

But as with all shortcuts, these heuristics come with some risk. We often use faulty reasoning and make poor judgments. We unfairly stereotype people. We lose money on lost causes. We waste time and effort because of flawed time estimation.

And as mentioned earlier, we tend to be overconfident in our ability to understand the world.

Kahneman describes System 1 as “a machine for jumping to conclusions.” When lacking complete information (which is arguably most of the time), System 1 quickly fills in the blanks with the limited information it does have. Sometimes this works out just fine. But often, System 1’s shortcuts jump to all the wrong conclusions.

One of the ways we do this is through the narrative fallacy. When I was growing up, my parents always used to say, “hindsight is 20/20.” I didn’t understand it well as a kid, but it makes all kinds of sense now. Just think of how often, after some major event like the pandemic or the economic crash of 2008, the world is suddenly full of people who “saw it all coming.” Or who ask, “why didn’t the government do something about that – the signs were all there!”

We’re suckers for a good story. We often believe that we understand the past. Even further, we often think we can use our understanding of the past to predict the future. Kahneman tells us this is simply because there’s a compelling narrative. The pieces all seem to fit. We think hindsight is 20/20, but in reality, our vision of the past is often nearly as inaccurate as it is of the future.

Going back to the example at the start, we heard about the success of a particular water filtration technology. We heard about the team and their thorough process. Based on the info in front of us, it sounds like a winning approach. 

But what if their success could be attributed to other factors? Perhaps a local community member played a critical role in convincing her neighbors to use this water filtration device, but her crucial contributions weren’t documented. Or maybe a new government program had recently provided incentives to local communities for implementing clean water programs. Another consideration our initial narrative didn’t include: how long does “success” last? We heard that the technology was beneficial and delivered value to the community. But for how long? Maybe the community used the technology for a year, or even five, only to abandon the filtration devices altogether afterward.

These additional details illustrate what Kahneman calls WYSIATI, or “what you see is all there is.” Our minds tend to examine the pieces in front of us and string together a narrative. Since we don’t know what kinds of details might be missing, we can’t account for them. We are easily tricked by this blind spot.

Another example, not from Kahneman but a good illustration nonetheless, involves survivorship bias. During World War II, the Allied forces tried to figure out how to strengthen their fighter planes. When aircraft came back from battle, analysts looked at where the bullet holes were – and then looked for ways to reinforce those areas. Fortunately, a mathematician named Abraham Wald saw the data from a different perspective. Rather than considering the damage on the planes that returned, he asked: what happened to the planes that didn’t return? They were probably hit in other, more vulnerable areas. His insight led the Allied forces to reinforce the areas with no bullet holes.

This tendency to focus only on the ones that succeeded or survived is an easy trap. What you see is all there is. We’re rarely asking the question: but what didn’t happen? We can easily get caught up in the story before us and forget to consider that we’re missing crucial details.

As a communications professional, someone who at times describes himself as a storyteller, I admit that I find these particular insights from Kahneman to be a gut punch. It makes me worry that I’m sometimes deceiving myself about my level of understanding, since I do indeed love a good story. But recognizing this potential for stories to mislead, I feel obligated to do my best to understand the limitations and pitfalls of the narrative fallacy.

So, how do we overcome biases like the narrative fallacy? 

We’ll be disappointed if we think there’s an easy way to escape our biases and flawed thinking. Kahneman has studied these phenomena throughout his 50+ year career and says he still falls prey to the biases he studies. One of the few defenses against erroneous thinking I’ve heard Kahneman describe involves mitigating the influence of our intuitions. Not suppressing or ignoring them, but rather not trusting them as completely as we often do. We typically make decisions based on our intuitions — and then look for data to back up what we’ve already decided in our guts. Instead, we can swap this around: we can first examine objective data and later allow these data to guide our intuitions. In this way, we stand a better chance of making informed decisions.

Kahneman gives an example of this approach. When he was just out of grad school, he worked for the Israeli army to develop a better interview method. The goal was to devise a system that could more accurately predict success for army candidates. First, he developed a set of objective criteria about qualities that contributed to success for soldiers. Then, he asked the interviewers to review candidates based on those criteria prior to conducting an interview. The interviewers initially rejected his system. They felt they were simply robots checking off boxes, no longer relying on the intuition they had honed over years of experience. Kahneman then tweaked the approach to allow room for their subjective judgments, their intuitions — but only at the end of the process, after they had reviewed the objective criteria. This combined approach turned out to be a better predictor of success than any of the interview methods previously tried.

One of the key takeaways I get from studying Kahneman is “don’t believe everything you think.” Acknowledging our minds’ shortcomings can be an essential step towards better decision-making. There’s a lot more to learn from Kahneman, and I’ll definitely be revisiting his insights in future episodes. This wraps up our series on complexity for now. Join me soon for more explorations of ancient wisdom and modern knowledge. Be sure to subscribe for more episodes. And please share this with a friend if you think it will be useful to someone. Until the next time, be well!


Podcast soundtrack credit:

“Our Story Begins” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License