
A Framework for Spotting Cognitive Biases

How our minds consistently process information in ways that lead us to erroneous conclusions

Mental performance wasn't what first hooked me into psychology.

What caught my attention originally was Daniel Kahneman's work, Thinking, Fast and Slow, and the different factors that lead to biased processing and the ways that people make sense of the world. Once you learn about these biases, you can see them everywhere and in everyone.

Your understanding of human psychology and behavior can improve significantly just by grasping these concepts. You’ll realize that we’re pretty bad at spotting mistakes in our own thinking, that we tend to make the same mistakes repeatedly, that we each behave as if the world revolves around us, and that we have a hard time finding any sort of objective “truth.”

For performance and flourishing in any domain, these errors matter a lot.

I believe that much of performance is a function of decision-making. At the highest levels of sport, what people decide to do, and when, is often the difference between a "good" and a "bad" outcome, not whether they have the skill.

For example, most NBA shooters have a form good enough to get the ball in the hoop. It's not a matter of "how." It’s a matter of when and where.

Only a handful take consistently good shots. It's a matter of the decisions they make, and it influences their status in the NBA, their career earnings, and their chances of winning a title.

Applying the framework of cognitive biases to sport performance, including the work I do in the NBA draft and have done in other talent identification and selection spaces, allows me to make quicker, more accurate decisions and to spot the processes that derail good decision-making.

If we want to understand how people make the decisions they make and why they behave the way they behave, we have to start with understanding the ways people commonly make sense of the world and the strengths and pitfalls of those sense-making strategies.

A Starting Point for Understanding Cognitive Biases

A recent research paper (Oeberst & Imhoff, 2023) suggests that cognitive biases all share the same recipe.

That formula:

Prior belief + belief-consistent information processing (i.e., confirmation bias).

In other words, people come into situations with a prior understanding or mental model of the circumstances and then seek out information that confirms their understanding of the world.
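That recipe can be made concrete with a toy simulation. The sketch below is my illustration, not anything from the paper: the function name, the `discount` knob, and the numbers are all invented. The idea is simply that evidence agreeing with the current belief gets full weight, while contradicting evidence gets discounted.

```python
# Toy sketch of the "prior belief + belief-consistent processing" recipe.
# All names and parameter values here are illustrative, not from the paper.

def biased_update(belief: float, evidence: list[float],
                  discount: float = 0.2, rate: float = 0.5) -> float:
    """Update a belief (on a 0-to-1 scale), discounting contradicting evidence."""
    for e in evidence:
        agrees = (e >= 0.5) == (belief >= 0.5)   # does this datum match the belief?
        weight = rate if agrees else rate * discount  # confirming info counts more
        belief += weight * (e - belief)
    return belief

# Start fairly convinced (0.8) and feed in perfectly mixed evidence.
# An unbiased updater would drift toward 0.5; this one stays high.
print(biased_update(0.8, [0.9, 0.1, 0.9, 0.1]))
```

Even with evidence split 50/50, the belief barely moves, which is the whole point: the prior plus selective weighting reproduces confirmation bias without any extra machinery.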

This framework is consistent with predictive processing theory, which suggests that the brain is constantly producing predictions about the world and what we believe will happen next, based on prior experiences and expectations.

Once the gap between what we predict and what we experience is essentially 0, the brain can rest easy and stop trying to correct for a gap (what's been called an "error"). Because our brains are trying to optimize for efficiency before accuracy, it pays to process the world in a way that’s consistent with our existing mental models, rather than constantly updating to accommodate the slight nuances and variations that exist in our day-to-day experience.
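The loop described above can be sketched in a few lines. This is a cartoon of predictive processing, not the theory itself; the function name, learning rate, and tolerance are assumptions I've made for illustration.

```python
# Toy sketch of predictive processing: keep correcting a belief until the
# gap between prediction and experience (the "error") is essentially zero.
# Parameter values are illustrative assumptions, not from the literature.

def update_belief(belief: float, observation: float,
                  learning_rate: float = 0.3, tolerance: float = 0.01) -> float:
    """Nudge a prediction toward an observation until the error is tiny."""
    error = observation - belief          # gap between prediction and experience
    while abs(error) > tolerance:         # keep correcting while the gap is large
        belief += learning_rate * error   # small update in the error's direction
        error = observation - belief      # re-check the gap
    return belief

print(update_belief(belief=0.0, observation=1.0))  # settles within tolerance of 1.0
```

Note what the loop does once the error dips below tolerance: it stops. That stopping rule is the efficiency-before-accuracy trade the paragraph above describes.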

In the case of cognitive biases, if we process the information we experience in a way that's consistent with our prior beliefs, we confirm our predictions faster. It's much less taxing on our minds and our energy budget, and as a result, it's the preferred mechanism for getting to comfort and resolution as quickly as possible.

This approach, unfortunately, also cuts processing short and leads to premature, mistaken conclusions.

The end result is more mistakes or misreads.

If you want to start making more accurate decisions, then, the starting point is forcing yourself to work just a bit harder in the way that you process the world around you. It’s asking yourself questions, consciously challenging the conclusions you arrive at, and questioning your gut instinct, which some data suggests is the feeling of concluding a search for information rather than an accurate assessment of the situation (see Noise by Kahneman, Sibony, and Sunstein for more on that).

Since decision-making lives at the heart of high performance for most professions and performers, recognizing our unhelpful (but efficient) tendencies to conclude early and to search for information that matches our existing hypotheses can help us uncover where we regularly make mistakes.

The mistakes I commonly see are:

  • The Spotlight Effect: Believing that people are thinking about you more than they actually are because you think about yourself a lot!

This happens all the time in professional sports, and for any high performer who’s ever felt like crap after giving a public talk or presentation. We are so good at identifying what we think we could do better that we believe other people are thinking about how badly we did, too.

For most people I know who have had this experience, there’s a huge mismatch between what happens after their presentation and what’s happening in their minds. While the crowd is congratulating them, they’re busy berating themselves and telling themselves the crowd doesn’t mean it.

That’s an error in information processing.

  • Anchoring: Rigidly holding a starting opinion based on information from either a trusted source or past experience, without adjusting for new information.

If you’ve ever argued with someone who clearly values one aspect of a decision more than any others, chances are you’re dealing with someone who’s anchored.

Anchoring typically happens in negotiations, where someone gives either a high or low number to alter the other party’s expectations and expected value of the deal. But it can also happen in other contexts, where people expect others to conform to their view or make points to dismiss and discredit one part of a theory.

  • Loss Aversion: The tendency to value avoiding loss over adding gain.

If you ask most people, they’d rather avoid losing $50 than find $50, even though the two outcomes change their bank balance by exactly the same amount; the loss simply looms larger.

What we typically think of as “conservative decisions” are often tendencies toward loss aversion.
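Loss aversion has a well-known formalization in Kahneman and Tversky's prospect theory, where losses are weighted more steeply than gains. The sketch below uses the parameter values they estimated (alpha = 0.88, lambda = 2.25), but treat it as an illustration of the shape, not a model to deploy.

```python
# Sketch of the prospect-theory value function, which formalizes loss aversion.
# alpha = 0.88 and lam = 2.25 are Tversky & Kahneman's (1992) estimates;
# everything else here is my illustrative framing.

def subjective_value(outcome: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Felt value of a gain or loss relative to a reference point of zero."""
    if outcome >= 0:
        return outcome ** alpha             # gains: diminishing sensitivity
    return -lam * ((-outcome) ** alpha)     # losses: steeper, scaled by lambda

gain = subjective_value(50)    # how good finding $50 feels
loss = subjective_value(-50)   # how bad losing $50 feels
print(abs(loss) / gain)        # losses loom roughly twice as large
```

The ratio comes out to lambda itself, which is why "losses hurt about twice as much as equivalent gains feel good" is the standard one-line summary of loss aversion.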

Of course, these are just a few examples. Wherever you look, and in whatever discussion you have, chances are you’ll find a bias (or two, at least).

PS:

Thank you for the feedback on last week’s format. Moving forward, I’ll be breaking down my longer-form articles into shorter pieces, and then sharing the completed product in last week’s format at the end.
