Biases and Beliefs

Reading time: 5 minutes

Here's a look at what the research suggests underlies the cognitive biases behavioral scientists have identified and popularized.

On the left-hand side, you'll see the prior belief that we look to confirm.

In the middle, you'll see the cognitive bias we're all prone to engaging in, which leads us to process information in a way that confirms that prior belief.

Beliefs

Beliefs are "hypotheses about some aspect of the world" that we feel reasonably certain about. Most of us think that our beliefs are accurate and that they're based on good data, which leads us to feel that sense of certainty. You’ll learn later where this confidence in our accuracy comes from, and how this leads us astray (a mental phenomenon that occurs across cognitive processes, including thinking and memory).

There's a good deal of research that suggests our minds are uniquely tuned to generate beliefs about the world. We can form beliefs based on repeatable patterns of data, but we can also form beliefs based on little or no data at all (like superstitions). Because our brains are constantly searching for sense and meaning, belief formation, even in the absence of real data, gives our minds a simple framework for organizing and categorizing the experiences we have and making them easier to process. We’re naturally prone to categorizing, which drastically reduces our cognitive load.

What's even more confusing is that some of these data-free beliefs are actually helpful to us.

For example, people who subscribe to nearly any form of organized religion reap the benefits of feeling connected to something bigger than themselves and are often psychologically healthier than their agnostic and atheist counterparts. Religious narratives promote a feeling of predictability, control, and justice, each of which helps us to feel better about the world we live in. Despite the fact that there’s no empirical support for any particular religion, subscribing to one can boost your psychological health.

The power of belief is best illustrated, though, in the research on the placebo effect. For many people and many medical procedures or outcomes, simply believing that something will help can bring about meaningful change and healing.

Breaking down beliefs

There are two types of beliefs that we hold.

The first is what's called a "philosophical belief," but might better be understood as an (un)informed opinion.

These beliefs are things like "the world is good," "ramen is the best type of noodle" (false), or "climate change is real." Irrespective of the factual information supporting any of these claims, philosophical beliefs are things we arrive at based on our own experiences and our interpretations of those experiences.

The second type of belief is one we're all familiar with - "knowledge." These beliefs are evidence-based.

When it comes to biases, understanding these two types of beliefs matters because:

  • Beliefs don't have to be true.

  • Beliefs may or may not reflect real processing or depth of thought.

  • Beliefs can reflect a wide range of certainty.

  • Beliefs can be tested, partly tested, or not testable at all.

Each of these factors shapes the information people subsequently seek out to confirm what they already believe. There's ample evidence that beliefs form the foundation for how we perceive the world, and that we tend to look for features in the environment that conform to our beliefs rather than to alternative hypotheses (Zuckerman et al., 1995).

This selective perception leads to a host of cognitive issues downstream.

For example, people often mistakenly perceive information as confirming an existing belief, and they quickly discredit information that is inconsistent with their underlying beliefs. People also stick to their guns, despite disconfirming evidence.

We perform all sorts of mental gymnastics to maintain what we believe to be true.

Anyone who's ever gotten into a meaningful argument has seen these gymnastics first-hand. The research identifies a range of acts we perform to make our beliefs make sense, like:

  • Positive testing: scanning the environment for information that's more likely to fit what we already believe

  • Congeniality bias: choosing belief-consistent information over inconsistent information

  • Biased assimilation: misperceiving new information as confirming our prior beliefs, even when it actually conflicts with them

  • Motivated reasoning: discrediting information that's inconsistent with our prior beliefs

  • Belief perseverance: sticking to our beliefs despite evidence to the contrary

  • Subtyping: filing new information that doesn't fit into a special category, like "exceptions to the rule"

Biases

What the above mental gymnastics illustrate is our tendency toward bias: the systematic processing of information to confirm our existing ideas.

At each stage of processing (attention, evaluation, reconstruction, and search), we are motivated to find information that matches our existing beliefs about the world. And because there are so many possible ways to make information fit what we believe, you can imagine the full repertoire people deploy, internally and externally, to make the world conform to their view.

What makes this process even stickier is that some research shows that people engage in these same cognitive errors even when there is no real skin in the game. As long as a belief is present, bias will show up.

What the research suggests is that our biases are rooted in a consistent set of beliefs about the world, and that the way we naturally process information reveals those biases in different contexts.

Many of these beliefs are commonly shared across people - things like "I make correct assessments" or "my experience is a reasonable reference point (the world revolves around me)." When we pair these beliefs with our tendency to process information in a way that confirms our own reference point, we end up with cognitive errors that seem obvious from the outside looking in and make complete sense from the inside out.

In the table above, you’ll see a list of commonly held beliefs that tend to cut across people. As you scan it, you’ll probably also find things that you or your loved ones struggle with.

At work, the most common biases people fall into include:

  • The spotlight effect: believing that people pay more attention to you than they actually do

  • Better-than-average effect: believing that we are better than most people at most things (like when your average worker believes they’re due for a raise because they met but didn’t exceed standards)

  • Fundamental attribution error: ascribing our own failures to external causes but other people’s failures to internal ones

As you can see, each of these is based on a fundamental belief about the world, paired with the tendency we all have to process new information in a way that confirms, or at the very least doesn’t challenge, our current view.

What this research means for all of us is that we have to work harder to see the world more accurately. Further, doing so is going to be scary because we’re likely to find some things that fundamentally shake our belief system. Making good decisions based on real data often means setting aside what we want or believe to be true in service of finding the truth that lives “somewhere in the middle.”

For coaches and leaders, this can be especially challenging.

High performers often rise to the top in part because of an unshakeable belief in themselves and, as a result, an unshakeable belief in their beliefs. As they climb, however, and work with more people, their perspective has to shift from “I am right” to “I’d like to find what’s right.” That requires suspending judgment, searching for new information, being willing to be wrong, and entertaining multiple views and arguments at once.

The best high-performance knowledge workers I’ve been around take a long time to form beliefs and work hard to separate themselves from the data they take in. They’re able to hold multiple ideas in mind and can identify which ideas are based on their own beliefs (read: their own biases), which might have come from the biased processing of others, and which are most grounded in fact.

Ultimately, this type of leadership creates an environment that emphasizes finding what’s right over being right. That philosophy leads to the best outcomes over time, because people aren’t focused on advancing their own agendas but on what moves the group or organization forward.

Everyone wins when the best ideas, based on the best data, drive decisions.
