What is Confirmation Bias?
The confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits with our existing beliefs.
Where this bias occurs
Consider the following hypothetical situation: Jane is the manager of a local coffee shop. She is a firm believer in the motto ‘hard work equals success.’ The coffee shop, however, has seen a slump in sales for the past few months. Because of her belief in the effectiveness of ‘hard work’ as a means to success, she concludes that it is because her staff is not working hard enough.
This makes sense, as she did recently catch several employees taking extended lunch breaks. Jane consequently decides to extend the store’s business hours and threatens to dismiss any employee she sees slacking. Despite these efforts, coffee sales do not improve, and the shop is now spending more on employee wages.
She then decides to consult with other coffee shop managers in the area, who identify her store’s new, less visible location as the cause of her sales slump. Jane’s belief in hard work as the most important metric of success led her to mistakenly identify employees’ lack of effort as the reason for the store’s falling revenue while ignoring evidence that pointed to the true cause: the shop’s poor location. This is a result of the confirmation bias, which caused Jane to notice and give greater credence to evidence that fit with her pre-existing beliefs.
This bias can lead us to make poor decisions because it distorts the reality from which we draw evidence. Under experimental conditions, decision-makers have been shown to actively seek out and assign greater value to evidence that confirms their existing beliefs rather than entertaining new ones. This amounts to a bias in evidence collection: conclusions drawn from biased evidence are more likely to be false than those drawn from objective evidence, because they sit farther from reality.
In the aggregate, individual confirmation bias can have troubling implications. If we’re so deeply entrenched in our preconceptions that we only consider evidence that supports them, broader socio-political cooperation (that often requires considering other viewpoints) can be hindered. Major social divides and stalled policy-making may begin with our tendency to favor information that confirms our existing beliefs and ignore evidence that does not.
Why it happens
Confirmation bias is a cognitive shortcut we use when gathering and interpreting information. Evaluating evidence takes time and energy, and so our brain looks for shortcuts to make the process more efficient.
Our brains use shortcuts
These shortcuts are called “heuristics.” There is debate over whether confirmation bias can formally be categorized as a heuristic, but one thing is certain: it is a cognitive strategy we use to look for evidence that best supports our hypotheses, and the most readily available hypotheses are the ones we already hold.
It makes sense that we do this. We often need to make sense of information quickly, and forming new explanations or beliefs takes time. We have adapted to take the path of least resistance, often out of necessity.
Imagine our ancestors hunting. An angry animal is charging toward them, and they have only a few seconds to decide whether to hold their ground or run. There is no time to weigh every variable involved in a fully informed decision. Past experience and instinct might lead them to glance at the animal’s size and run, even though the presence of another hunting party nearby would actually tilt the odds of a successful confrontation in their favor. Many evolutionary scientists argue that our reliance on such shortcuts for quick decisions in the modern world is rooted in these survival instincts.
It makes us feel good about ourselves
Another reason we sometimes show confirmation bias is that it protects our self-esteem.
No one likes feeling bad about themselves — and realizing that a belief we value is false can have this effect. Deeply held views often form our identities, so disproving them can sometimes be painful. We might even believe being wrong suggests we lack intelligence. As a result, we often look for information that supports rather than disproves our existing beliefs.
This can also explain why confirmation bias extends to groups. In an influential 2002 peer-reviewed paper, psychologists Jennifer Lerner and Philip Tetlock posit that when we interact with others, we tend to adopt similar beliefs in order to better fit into the group.
They call this confirmatory thought, “a one-sided attempt to rationalize a particular point of view.” This is juxtaposed with exploratory thought, which entails “even-handed consideration of alternative points of view.” Confirmatory thought in interpersonal settings can produce “groupthink,” in which the desire for conformity in the group results in dysfunctional decision making. So, while confirmation bias is often an individual phenomenon, it can also take place in groups of people.
Why it is important
As mentioned above, confirmation bias can be expressed individually or in a group context. Both can be problematic and deserve careful attention.
At the individual level, confirmation bias affects our decision-making. Our decisions cannot be fully informed if we focus only on evidence that confirms our assumptions; doing so can cause us to overlook pivotal information both in our careers and in everyday life. A poorly informed decision is more likely to produce suboptimal results because it has not taken stock of the environment in which it is made. A voter might stand by a candidate while dismissing emerging facts about the candidate’s behavior. A business executive might fail to investigate a new opportunity because of a negative past experience with similar ideas. Someone who sustains this sort of thinking may also come to be labeled ‘closed-minded.’ It is good to approach situations, and the decisions they call for, with an open mind, and awareness of confirmation bias is the first step.
At a group level, it can produce and sustain the aforementioned “groupthink” phenomenon. In a culture of groupthink, decision-making can be hindered by the assumption that harmony and group cohesion are the values most crucial to success, which reduces the likelihood of disagreement within the group.
Imagine if an employee at a technology company did not disclose a revolutionary discovery she made for fear of reorienting the firm’s direction. Likewise, this bias can prevent people from becoming informed on the differing views of their fellow citizens, and by extension, engaging in the constructive discussion that many democracies are built on.
How to avoid it
When we make decisions, this bias is most likely to occur when we are gathering information. It is also likely to occur subconsciously, meaning that we are probably unaware of its influence on our decision-making.
As such, the first step to avoiding confirmation bias is being aware that it is a problem. By understanding its effect and how it works, we are more likely to identify it in our decision-making. Psychology professor and author Robert Cialdini suggests two approaches to recognizing when these biases are influencing our decision making:
- Listen to your gut feeling. We often have a physical reaction to unfavorable requests, like when a salesperson is pushing us too far. Even if we have complied with similarly unfavorable requests in the past, we should not use that precedent as a reference point.
- Recall past actions and ask yourself: “knowing what I know, if I could go back in time, would I make the same commitment?”
Second, because the bias is most likely to occur early in the decision-making process, we should focus on starting with a neutral fact base. This can be achieved by having one (or ideally, multiple) third parties gather facts to form a more objective body of information.
Third, when hypotheses are being drawn from the assembled data, decision-makers should also consider holding interpersonal discussions that explicitly aim to identify individual cognitive biases in hypothesis selection and evaluation. While it is likely impossible to eliminate confirmation bias completely, these measures can help us manage it and make better decisions in light of it.
How it all started
Confirmation bias was known to the ancient Greeks. It was written about by the classical historian Thucydides, who observed that people “entrust to careless hope” what they wish to be true. By contrast, they “use […] reason to thrust aside” what they do not wish to be true.
The phenomenon was first described as confirmation bias by the psychologist Peter Wason in 1960. In what is known as Wason’s Rule Discovery Test, participants were asked to discover a rule that applied to sequences of three numbers, and were told that the sequence ‘2-4-6’ satisfied it. To work out the rule, they could propose other sets of three numbers, and an examiner would tell them whether each proposed set satisfied the rule.
Most subjects hypothesized that the rule was ‘a sequence of even numbers’ and tested it by proposing other even sequences, such as doubling the given numbers. However, this was not the rule Wason had in mind: the rule was simply that the numbers in the set were increasing.
The experiment showed that most subjects formed a similar hypothesis and only tried number sequences that proved it rather than considering sequences that disproved it. They looked to confirm their own rule instead of breaking it.
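Wason’s setup can be sketched in a few lines of code. The hidden rule, the participant’s hypothesis, and the test sequences below are illustrative assumptions chosen to mirror the experiment’s logic, not data from the original study:

```python
# Illustrative sketch of Wason's 2-4-6 task (assumed rules, not study data).
# The experimenter's hidden rule: any strictly increasing triple.
def hidden_rule(triple):
    a, b, c = triple
    return a < b < c

# A typical participant's hypothesis: "the numbers go up by two."
def participant_hypothesis(triple):
    a, b, c = triple
    return b == a + 2 and c == b + 2

# Positive testing: only trying triples the hypothesis predicts will pass.
positive_tests = [(2, 4, 6), (8, 10, 12), (20, 22, 24)]
# Every positive test is confirmed, so the (wrong) hypothesis survives.
assert all(hidden_rule(t) for t in positive_tests)

# A disconfirming test, which participants rarely tried, exposes the error:
falsifying_test = (1, 2, 3)          # violates the hypothesis...
assert not participant_hypothesis(falsifying_test)
assert hidden_rule(falsifying_test)  # ...yet satisfies the hidden rule
```

The point of the sketch is that every confirming test passes no matter how many are tried; only a sequence the hypothesis forbids can reveal that the hypothesis is wrong.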
Example 1 – Blindness to our own faults
A major study carried out by researchers at Stanford University in 1979 explored the psychological dynamics of confirmation bias. The study recruited undergraduate students who held opposing viewpoints on capital punishment and tasked them with evaluating two fictitious studies on the topic.
One of the false studies given to participants provided data in support of the argument that capital punishment deters crime, while the other supported the opposite view (that capital punishment had no appreciable effect on overall criminality in the population).
While both studies were entirely fabricated by the Stanford researchers, they were designed to present “equally compelling” objective statistics. The researchers discovered that responses to the studies broke down by participants’ pre-existing opinions:
- The participants who initially supported the deterrence argument in favor of capital punishment considered the anti-deterrence data unconvincing and thought the data in support of their position was credible;
- Participants who held the opposing view at the beginning of the study reported the same, but in support of their stance against capital punishment.
So, after being confronted both with evidence that supported capital punishment and evidence that refuted it, both groups reported feeling more committed to their original stance. The net effect of having their position challenged was a re-entrenchment of their existing beliefs.
“Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.”
– Elizabeth Kolbert, The New Yorker
Example 2 – Effect of the internet
The “filter bubble effect” is an example of technology amplifying and facilitating our cognitive tendency toward confirmation bias. The term was coined by internet activist Eli Pariser to describe the intellectual isolation that can occur when websites use algorithms to predict the information a user would want to see, and then provide information to the user according to this prediction.
This means that as we use particular websites and content networks, those networks become more likely to serve us content we prefer, while excluding content that our browsing patterns have shown to run contrary to our preferences. We normally prefer content that confirms our beliefs because it requires less critical reflection. So, filter bubbles might favor information that confirms your existing opinions and exclude disconfirming evidence from your online experience.
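The mechanism can be illustrated with a toy sketch, where a made-up engagement score (agreement with past clicks) stands in for a real recommendation algorithm; the articles, stances, and scoring rule are all assumptions for the example:

```python
# Toy filter-bubble sketch: rank items by predicted engagement, modeled
# here as agreement with the user's past clicks. All names and the
# scoring rule are illustrative, not a real platform's algorithm.

articles = [
    {"title": "Policy A is working", "stance": "pro"},
    {"title": "Policy A is failing", "stance": "anti"},
    {"title": "Policy A: mixed results", "stance": "neutral"},
]

# The user's click history suggests a "pro" leaning.
user_history = ["pro", "pro", "neutral", "pro"]

def predicted_engagement(article, history):
    # Crude model: fraction of past clicks matching the article's stance.
    return history.count(article["stance"]) / len(history)

# Serve only the top result: disconfirming content never reaches the feed.
feed = sorted(articles,
              key=lambda a: predicted_engagement(a, user_history),
              reverse=True)[:1]
print([a["title"] for a in feed])  # only the "pro" article is shown
```

Even this crude version shows the feedback loop: each click on confirming content raises the predicted engagement of similar content, pushing disconfirming items further down the ranking.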
In his seminal book, “The Filter Bubble: What the Internet Is Hiding from You”, Pariser uses the example of internet searches for an oil spill to show the filter bubble effect:
“In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing crude oil into the Gulf of Mexico, I asked two friends to search for the term ‘BP’. They’re pretty similar — educated, white, left-leaning women who live in the Northeast. But the results they saw were quite different. One of my friends saw investment information about BP. The other saw news. For one, the first page results contained links about the oil spill; for the other, there was nothing about it except for a promotional ad from BP.”
If this had been the only source of information these women were exposed to, they would likely have formed very different conceptions of the BP oil spill. The search engine showed each of them information tailored to the beliefs reflected in her past searches, picking results predicted to match her likely reaction to the spill. Unbeknownst to them, it facilitated confirmation bias.
While the implications of this particular filter bubble may have been harmless, filter bubbles on social media platforms have been shown to influence elections by tailoring the content of campaign messages and political news to different subsets of voters. This could have a fragmenting effect that inhibits constructive democratic discussion, as different voter demographics become increasingly entrenched in their political views as a result of a curated stream of evidence that supports them.
What it is
Confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits with our existing beliefs.
Why it happens
Confirmation bias is a cognitive shortcut we use when gathering and interpreting information. Evaluating evidence takes time and energy, and so our brain looks for such shortcuts to make the process more efficient. We look for evidence that best supports our existing hypotheses because the most readily available hypotheses are the ones we already have. Another reason we sometimes show confirmation bias is that it protects our self-esteem. No one likes feeling bad about themselves, and realizing that a belief we value is false can have this effect. As a result, we often look for information that supports rather than disproves our existing beliefs.
Example #1 – Blindness to our own faults
A 1979 study by Stanford researchers found that after being confronted with equally compelling evidence in support of capital punishment and evidence that refuted it, subjects reported feeling more committed to their original stance on the issue. The net effect of having their position challenged was a re-entrenchment of their existing beliefs.
Example #2 – Establishing personalized networks online
Modern preference algorithms have a “filter bubble effect,” an example of technology amplifying and facilitating our cognitive tendency toward confirmation bias. Websites use algorithms to predict the information a user wants to see, and then provide information accordingly. We normally prefer content that confirms our beliefs because it requires less critical reflection. So, filter bubbles might exclude information that clashes with your existing opinions from your online experience. Filter bubbles and the confirmation bias they produce have been shown to influence elections and may inhibit the constructive discussion democracy rests on.
How to avoid it
Confirmation bias is most likely to occur when we are gathering the information needed to make decisions. It is also likely to occur subconsciously; we are most likely unaware of its influence on our decision-making. As such, the first step to avoiding confirmation bias is being aware of it. Because confirmation bias is most likely to occur early in the decision-making process, we should focus on starting with a neutral fact base. This can be achieved by having one (or ideally, multiple) third parties who gather facts to form a more objective body of information.