Existing judgments leave us less inclined to change our opinions when confronted with views we disagree with

An international research team comprising neuroscientists at Virginia Tech, University College London and the University of London has revealed brain mechanisms and functional regions that underlie confirmation bias, a phenomenon in which people strongly favor information that reinforces their existing opinions over information that contradicts them.

The study, published this week in Nature Neuroscience, provides insight into a fundamental property of belief formation that has been documented by psychologists and economists, as well as in popular literature, including George Orwell’s “1984.”

People tend to disregard information that conflicts with their past choices, no matter how authoritative or factual the new information may be.

“We are watching this all over the news,” said P. Read Montague, a professor and director of the Human Neuroimaging Laboratory at the Fralin Biomedical Research Institute at VTC, and an honorary professor at the Wellcome Center for Human Neuroimaging at University College London.

“It is the mystery of decision-making. People routinely make decisions that cut across their own best interests.

We clearly know this in areas like drug abuse, overeating, or any repetitive activity during which people disregard obviously beneficial advice.”

In the study, participants from Roanoke and Blacksburg, Virginia, arrived at the laboratory in pairs and were introduced to each other before retiring to individual cubicles.

They played a real estate game, made wagers for cash, and then re-evaluated their decisions in light of the wagers made by their partners.

Researchers combined functional magnetic resonance imaging (fMRI) with the behavioral task. Participants’ blood oxygen level-dependent (BOLD) signals were examined through a moderated mediation analysis, which captures the relationship between brain activity and behavioral performance and tests whether that mediation differs between conditions of agreement and disagreement.
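To make the analysis concrete, here is a minimal, purely illustrative sketch of a regression-based moderated mediation run on simulated data. It is not the authors’ pipeline: the variable names (opinion_strength, bold, wager_change, agree), the simulated effect sizes, and the simple two-regression approach are all assumptions introduced for illustration.

```python
# Illustrative sketch only; variables and effect sizes are hypothetical, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
agree = rng.integers(0, 2, n)            # 1 = partner agreed, 0 = disagreed (the moderator)
opinion_strength = rng.normal(size=n)    # strength of the partner's opinion (the predictor)

# Simulate a mediation path that is stronger under agreement than under disagreement.
bold = (0.6 * agree + 0.1 * (1 - agree)) * opinion_strength + rng.normal(size=n)
wager_change = (0.7 * agree + 0.1 * (1 - agree)) * bold + rng.normal(size=n)

df = pd.DataFrame(dict(agree=agree, opinion_strength=opinion_strength,
                       bold=bold, wager_change=wager_change))

# Path a: partner's opinion strength -> BOLD response, moderated by agreement.
path_a = smf.ols("bold ~ opinion_strength * agree", data=df).fit()
# Path b: BOLD response -> wager change, moderated by agreement (opinion strength as covariate).
path_b = smf.ols("wager_change ~ bold * agree + opinion_strength", data=df).fit()

# Conditional indirect effect (a * b) at each level of the moderator.
for level in (0, 1):
    a = path_a.params["opinion_strength"] + level * path_a.params["opinion_strength:agree"]
    b = path_b.params["bold"] + level * path_b.params["bold:agree"]
    print(f"agree={level}: indirect effect ≈ {a * b:.3f}")
```

On this simulated data the indirect (mediated) effect comes out much larger in the agreement condition than in the disagreement condition, which is the kind of difference a moderated mediation test is designed to detect.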

When participants learned their partners agreed with their opinions, they significantly increased their bets, signaling greater confidence in their decision. When their partners disagreed, participants decreased their wagers only slightly.

The impact of the partner’s opinion was far greater when it confirmed the player’s judgment, and the partner’s opinion was more likely to be disregarded when it was contradictory — consistent with confirmation bias.
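As a toy illustration of that asymmetry (not taken from the paper), consider an update rule that raises a bet sharply after agreement but trims it only slightly after disagreement; the weights below are hypothetical.

```python
# Hypothetical asymmetric update rule; the boost/cut weights are illustrative, not measured values.
def update_bet(current_bet: float, partner_agrees: bool,
               boost_if_agree: float = 0.5, cut_if_disagree: float = 0.1) -> float:
    """Raise the bet sharply on agreement, lower it only slightly on disagreement."""
    if partner_agrees:
        return current_bet * (1 + boost_if_agree)
    return current_bet * (1 - cut_if_disagree)

bet = 10.0
for agrees in (True, False, True, False):
    bet = update_bet(bet, agrees)
    print(f"partner {'agrees' if agrees else 'disagrees'} -> bet = {bet:.2f}")
# Even with equal numbers of agreements and disagreements, the bet drifts upward,
# mirroring the asymmetry described above.
```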

The functional brain imaging data revealed a region whose activity modulation was associated with decision-making and memory.

The posterior medial prefrontal cortex mediated the influence of the strength of confirming opinions more than that of disconfirming opinions, and it tracked agreements more closely than disagreements.

“We are using functional neuroimaging and computational neuroscience to take apart the mechanisms that look at why we are biased to make one kind of decision versus another, what neural structures are involved, and how do these change across development and across states of health, disease and brain injury,” Montague said.

“In that sense, the study contributes directly to understanding why people make decisions.

It has implications for institutions all across our culture where people make judgments that are either against or congruent with their own interests.”

Montague is also the director of the Fralin Biomedical Research Institute’s Computational Psychiatry Unit, and a professor in the department of physics in Virginia Tech’s College of Science, and in the department of psychiatry and behavioral medicine in the Virginia Tech Carilion School of Medicine.

Terry Lohrenz, a research assistant professor in the Montague laboratory at Fralin Biomedical Research Institute, worked on the design and execution of the research.

Andreas Kappes, of the University of London, and Tali Sharot, of University College London, are the corresponding and senior authors of the study.

Ann H. Harvey, also on the study team, is a former research scientist at the Fralin Biomedical Research Institute who is now with the Museum of Science and Industry in Chicago.

Funding: The research was funded by a Wellcome Trust Principal Research Fellowship to Montague.


1 The Confirmation Bias
The confirmation bias is based on the finding that people tend to listen more often to information that confirms the beliefs they already have, favoring it over information that challenges those beliefs.

This bias can be particularly evident when it comes to issues like gun control and global warming. Instead of listening to the opposing side and considering all of the facts in a logical and rational manner, people tend simply to look for things that reinforce what they already think is true.

In many cases, people on two sides of an issue can listen to the same story, and each will walk away with a different interpretation that they feel validates their existing point of view. This often indicates that the confirmation bias is at work, shaping their opinions.

2 The Hindsight Bias
The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are.

In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court. Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas’s confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.

This tendency to look back on events and believe that we “knew it all along” is surprisingly prevalent. Following exams, students often look back on questions and think “Of course! I knew that!” even though they got those questions wrong the first time around. Investors look back and believe that they could have predicted which tech companies would become dominant forces.

The hindsight bias occurs for a combination of reasons, including our ability to “misremember” previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.

3 The Anchoring Bias
We also tend to be overly influenced by the first piece of information that we hear, a phenomenon referred to as the anchoring bias or anchoring effect. For example, the first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based. Researchers have even found that having participants choose a completely random number can influence what people guess when asked unrelated questions, such as how many countries there are in Africa.

This tricky little cognitive bias doesn’t just influence things like salary or price negotiations. Doctors, for example, can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments. If you ever see a new doctor and she asks you to tell her your whole story even though everything should be in your records, this is why. By starting fresh, the physician, like anyone trying to get to the bottom of a problem, often uncovers a vital piece of information that was previously overlooked as a result of the anchoring bias.

4 The Misinformation Effect
Our memories of particular events also tend to be heavily influenced by things that happened after the actual event itself, a phenomenon known as the misinformation effect. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

In one classic experiment by memory expert Elizabeth Loftus, people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”

When the witnesses were then questioned a week later, the researchers discovered that this small change in how questions were presented led participants to recall things that they did not actually witness. When asked whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.

5 The Actor-Observer Bias
The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation. When it comes to our own actions, we are often far too likely to attribute things to external influences. You might complain that you botched an important meeting because you had jet lag or that you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag) and a fellow student bombed a test because she lacks diligence and intelligence (and not because she took the same test as you with all those trick questions).

6 The False-Consensus Effect
People also have a surprising tendency to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values, an inclination known as the false consensus effect. This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions.

Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion, even among people outside our circle of family and friends.

Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem. It allows us to feel “normal” and maintain a positive view of ourselves in relation to other people.

7 The Halo Effect
Researchers have found that students tend to rate good-looking teachers as smarter, kinder, and funnier than less attractive instructors. This tendency for our initial impression of a person to influence what we think of them overall is known as the halo effect.

This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.

Also known as the “physical attractiveness stereotype” or the “what is beautiful is good” principle, the halo effect influences us, and is used to influence others, almost every day. Think of a product marketed on TV by a well-dressed, well-groomed, and confident woman versus a woman who is poorly dressed and mumbling. Which appearance would be more likely to prompt you to go out and buy the product?

8 The Self-Serving Bias
Another tricky cognitive bias that distorts your thinking is known as the self-serving bias. Basically, people tend to give themselves credit for successes but lay the blame for failures on outside causes.

When you do well on a project, you probably assume that it’s because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck. This bias does serve an important role; it helps protect our self-esteem. However, it can often also lead to faulty attributions, such as blaming others for our own shortcomings.

9 The Availability Heuristic
After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are. This tendency to estimate the probability of something happening based on how many examples readily come to mind is known as the availability heuristic. It is essentially a mental shortcut designed to save us time when we are trying to determine risk.

The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions. Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics tell us.

10 The Optimism Bias
Another cognitive bias that has its roots in the availability heuristic is known as the optimism bias. Essentially, we tend to be too optimistic for our own good. We overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. We assume that events like divorce, job loss, illness, and death happen to other people.

So what impact does this sometimes unrealistic optimism really have on our lives? It can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt.

The bad news is that research has found that this optimism bias is incredibly difficult to reduce. There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals. So while cognitive biases can distort our thinking and sometimes lead to poor decisions, they are not always so bad.


Source:
Virginia Tech
Media Contacts:
John Pastor – Virginia Tech

Original Research: Closed access
“Confirmation bias in the utilization of others’ opinion strength”. P. Read Montague et al.
Nature Neuroscience doi:10.1038/s41593-019-0549-2.
