When the coronavirus first began to sweep across the United States, one of the biggest arguments was over face masks. Should you wear one — or not? Some people said masks had no benefit. Others said they could stop most germs.
Now, doctors and scientists around the world recommend wearing a mask. Sometimes the benefit might be small, but every little bit helps. By the time this became the common medical view, though, it was too late. Many people had decided they would never wear a mask.
“Initially people were told wearing a mask was not helpful,” says Jonas Kaplan. “Then we got new information. But for many people that initial belief sticks and it’s hard to change.” Kaplan is a cognitive neuroscientist, someone who studies how thinking takes place in the brain. He works at the University of Southern California in Los Angeles.
Our brains are very susceptible to what is known as confirmation bias, he says. This is the tendency to search for and believe information that agrees with what you already accept — and to walk away from information that shows you might be wrong. People who thought we shouldn’t wear masks continued to seek out information that said masks were no good or even harmful. They ignored information that showed masks could help.
Such behavior has its basis in the brain.
When we are confident about something, our brains are sensitive to someone else’s idea only when it agrees with ideas we already hold, a new study shows. A second study finds that the more confident you are in yourself, the more your brain focuses on information that agrees with your views — and sweeps opposing ideas out the door.
These new studies help to show why it’s so hard to change our minds. But if we understand this risk, we also stand a better chance of overcoming it.
I have confidence in me
Read Montague has always been interested in how other people affect someone’s decisions. He works at Virginia Tech in Roanoke where he studies computational neuroscience. He uses computers to help understand the brain. He is part of a team that wanted to see how confidence in what we believe about something might be swayed by someone else.
The scientists asked 42 people to come into the lab in pairs. Each pair was introduced. Then the two people were placed in separate rooms. There, each viewed 175 photos of houses. They were asked how much they thought each house cost. Then they got to make a small bet about the price of each house. The size of the bet showed how confident they were in assessing a home’s price. If the bet was high, the participants were probably very sure about the home’s price.
Then, each participant got to review their bets. They also saw how much their partner had bet on each house. The first person could take that information and change the initial bets. But if they were wrong, they’d lose the money they wagered.
While the participants were adjusting their bets, they lay inside a machine that performs functional magnetic resonance imaging, or fMRI. This device measures blood flow to very specific areas of the brain. Usually, scientists interpret more blood flow to a brain region as evidence that it is more active.
The scientists focused on an area known as the mPFC. That’s short for posterior medial prefrontal cortex (Paw-STEER-ee-er MEE-dee-ul Pree-FRON-tul KOR-tex). The prefrontal cortex is right behind the forehead. It plays an important role in decision making. Posterior means behind, and medial means middle. So the mPFC is in the middle back of this brain region. This brain region plays a role in tracking the results of decisions.
In the study, when someone’s partner agreed with a wager, that first person often would increase the size of his wager. The partner’s agreement seemed to reinforce the person’s initial decision. But if the partner disagreed, the first person usually decreased his wager. He became a little less confident.
And here is where confirmation bias showed up.
When a partner agreed, people tended to raise their wagers by far more than they lowered them when a partner disagreed. In other words, when the partner had the same opinion, the first person acted on that information. When the partner had a different opinion, the first person was more likely to ignore it. They’d lower their bet by just a little, to be safe. But they wouldn’t really reconsider it altogether.
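The asymmetry the researchers observed can be pictured with a toy simulation. The multipliers below are invented for illustration; they are not values measured in the study:

```python
# Toy model of the asymmetric bet updates described above.
# The specific multipliers are illustrative assumptions, not study data.

def adjust_bet(initial_bet, partner_agrees):
    """Return a revised bet after seeing a partner's opinion."""
    if partner_agrees:
        # Agreement strongly reinforces the initial choice.
        return initial_bet * 1.5
    # Disagreement is largely discounted: only a small hedge.
    return initial_bet * 0.9

bet = 10.0
print(adjust_bet(bet, partner_agrees=True))   # 15.0, a big boost
print(adjust_bet(bet, partner_agrees=False))  # 9.0, a small trim
```

The point of the sketch is the imbalance: the upward move after agreement is much larger than the downward move after disagreement, which is the signature of confirmation bias in the betting task.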
That confirmation bias also was reflected in the brain. The mPFC became less active when a partner agreed. It remained the same, however, when a partner disagreed. In other words, the mPFC registered someone else’s opinion only when that opinion matched the person’s own. “It has a big impact when it’s confirming,” says Montague, “and it has negligible impact when it’s disconfirming.”
Montague’s team published its findings in the January 2020 Nature Neuroscience.
“The [mPFC] activity tracks quite nicely with the confidence that people are having in their judgement,” says Alice Atkin. She’s a cognitive neuroscientist who did not take part in the study. She works at the University of Alberta in Edmonton, Canada. The new findings, she says, show that when someone goes to check their bets, their brain is already filtering out information that might show they are wrong.
What is making the brain ignore what it doesn’t want to hear? The answer might be the confidence itself.
Confidence breeds more confidence
Max Rollwage has always been fascinated by confirmation bias. “People seem to have this blind spot of not being willing — or not being able — to take information into account that goes against their beliefs,” he says. Rollwage studies neuroscience, or how the brain works, at University College London in England.
He is part of a team that wanted to understand how confidence in one’s opinions affects confirmation bias. To study this, the team had people watch moving dots. Some dots moved left, some moved right. Were there more dots moving to the right? Or the left? The person watching had to decide.
Then, the researchers messed with the recruits’ confidence by upping the number of dots moving — in both directions. “If you increase the number of dots moving in the targeted direction,” he says, “and at the same time let some more move the opposite way, people don’t perform better.” But, he notes, they do get “a stronger feeling of confidence.”
The scientists now asked the participants how confident they were in their choices. Then they showed the dots again. This time it was much more obvious if the dots were going right or left. The researchers asked the participants if they wanted to change their decision this second time around.
Because emotions don’t get involved, this task is a great way to assess confirmation bias, says Kaplan, who was not involved in the study. After all, no one is going to hold a passionate opinion about dots moving on a screen.
Twenty-five participants viewed the dots while wearing a cap with electrodes stuck onto their heads. This was to get a recording known as an MEG. That’s short for magnetoencephalogram (Mag-NEE-toh-en-SEF-uh-laah-gram). It records magnetic fields in the brain. “A neuron [brain cell] is like a little wire that sends electricity,” Rollwage explains. The tiny electrical signals “create a magnetic field around the wire.” So whenever a neuron fires, a tiny magnetic field builds up around the neuron.
If many neurons fire at once, that magnetic field will be big enough to show up on an MEG, measuring brain activity. An MEG cannot pinpoint the exact brain regions that turn on, as fMRI does. But MEGs can show results very quickly, as people make up their minds.
In this study, the amount of brain activity depended on someone’s confidence level. People who were not confident in their dot decision took in more information the next time they were presented with the dots. But when a person was confident she had read the dots correctly, her brain incorporated data on the second set of dots only if it agreed with her initial decision. If the dots disagreed with that decision, her brain barely processed them.
“What we found is when you are not very confident, you’re unbiased … you incorporate information objectively,” says Rollwage. However, he observes, “If you’re confident, then your brain shows a really strong confirmation bias.” He says that means “you’re basically not even processing any information that goes against your beliefs.”
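One way to picture Rollwage’s finding is as a gate that decides how much weight new evidence receives, depending on prior confidence. The threshold and weights here are illustrative assumptions, not values from the study:

```python
# Sketch of confidence-gated evidence processing, as described above.
# The threshold and weights are illustrative assumptions, not fitted values.

def weigh_new_evidence(confidence, evidence_agrees):
    """Return how much weight (0 to 1) new evidence receives."""
    if confidence < 0.5:
        # Low confidence: evidence is weighed objectively either way.
        return 1.0
    # High confidence: confirming evidence gets through,
    # while disconfirming evidence is barely processed.
    return 1.0 if evidence_agrees else 0.1

print(weigh_new_evidence(0.3, evidence_agrees=False))  # 1.0
print(weigh_new_evidence(0.9, evidence_agrees=False))  # 0.1
```

In this toy rule, only the combination of high confidence and disagreeing evidence gets filtered out, matching the pattern Rollwage describes: unbiased processing when unsure, strong confirmation bias when sure.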
Rollwage and his colleagues published their results May 26 in Nature Communications.
Fighting this bias with knowledge
The first paper “suggests there’s a filter in the brain,” Atkin says. “Before you reaffirm your judgement, it’s already filtering this information.” The second paper “suggests it’s confidence itself acting as the filter.” Confidence leads the brain to confirmation bias.
The two papers “speak to each other,” she says. “I think it’s a good example of how there’s never going to be one study that tells you everything you want to know.” Combining findings from several studies that examine the same topic in different ways can give a better picture of how confirmation bias works, she says.
This type of bias isn’t always a bad thing, says Kaplan at the University of Southern California. “I think there are reasons,” he says, “why the brain is set up to favor our current beliefs over new beliefs.” It’s good to be confident if you’re right. A problem emerges only when you’re confident but wrong.
But, Kaplan notes, whether we’re right or wrong, “The internet is set up to feed our confirmation bias.” It keeps supplying at least some information “that’s similar to what we already believe.” Our friends aren’t any help, either. “In social media, [we] tend to connect with people who share our beliefs,” he notes. We can easily discount information that we don’t agree with — and find more information we want to believe.
The trick to fighting this bias, Montague says, is to recognize it. Realize that you are looking for things that reinforce what you already believe. Try talking to someone who doesn’t agree with you. Listen. Then, he says, wait and evaluate what you heard. “Realize that what you think in your heart of hearts might not be the best answer.”
It’s hard work, he admits. And what works for one person may not work for another. Keep in mind, he says, “Biases are built in deep.” In fact, he adds, “They’re instinctual and hard to detect.”