When people are highly confident in a decision, they take in information that confirms it but fail to process information that contradicts it, according to a brain-imaging study at University College London (UK).
The study, published in the scientific journal Nature Communications, helps explain the neural processes behind the confirmation bias rooted in most people’s thinking.
“We were interested in the cognitive and neural mechanisms that cause people to ignore information that contradicts their beliefs, a phenomenon known as confirmation bias,” explains Max Rollwage of University College London. “For example, climate-change skeptics may ignore the scientific evidence that indicates the existence of global warming.”
According to the researchers, “Although psychologists have long known about this bias, the underlying mechanisms were not yet understood. Our study found that our brains become blind to contrary evidence when we are very confident, which may explain why we don’t change our minds in light of new information.”
For the study, 75 participants were asked to judge whether a cloud of dots was moving toward the left or right side of a computer screen, and to rate how confident they were in their response on a sliding scale from 50% sure to 100% sure. After this initial choice, the moving dots were shown to them again and they were asked to make a final decision.
The motion information was even clearer the second time, which could help participants change their minds if they had made a mistake initially. However, when people were confident in their initial decision, they rarely used this new information to correct their errors.
People with radical political views were not as good as moderates at knowing when they were wrong
25 of the participants also completed the experiment in a magnetoencephalography (MEG) brain scanner. The researchers monitored their brain activity while they processed the movement of the dots. Based on this activity, the researchers assessed the degree to which participants processed the newly presented information.
When people were unsure of their initial choice, they accurately integrated the new evidence. However, when the participants were very confident in their initial choice, their brains were practically blind to the information that contradicted their decision but remained sensitive to the information that confirmed their choice.
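The asymmetry the researchers describe can be illustrated with a toy model (a hypothetical sketch for intuition only, not the study’s actual analysis): new evidence that confirms the initial choice is processed at full strength, while contradicting evidence is down-weighted more and more as confidence rises, vanishing entirely at 100% confidence.

```python
def update_belief(initial_choice: float, confidence: float,
                  new_evidence: float, gain: float = 1.0) -> float:
    """Toy model of confidence-gated evidence integration.

    initial_choice: signed belief (>0 = "right", <0 = "left")
    confidence:     reported confidence in [0.5, 1.0]
    new_evidence:   signed evidence from the second viewing

    Confirming evidence is integrated at full gain. Contradicting
    evidence is scaled by (1 - confidence) / 0.5, which maps
    50% confidence -> fully processed, 100% confidence -> ignored.
    """
    agrees = (new_evidence > 0) == (initial_choice > 0)
    weight = gain if agrees else gain * (1.0 - confidence) / 0.5
    return initial_choice + weight * new_evidence


# At low confidence, strong contradicting evidence flips the decision;
# at high confidence, the same evidence is almost entirely ignored.
low_conf = update_belief(initial_choice=1.0, confidence=0.5, new_evidence=-2.0)
high_conf = update_belief(initial_choice=1.0, confidence=0.95, new_evidence=-2.0)
print(low_conf)   # belief flips to negative ("left")
print(high_conf)  # belief stays positive ("right")
```

In this sketch the final decision changes sign at low confidence but barely moves at high confidence, mirroring the pattern seen in the participants’ brain activity.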
The researchers say that in real-world settings, where people are more motivated to uphold their beliefs, the effect may be even more pronounced. “Confirmation bias is often investigated in scenarios involving complex decisions on issues like politics,” Steve Fleming of University College London tells Neuroscience News.
“However, the complexity of such opinions makes it difficult to untangle the various factors that contribute to bias, such as the desire to maintain self-consistency with our friends or social group. By using simple perceptual tasks, we were able to minimize such motivational and social influences and pinpoint the altered evidence-processing that drives confirmation bias.”
The more radical, the worse
In a previous study, the research team had found that people with radical political views, at either end of the political spectrum, were not as good as moderates at knowing when they were wrong, even about something unrelated to politics.
The researchers note that understanding the mechanism behind confirmation bias could help develop interventions that reduce people’s blindness to conflicting information. “The role of inaccurate confidence in promoting confirmation bias suggests that training people to improve their self-awareness may help them make better decisions,” they conclude.