
What makes some people radical and prone to taking extreme views? Radical, violent political movements have received plenty of attention in the news cycle, but even non-violent radicalism is a significant impediment to the compromises necessary to build a functional society. At the same time, some things we now take for granted, like women having the right to vote or same-sex couples having the right to marry, were once at the radical fringes of society. Given its importance for the evolution of societies, radicalism seems worth exploring.

One common feature of radicalism is confidence in the rightness of one's ideas, even when they go against those of society at large. So why do radicals have so much certainty? A new study pins the blame on faulty metacognition, the process by which people recognize when their ideas might not be correct and update their beliefs accordingly.

Cognition, how meta

Our brains are not simply decision-making boxes. We're constantly evaluating how certain we are about our ideas, which can help us minimize risk: if we're not sure whether our opponent is bluffing, we're less likely to go all-in on a bet. Then, as more information becomes available, we'll generally re-evaluate our earlier beliefs. If we end up watching a player pull off a series of bluffs, we'll factor that in the next time we have to weigh the odds.

These thought processes are termed "metacognition," or thinking about thinking. They're essential to keeping our behavior both flexible and reality-based. And, given that radical beliefs tend to be associated with inflexible thinking, a team of researchers from University College London decided to see how metacognition works in individuals with radical political views.

Of course, metacognition isn't the only mental process at work. There are various forms of motivated reasoning and cultural cognition, in which people protect themselves from ideas that run against their beliefs. To avoid complications like these and focus squarely on metacognition, the researchers used a simple task with no obvious cultural implications: estimate how many dots were in an area after being shown only a brief glimpse of it, then describe how confident they were in that estimate.

The confidence estimate was there to force participants to do a bit of metacognition by evaluating how well they thought they had performed. The effectiveness of their metacognition could then be judged by comparing that confidence with the accuracy of their answers. To motivate people to make accurate assessments of their own performance, financial rewards were given when their confidence matched their performance.
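
The paper's own analysis is more involved than this, but the general idea can be illustrated with a simple measure: how reliably do confidence ratings separate correct answers from incorrect ones? The sketch below applies that measure to simulated data; the function name, rating scale, and numbers are all hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def metacognitive_sensitivity(correct, confidence):
    """Probability that a randomly chosen correct trial carries higher
    confidence than a randomly chosen incorrect one (a type-2 ROC area).
    0.5 means confidence tells you nothing about accuracy."""
    conf_right = confidence[correct]
    conf_wrong = confidence[~correct]
    higher = (conf_right[:, None] > conf_wrong[None, :]).mean()
    ties = (conf_right[:, None] == conf_wrong[None, :]).mean()
    return higher + 0.5 * ties

# A hypothetical observer: 200 dot-estimation trials, about 70% correct,
# confidence reported on a 1-6 scale.
n_trials = 200
correct = rng.random(n_trials) < 0.7

# Good metacognition: confidence tends to be higher on correct trials.
good_conf = np.where(correct,
                     rng.integers(3, 7, n_trials),
                     rng.integers(1, 5, n_trials))
# Poor metacognition: confidence is unrelated to accuracy.
poor_conf = rng.integers(1, 7, n_trials)

print(metacognitive_sensitivity(correct, good_conf))  # well above 0.5
print(metacognitive_sensitivity(correct, poor_conf))  # close to 0.5
```

In a toy measure like this, the pattern the researchers describe for more radical participants would show up as a score drifting toward 0.5, with confidence doing a poor job of flagging which answers were actually right.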

In a second experiment, participants made the same estimate as in the first but were then given some additional information about the dot density of the image, which gave them the opportunity to update their confidence based on new evidence. The work was done in the US, with an initial experiment involving 381 people and a replication with another 417.
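
Neither the study's stimuli nor its actual updating model are reproduced here, but the logic of this second experiment can be sketched with a simple Bayesian update, in which post-decision evidence shifts the probability that the initial answer was right. The likelihood values below are made-up assumptions, purely for illustration.

```python
def updated_confidence(prior_confidence, p_evidence_if_right, p_evidence_if_wrong):
    """Bayesian revision of confidence that an initial choice was correct,
    given how likely the new evidence is under each possibility."""
    numerator = prior_confidence * p_evidence_if_right
    return numerator / (numerator + (1 - prior_confidence) * p_evidence_if_wrong)

# You were 60% sure of your dot estimate; suppose the extra information is
# twice as likely to appear if you were right as if you were wrong.
print(updated_confidence(0.60, 0.8, 0.4))  # ~0.75: confidence should rise

# The article's finding is that more radical participants shifted their
# confidence less than this kind of evidence would warrant.
```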

Radical identification

To spot the radicals, all participants were given a set of questions about their beliefs on a variety of topics, designed to get at things like dogmatism, intolerance, and authoritarian tendencies. These were coupled with a few questions that determined whether they sat at the far-left or far-right end of the political spectrum. Their confidence results from the dot-estimation test were then compared to their tendencies toward radicalism.

Radicals, in general, performed pretty poorly when evaluating their own performance, showing lots of misplaced confidence. "Dogmatic people manifest a lowered capacity to discriminate between their correct and incorrect decisions," is how the researchers put it. The same was true for authoritarians. Critically, this lowered capacity wasn't simply the product of general overconfidence, since that trait had been assessed separately in some of the survey questions.

The same issue was apparent in the tests in which participants were given additional information and then given a chance to revise their opinions. And, informatively, people with radical beliefs were less likely to update their confidence in response to the additional information, a feature that the authors consider a defect specific to metacognition.

The authors sum up their work by writing that "more radical participants displayed less insight into the correctness of their choices and reduced updating of their confidence when presented with post-decision evidence." That's a bit surprising, given that most research into this area has focused on the ideas themselves and suggested that confidence is simply a mechanism for protecting those ideas.

It's important not to think of this as a way to "fix" radicalism; as noted above, radicals can play an essential role in society. But it would be a positive development to convince a subset of radicals that their ideas are not best promoted through violence. And this study suggests that will be a hard nut to crack, given that radicals don't tend to re-evaluate their own conclusions.

Current Biology, 2018. DOI: 10.1016/j.cub.2018.10.053 (About DOIs).
