Article Review: "Countering Conspiracy Theory Beliefs: Understanding the conjunction fallacy and considering disconfirming evidence"
Conspiracy theories are quickly becoming one of the most studied forms of unfounded belief, in large part because of how politically consequential they have become: the January 6 insurrection was organized around the conspiracy theory that the 2020 election was stolen. They have also drawn attention because, despite their superficial implausibility, roughly half of the country believes one or more of them. Academics around the world have thus been trying to understand the whys, whos, and hows of conspiracy beliefs, and how such beliefs can be undermined before they take hold.
Last year, Lindsay Stall and John Petrocelli published a study suggesting that training people to recognize fallacious arguments reduces belief in novel conspiracy theories. Specifically, they focused on the conjunction fallacy: the tendency to judge the co-occurrence of two events as more probable than one of those events occurring on its own.
A common illustration of this effect concerns Linda, a student deeply concerned with issues of discrimination and social justice. Given a description of Linda, people will quite confidently claim that she is more likely to be a feminist bank teller than simply a bank teller, even though the set of bank tellers includes every feminist bank teller.
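The fallacy can be made concrete with a few lines of arithmetic. The probabilities below are invented purely for illustration; the only real constraint at work is the conjunction rule, P(A and B) ≤ P(A):

```python
# Hypothetical probabilities, chosen only to illustrate the conjunction rule.
p_bank_teller = 0.05            # P(Linda is a bank teller) -- assumed value
p_feminist_given_teller = 0.80  # P(Linda is a feminist, given bank teller) -- assumed value

# The conjunction "feminist bank teller" is the product of the two.
p_feminist_bank_teller = p_bank_teller * p_feminist_given_teller

# However plausible the description makes "feminist" sound, the conjunction
# can never be more probable than "bank teller" alone.
print(p_feminist_bank_teller <= p_bank_teller)  # True
```

No matter what numbers you plug in, the product can never exceed either factor, which is exactly what people's intuitions about Linda violate.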
Previous studies have shown that people who are more likely to make conjunction errors are also more likely to believe in conspiracy theories, perhaps pointing to an underlying disposition that favors intuition over critical thinking. This might explain the conspiracist’s tendency to neglect the fact that each new component added to a conspiracy narrative makes the narrative as a whole less likely to be true. I haven’t met many Satanists in my day, so there must be even fewer Democratic, cannibalistic, pedophilic Satanists. Yet adherents of QAnon claim that these people are everywhere.
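The same arithmetic shows why piling on components backfires. The individual probabilities in this sketch are made up and treated as independent for simplicity; the point is only that each added conjunct multiplies the joint probability downward:

```python
# Made-up probabilities for each component of a QAnon-style narrative,
# treated as independent purely for the sake of the sketch.
components = {
    "Satanist": 0.01,
    "cannibal": 0.001,
    "pedophile": 0.01,
    "Democrat": 0.3,
}

joint = 1.0
for name, p in components.items():
    joint *= p  # each new conjunct can only shrink the joint probability
    print(f"... and a {name}: joint probability is now {joint:.2e}")
```

Whatever the true numbers, every component added to the story can only make the full conjunction rarer, never more common.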
Or consider a scenario proposed in Stall and Petrocelli’s paper:
“Following the 2020 presidential election won by Joe Biden, many people claimed that incumbent nominee, Donald Trump, was somehow cheated and the fraudulent outcome of the election was being upheld. Of course, the upholding of a fraudulent outcome in favor of Joe Biden would have required mass collusion by voters, poll workers, media, Bill Barr (former Attorney General), election security, and all courts including the supreme court. Furthermore, conspiracy theorists would need reason to trust the leading individual promoting the allegations of a conspiracy—notorious for lying and bullshitting.”
The conspiracists’ neglect of increasingly unlikely narratives isn’t subtle. The authors first sought to establish the link between conspiracy theory beliefs and the propensity to make conjunction errors, as well as the failure to consider arguments that disconfirm those beliefs. They then investigated whether conspiracy beliefs could be reduced by an educational module designed to curb conjunction errors and encourage criticism of one’s own beliefs.
Methods
They first presented participants with general details about a popular Mexican energy drink, ¡Arriba!. They then gauged participants’ agreement with hearsay statements such as “Some people are saying that ¡Arriba! contains illegal substances that raise the desire for the product; Some people are saying that about 5 years ago, a man died of cerebral hemorrhage, caused by overly high consumption of ¡Arriba!.”
Participants then received conjunction fallacy training. They were presented with scenarios like that of Linda, whom I mentioned earlier. After their responses, they received feedback and an explanation of why their answer was right or wrong.
Finally, they asked the participants to consider their beliefs about whether the ¡Arriba! company was engaged in corrupt practices. Participants then provided reasons that both confirmed and (importantly) disconfirmed their beliefs.
Results
“Combating flawed reasoning and misinformation is most effective with a mixed approach.”
The results showed that belief in a novel conspiracy theory is associated with both the propensity to commit conjunction errors and the propensity to ignore disconfirming evidence. No shocker there.
Their more interesting finding was that, although conjunction fallacy training improves participants’ statistical reasoning skills, it was not sufficient on its own to reduce novel conspiracy beliefs; neither was the disconfirming inquiry. The greatest effect appeared when the two approaches were combined.
Given the training alone, participants had no prompt to apply their new statistical reasoning skills toward considering more plausible explanations.
Given the disconfirming inquiry alone, participants were left struggling to generate disconfirming thoughts, much less ones they found convincing enough to update their beliefs.
Given both exercises, participants were better able to apply their statistical reasoning to generate disconfirming explanations for each conspiracy theory. The combination also gave participants more opportunities to think critically about the accuracy of their beliefs.
Conclusion
There is a darker interpretation of these results, however. For one, the study showed how easily we “jump to conclusions” from limited evidence. One of the biggest contributors to conspiracy beliefs is incomplete information: humans have an irresistible urge to “fill in the gaps”, leading to all sorts of wacky beliefs.
Another pessimistic takeaway is that no one approach, such as debunking or prebunking, is sufficient to extinguish these wacky beliefs. This study suggests that combating flawed reasoning and misinformation is most effective with a mixed approach. The intervention used in this paper was strong because it allowed people to train, then practice.
It will be interesting to see what reasoning training future studies come up with. There are myriad cognitive fallacies, many of them pertinent to conspiracy beliefs. How might conspiracy beliefs be influenced by in-group/out-group bias training? How can we modify powerful existing strategies, such as prebunking, to combat misinformation?
Identifying effective interventions is still in its early stages, but this study gives us reason to be hopeful that we are headed in the right direction.
Social
Twitter: @RyanBruno7287
Mastodon: https://mstdn.social/@ryanbruno