Conspiracy Theories and Partisans
Every opinion is a marriage of information and predisposition.
~ John R. Zaller, 1992
As stated in my previous post, conspiratorial ideation (CI) is mainly a practice of motivated reasoning. Though I focused narrowly on motivators related to self-interest, CI can arise from group-related bias as well. That is, we don’t just conjure conspiracy theories for our own benefit, but for the benefit of our group. In moderation, group bias is fairly anodyne (e.g., sports banter). At its extreme, it leads to profound polarization, echo chambers, and sometimes, conspiracy theories (CTs).
An Easy Example: Why Don’t We See QAnon Signs at Biden Rallies?
One way of answering this is to consider who QAnon CTs might attract (not a trick question): people who really hate Democrats. For those unfamiliar, QAnon claims that the Democratic Party is full of Satan-worshipping, cannibalistic pedophiles. It is hard to draft a conspiracy theory with a partisan bias this clear and hyperbolic. It is not difficult to see that QAnon was never based on a scrupulous interpretation of all the available evidence, but on deep partisanship and a morbid disgust for Democrats. This last point is important, as it is the truest part of QAnon. Though its content is preposterous, it feels real to its believers. It feels real because it is the only description of the Democratic Party extreme enough to match how horrible they believe it to be.
QAnon selects only for the most devoted right-wing extremists by:
providing its adherents a label to organize themselves around;
providing a difficult-to-believe narrative that acts both as a symbol of one’s group loyalty and as a tool for excluding those who are less devoted;
providing a narrative that accurately characterizes its adherents’ hatred for the democratic party;
neatly splitting the world into forces of good and evil. We like to perceive our own side as virtuous and ethical, and our opposing side as immoral and fraudulent.
QAnon may seem like an extreme example, and it is, but it also magnifies the thinking errors we all make regularly. Stated plainly, our general political worldview influences the more specific beliefs we hold, which can lead to biased reasoning and thus to false conclusions. As social creatures, we absorb our worldview from the environment we find ourselves in, or from the environment we would like to see ourselves in. This is why our policy positions are based more on what our side deems morally correct (or on the ideology of the politician who proposed the policy) than on the policy’s actual content. This group bias, or group-symbol bias, is what motivates people to devise CTs accusing their political opponents of malfeasance.
My-Side Bias
We don’t endorse CTs because of their plausibility, but because they confirm or exaggerate beliefs and attitudes we already hold. A term for this tendency is the ‘my-side bias’, which is often blamed for rising levels of political division. I am, of course, referring to Keith Stanovich’s book The Bias That Divides Us, where he argues that we don’t live in a post-truth society per se. That would require that people generally don’t give a damn about the truth. Some may not, but it seems more accurate to say that we live in a “my-side society” where everyone seeks out the information that confirms their priors. The difference between a “post-truth society” and a “my-side society” is that the latter still values evidence; it just fails to weigh it properly.
The my-side bias makes it difficult to hear out the other side of an argument, and makes it all too tempting to accept confirmatory evidence. For instance, Republicans are far more likely than Democrats to believe CTs about Democratic politicians, and vice versa. Similarly, 9/11 truther conspiracies are held chiefly by Democrats (Bush, a Republican, was president at the time). It is odd that these conspiracy theories seem so plausible to their adherents. To believe that 9/11 was an inside job, one must ignore a mountain of evidence and remain convinced by nothing but speculation. The CT becomes even less likely when we consider the U.S. government’s notorious tendency to spill the beans.
The Perils of Motivated Reasoning
There is some evidence that increased reasoning cannot save us from group bias. Some studies have found that, for partisans, more reasoning just leads to more bias. This seems paradoxical. Shouldn’t reasoning make us more reasonable? It does, in a way, but mostly in the service of finding reasons that jibe with our prior leanings.
Interestingly, the more capable a reasoner you are, the more competent you are at post hoc reasoning, i.e., finding reasons to justify a conclusion you have already reached. Pennycook et al. found that those more inclined to engage in reasoning are also more likely to be politically polarized. It seems that David Hume was right when he said that reason is a “slave of the passions”.
Conclusion
So, conspiracy theories are for the gullible and for the intelligent? We may as well pack our bags and let humanity destroy itself. Well, not so fast. One benefit of increased polarization (perhaps the only benefit?) is that it makes it less likely that an entire society will believe any given CT, since CTs largely lean to one extreme or the other. There will always be skepticism in the world (a good thing), even if each individual applies it selectively (something to work on). Maybe this is how societies should operate. Imagine if an entire society were to engage in groupthink, with no one to challenge its views. Societies that operate this way tend to stagnate, become progressophobic, and leave the very concept of truth susceptible to manipulation, George Orwell’s worst nightmare.

I am not naïve enough to claim that our current conspiracy-laden and polarized political discourse is just a liberal society functioning properly. Our current situation is bad. CTs are becoming more widespread (especially in the age of COVID) and are even endorsed by U.S. representatives (e.g., Marjorie Taylor Greene). That said, extreme communities will always exist in a society that prizes ideological diversity. Our main challenge is to keep these communities marginal and to make the truth easier to find for those who seek it.
Twitter: @RyanBruno7287
References
Carlin, R. E., & Love, G. J. (2018). Political competition, partisanship and interpersonal trust in electoral democracies. British Journal of Political Science, 48(1), 115-139.
Cohen, G. L. (2003). Party over policy: The dominating impact of group influence on political beliefs. Journal of Personality and Social Psychology, 85(5), 808.
Einstein, K. L., & Glick, D. M. (2015). Do I think BLS data are BS? The consequences of conspiracy theories. Political Behavior, 37(3), 679-701.
Imhoff, R., Zimmer, F., Klein, O., António, J. H., Babinska, M., Bangerter, A., ... & Van Prooijen, J. W. (2022). Conspiracy mentality and political orientation across 26 countries. Nature Human Behaviour, 6(3), 392-403.
Miller, J. M., Saunders, K. L., & Farhart, C. E. (2016). Conspiracy endorsement as motivated reasoning: The moderating roles of political knowledge and trust. American Journal of Political Science, 60(4), 824-844.
Pennycook, G., Fugelsang, J. A., & Koehler, D. J. (2015). Everyday consequences of analytic thinking. Current Directions in Psychological Science, 24(6), 425-432.
Pennycook, G., Cheyne, J. A., Koehler, D. J., & Fugelsang, J. A. (2020). On the belief that beliefs should change according to evidence: Implications for conspiratorial, moral, paranormal, political, religious, and science beliefs. Judgment & Decision Making, 15(4).
Stanovich, K. E. (2021). The bias that divides us: The science and politics of myside thinking. MIT Press.
Uscinski, J. E., Klofstad, C., & Atkinson, M. D. (2016). What drives conspiratorial beliefs? The role of informational cues and predispositions. Political Research Quarterly, 69(1), 57-71.