How COVID Brought Out the Worst in Us
COVID conspiracy theories, misinformation, and polarization.
There are two ways to be fooled. One is to believe what isn’t true; the other is to refuse to believe what is true.
~ Søren Kierkegaard
The COVID pandemic emerged at a time when it was becoming increasingly difficult to agree on shared facts and values. Early on, optimists suggested that the pandemic would be the common enemy that brought us together. Needless to say, this is not the rosy picture we ended up with. Instead, the pandemic catalyzed intergroup resentment on both sides of the political spectrum and fertilized the ground for conspiracy theories that had been struggling for attention. In this piece, I will argue that our unproductive psychological response to COVID was caused mainly by the sustained deprivation of our basic human needs: the need for control, certainty, companionship, and purpose. I will also argue that when these psychological needs are not met, they can curdle into impatience, anxiety, mistrust, and anger.
Lack of Control (Sect. 1 / 7)
Preserving people’s sense of control amid crisis is crucial. The COVID pandemic threw this fundamental human need into jeopardy. We lost our jobs, our proximity to family, our day-to-day routines, and frankly, our minds. For many, these losses led to feelings of despair and hopelessness, and eventually resentment and mistrust (especially toward the mandates and CDC guidelines). It felt like the little autonomy we had was being taken away.
When we lose control over our environments, we look for small ways to regain it. Some of us did this by masking up, #flatteningthecurve, and getting vaccinated. Others did this by refusing to follow public health measures and attending superspreader events. Both approaches seem equally effective at restoring our sense of control. However we respond, it likely has little to do with an honest evaluation of all the available evidence and more to do with what our peers are doing.
As Joshua Greene points out in his book Moral Tribes, refusing to follow public health measures may be entirely rational, given that our goal is to maintain our standing within a group. If a threat is abstract enough to ignore - be it climate change, pandemics, or the threat of nuclear weapons - people will always side with their group over the truth. It is a rather simple mental calculation: being shunned by your group feels a lot worse than ignoring a few inconvenient facts. In the long run, though, this cost-benefit analysis becomes a tragedy of the commons. That is, it is entirely rational for each individual to act in their self-interest, even if it is detrimental to all individuals in the long run.
I think the cognitive algorithm at play here goes something like this: “What can I do to stop a global pandemic? Nothing. What can I do to maintain status within my group? Parrot their views.” To be honest, the latter option sounds more appealing. We like to be able to control the things in our environment, so we avoid what we cannot control and attend to what we can. Maintaining our standing within our groups - one of the most important things to humans - is what we went with, whether we be anti-vax or CDC-compliant.
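The heuristic described above can be caricatured in a few lines of deliberately cartoonish, entirely hypothetical Python - nothing here comes from any study; it just makes the decision rule explicit:

```python
# A cartoonish sketch of the status-preserving heuristic described above.
# The function name, inputs, and outputs are hypothetical illustrations.

def choose_action(can_affect_outcome: bool, group_position: str) -> str:
    """Pick the action that maximizes perceived personal payoff."""
    if not can_affect_outcome:
        # "What can I do to stop a global pandemic? Nothing."
        # The outcome drops out of the calculation entirely, so the
        # only payoff left on the table is group standing:
        # "What can I do to maintain status within my group? Parrot their views."
        return f"parrot: {group_position}"
    # Only when we believe our action matters does evidence get a look-in.
    return "weigh the evidence"

print(choose_action(False, "masks are tyranny"))  # parrot: masks are tyranny
print(choose_action(False, "trust the CDC"))      # parrot: trust the CDC
```

Note that the sketch is symmetric: the same rule fires whether the group in question is anti-vax or CDC-compliant.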
Uncertainty (Sect. 2 / 7)
Uncertain times invite speculation. When COVID hit, there were instantly multiple theories about how it spread, how dangerous it was, to whom it was a danger, and whether or not it leaked from a lab. These theories spread like wildfire, as people were desperately craving answers - answers that the authorities could not provide definitively.
Public health authorities only know the available evidence; conspiracy theorists and fake news outlets claim to know everything. Fake news and conspiracy theories are appealing because they give simple and intuitive ways to interpret otherwise ambiguous information. They are also used as a strategy to find meaning, order, and control in otherwise ambiguous and unstable events. This contributes to why false information spreads faster than the truth - it seems more useful.
Not only is the truth drowned out, but it is actively met with skepticism by anti-establishment figures and their followers. When the authorities said that the virus likely had a zoonotic origin in bats, the anti-establishment immediately looked for pseudo-evidence to the contrary. When scientists looked into the lab leak hypothesis and repeatedly found evidence against it (read this for a comprehensive review), it was viewed as a cover-up of what really happened. This is a common dynamic in the conspirasphere (coining that!) - conflicting evidence often gets folded into the conspiracy theory and even gets used as further evidence for it.
To further support their arguments, conspiracy theorists cherry-pick anecdotes that give their theories an illusion of legitimacy. Take the CDC flipping its position on face masks, for example. At first, the CDC told us to wear face masks. Then we were told that masks increase how often we touch our faces, so we were advised not to wear them. Eventually, the CDC realized that this was a mistake, and back to wearing masks it was. Though this was indeed a big mistake with harmful consequences, this level of uncertainty is to be expected in the event of a novel coronavirus. Conspiracy theorists and the right-wing media did not see it this way, so they attributed to incompetence and malice what could have been attributed to the novelty of the situation.
Too much time online (Sect. 3 / 7)
COVID, at least in the beginning, made us a whole lot more lonely. Many people coped with this social deficiency by spending more time on the internet. In fact, there was a 25% rise in internet activity within days of the lockdown. Much of this increase appeared on social media platforms, such as Facebook, Instagram, Twitter, and of course, TikTok. Though not all social media consumption is corrosive, it can be if our feeds are filled with alternative media, fake news, and Joe Rogan.
The very structure of the internet - where one link leads to another - can keep us consuming unreliable news unimpeded for hours. To be sure, much of this excess media consumption is not paired with a desire for varied sources. This can build misleading or false narratives in the user’s head. Eventually, we become married to our narratives, and we prop them up with bits of unrelated information. With the internet’s endless supply of anecdotes, any claim can be made to seem quite robust. That apparent robustness comes not from rigorous analysis, but from irresistible complexity. Most of us think it is reasonable to believe a story because of its specificity (i.e., “Who is going to make this stuff up?”), but each added specific actually makes a claim less likely to be true. This is the conjunction fallacy in its worst form, encouraging us to adopt ever more extreme and ridiculous beliefs.
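The arithmetic behind the conjunction fallacy is worth spelling out. The probabilities below are made-up numbers purely for illustration; the only real claim is the inequality itself:

```python
# The conjunction rule: P(A and B) can never exceed P(A).
# All numbers here are hypothetical, chosen only to illustrate the point.

p_cover_up = 0.01              # P(A): probability some cover-up occurred
p_detail_given_cover_up = 0.5  # P(B|A): probability of a vivid detail, if A is true

# Each specific detail multiplies in a factor <= 1, so the story as a
# whole can only get LESS probable as it gets more specific:
p_conjunction = p_cover_up * p_detail_given_cover_up  # P(A and B)

assert p_conjunction <= p_cover_up  # specificity shrinks probability
```

The vivid detail makes the story feel more plausible, yet mathematically it can only cut the probability down - here from 1% to 0.5%.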
When we adopt extreme beliefs, we are also more likely to become immersed in radical communities. Without knowing it, our consumption of junk news traps us in echo chambers and filter bubbles. Our echo chambers have become so segregated that the spheres of orthodox science and pseudoscience now inhabit entirely separate corners of the internet. This means that it is unlikely we will encounter disconfirming evidence or even be introduced to a counternarrative. All that we are left with are reverberating voices affirming our own sentiments.
Filter bubbles add to the problem by curating content to increase our engagement. Another term for this effect is the rabbit hole phenomenon, and it can allegedly lead us to more extremist content. I say allegedly because it depends on the person. The innate tendency to seek out conspiracy-laden content is what matters here. In other words, people who believe in weird things were always interested in weird things. The internet just expedited their search.
Echo chambers and filter bubbles are not always filled with nonsense. Hypothetically, we could consume only reliable information and still end up with a skewed view of the world. We would continue to find some news stories more enticing than others. For example, terrible events are far more interesting than stories of progress. I wrote about this bias in a previous newsletter. It is much the same with misinformation and conspiracy theories. It is far more stimulating to read about microchipping vaccines than it is to read about the vaccine’s actual efficacy. Relatedly, unexpected news is far more alluring than expected news. Vaccine-related deaths (unexpected) are far more interesting to us than COVID-related deaths (expected). In other words, even if we started with rigorous and reliable journalism, our bias (i.e., the availability heuristic in particular) could still lead us to become misinformed.
COVID conspiracies and the power of sharing (Sect. 4 / 7)
Conspiracy theories flourished during the pandemic. One notable example is the rise of QAnon, which had been circulating since 2017 but skyrocketed in popularity during the pandemic. To illustrate, in the first week or so of the lockdown, the number of users engaged in QAnon discussions on Facebook and Twitter more than doubled. Between March 23 and 25, QAnon-related conversations soared by 422%. The link between COVID and QAnon is strong, as much of the QAnon-related content included hashtags such as #plandemic and #stopvaccination.
The real power lies in how easy it is to share content and how easy it is for that content to go viral. The masses can share anything at the click of a button without properly vetting the material (e.g., “I'll read the article later, I promise”). Multiple analyses have found that posts containing unreliable information about COVID-19 spread at least as quickly as reliably-sourced information about COVID-19. Indeed, the top 1% of false posts can easily reach 100,000 users, while the truth rarely reaches 1,000. In addition, QAnon and other conspiracies capitalize on negative events, such as celebrities getting COVID, by incorporating these events into their narratives.
To be clear, the problem is not that too many people are involved in creating new conspiracy theories, but that too many people are actively sharing the few currently in circulation. If everyone were interested only in creating their own narratives, none would get much traction. Indeed, a recent analysis found that a majority of social media users share content, rather than create it. This suggests that only a few misinformation entrepreneurs are needed to create a narrative, and the rest of the spread is done through sharing. Most of the QAnon conspiracies, for example, were created by one anonymous account on 4chan.
It doesn’t help that we get rewarded for sharing provocative content through the currency of likes, mentions, shares, and retweets. The content we share allows us to signal to others our interests, our identity tags, and our group membership. Phrases such as “covid is a hoax” and “plandemic” are used to show which side we are on, and needlessly derogatory phrases such as “Wuhan flu” and “China Virus” are used to show whose side we are not on. Sharing group-directed posts can form a feedback loop in which we are incentivized to share content that aligns with our community’s beliefs, and disincentivized to share content that differs from it.
Why the vaccine hesitancy? (Sect. 5 / 7)
Antivax rhetoric has existed for nearly as long as vaccinations themselves. The first vaccine (i.e., the smallpox vaccine) was developed in 1796 and became widely available in the early 1800s. The Anti-Vaccination Society of America followed in 1879.
Today, anti-vaccination support is soaring, with accounts across the main social media platforms amassing a total of 58 million followers. The market for conspiracy-laden antivaccine content has been growing since the development of the COVID vaccine, with the followings of anti-vax content creators growing by a million on both Instagram and Facebook. YouTube - usually the place where such information is platformed - was even worse, growing by 5.8 million since the development of the vaccine. One study found that the most popular anti-vax accounts on Instagram showed a fivefold increase in their number of followers in 2020 alone.
Why the big numbers? The vaccine’s novelty was a major source of concern among the vaccine-hesitant. More specifically, concerns over future side effects are commonly reported. In this case, refusing or putting off the vaccine is a classic example of uncertainty avoidance. This means exactly what it sounds like - avoiding feelings of uncertainty - and is measured as “the extent to which the members of a culture feel threatened by ambiguous or unknown situations.” Uncertainty avoidance is higher in people who are young, less educated, lower-income, previously vaccine-avoidant, and overall less knowledgeable about COVID.
Relatedly, as misinformation increases, vaccine hesitancy increases. For example, we are less likely to get the vaccine if we think the virus will simply die off with the heat. Donald Trump and other Republican leaders led the effort to downplay the severity of the virus and to oppose mandates, vaccines, and other mitigation efforts. Religion, too, affects vaccination rates, as many religious fanatics believe that God will protect them against the virus. God is omnipotent, and science is often wrong, so it goes. This isn’t to put all the blame on religious conservatives. The heterodox-sphere and classical liberals also opposed many of these measures once absolute freedom of choice was revoked. All these factors together have contributed to vaccine hesitancy during, and before, this pandemic. The common theme is misplaced trust in dubious sources and a lack of trust in the experts.
Trust (Sect. 6 / 7)
Trust is paramount in times of crisis. It is the very force that allows humans to cooperate in the first place. We listen to each other because we trust that others are trying to convey valuable information. We form groups and communities because we trust that our group members share common values. We grant power to our governments because we trust that they will serve our best interests.
The ultimate reason COVID was so difficult to recover from and to protect against was that we lost trust in each other and our institutions. This didn't happen everywhere, to be sure. Indeed, countries with higher trust in government experienced a lower spread of COVID and lower mortality rates. In contrast, lower trust in government is associated with conspiratorial thinking and the tendency to view every public health guideline as a step towards totalitarianism.
Governmental trust naturally fluctuates over time, with dips seen most in times of crisis. The general feeling expressed by the public is: "Why would I trust a dysfunctional government?" Disinformation campaigns, political polarization, and the coronacrisis (a bad pun, not a typo) have led us to an all-time low in governmental trust. According to Pew, governmental trust peaked in the mid-1960s (around 80%) and hit its low in 2019 (17%), and the trend appears to be continuing downward.
Trust is the very basis of our politics, and it is usually the first to go in failing societies. Every society will inevitably split into separate ideological camps, and there will always exist a clash of competing interests. These ideological gaps can be bridged, though, by trusting that most people want what is best for society, regardless of their ideology. By working through our competing interests, we are able to find solutions that work for the majority of us, rather than just half of us. Disagreement, therefore, is not something merely to be tolerated; handled in good faith, it is the ideal.
Lastly, knowledge itself is mostly, if not entirely, founded on trust. We build our knowledge in school because we trust our teachers are telling the truth. We trust in science because we trust that the scientists are honestly seeking the truth, and reporting their results accurately. To claim to know anything is to assume that others can share this knowledge if they just test the ideas for themselves.
What the pandemic of misinformation is costing us, then, is our sense of what it means to know. When we lose this, conspiracy theories become just as good a theory as any. After all, it is very hard for the average citizen to go through and analyze all the available evidence for themselves. Therefore, there has to be a way to know that what we are hearing is true and not just an ideological talking point. Traditionally, it was trust that allowed us to do this. Without trust, we are free to live in our bubbles, free to reconceptualize falsities as alternative facts, and free to, quite literally, live in a different world from our enemies.
Fixing the problem (Sect. 7 / 7 - Conclusion)
One of the biggest problems is that our institutions have not come up with ways to dampen the effects of social media or compete with alternative media outlets. Unfortunately, being a legitimate scientific institution requires that it be patiently cautious when implementing new guidelines, swift in correcting errors, and transparent about its (un)certainty in the data. These are practices that the impatient and paranoid might not be willing to put up with. On the other hand, alternative media offers quick answers, doesn't have to own up to any of its mistakes, and is free to engage in motivated reasoning to maximize profits. In this sense, we are seeing asymmetric warfare between reliable information and misinformation. Conspiracy theorists speak with utter conviction, and scientists speak with intellectual humility. To level the playing field, we should learn to place more trust in sources that admit their uncertainty.
The problem is that science doesn’t always live up to the ideals listed above. That is, when politics get involved, the experts don’t always signal their uncertainty, they stall their corrections, and they push out uninformed guidelines. Governments and public health authorities can build trust and restore a sense of control in their people by offering guidelines that are clear and in line with the available evidence. If the evidence is not available, then the authorities should communicate that.
This normalizes the idea that it is okay to be uncertain. If the scientific consensus is always honestly communicated - especially in cases where the evidence is lacking - then it becomes more trustworthy when the authorities say that the “evidence is strong.” Additionally, accusations of incompetence or conspiracy lose some steam when the authorities have already communicated their level of certainty.
Some have blamed public health officials for unwittingly politicizing the pandemic (I think this is a stretch, but who cares what I think). Though mandates have been shown to increase the vaccination rate, they have also been shown to diminish trust, which may lead to decreased vaccination behavior in the future. Simply using the term “mandates” may have triggered paranoia in those who hold laissez-faire sacrosanct. Further, public health authorities decided to uphold the mask mandate for vaccinated individuals, which made the reward for getting the vaccine even more abstract (i.e., "well, at least now you will be more resilient against COVID!"). If our goal is to incentivize people to follow the guidelines, then we ought to reward them when they do so.
This is not to put all the blame on public health officials. I think even an impeccable response would have led to some conspiracism and anti-establishment paranoia. There always have been, and there always will be, anti-vaxxers and conspiracy theorists. Some of us are just too far gone to reach. My point is that the public health messaging didn’t do us any favors, and sometimes it played too easily into the hands of conspiracy theorists and the right-wing media.
As stated in my previous post, we take on conspiratorial beliefs as identity tags. In the case of vaccination, Republicans may feel as though they shouldn't get vaccinated because no one in their group is getting vaccinated. Simply pointing out that 60% of US Republicans have gotten at least one dose of the vaccine might weaken the effects of this group bias.
Our beliefs about COVID are not primarily driven by motivated reasoning, but by cognitive laziness (i.e., a lack of reasoning). Therefore, encouraging reason and critical thinking may reduce rates of COVID-related conspiracy beliefs. Simply encouraging people to evaluate the credibility of a conspiracy theory reduces their belief in it immediately afterward.
To that end, I believe a proactive approach is the best option. One effective method of combatting conspiratorial beliefs is prebunking, wherein we provide counterarguments before people are exposed to misinformation. This has been met with more success than trying to debunk conspiratorial beliefs after they take hold, which can produce a backfire effect wherein we strengthen our belief in the conspiracy theory. Similarly, we can beat conspiracy theorists at their own game by exposing their manipulative persuasion tactics. I’ll list a few below:
establishing in-group/out-group identities;
adopting the talking style of scientists to push pseudoscience;
offering too-good-to-be-true solutions;
relating to their followers' feelings of uncertainty and impotence;
grifting.
Admittedly, prebunking can be hard to implement because, as they say, a lie spreads around the entire world before the truth has a chance to get its boots on. It is hard to prebunk conspiracy theories until they become known, at which point they already have a certain amount of support behind them.
Whichever strategy we use, we must address the basic human needs of everyone, especially in times of crisis. A society cannot function properly otherwise. There will always be those who we cannot reach, but fence-sitters will always outnumber them. The best strategy is to target these folks, whether it be by prebunking, restoring their sense of control, communicating openly, or restoring their trust. Though public health professionals are uniquely positioned to address these fence-sitters, it is everyone’s responsibility to protect those in their community, and to speak up against obvious falsities with compassion and understanding.
COVID was kind of a fail, but we will get the next one. I believe in us.
Twitter: @RyanBruno7287
References:
Cinelli, M., Quattrociocchi, W., Galeazzi, A., Valensise, C. M., Brugnoli, E., Schmidt, A. L., ... & Scala, A. (2020). The COVID-19 social media infodemic. Scientific reports, 10(1), 1-10.
Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., ... & Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554-559.
Douglas, K. M. (2021). COVID-19 conspiracy theories. Group Processes & Intergroup Relations, 24(2), 270–275. https://doi.org/10.1177/1368430220982068
Dow, B. J., Johnson, A. L., Wang, C. S., Whitson, J., & Menon, T. (2021). The COVID‐19 pandemic and the search for structure: Social media and conspiracy theories. Social and Personality Psychology Compass, 15(9). https://doi.org/10.1111/spc3.12636
Dow, B. J., Wang, C. S., Whitson, J. A., & Deng, Y. (2022). Mitigating and managing COVID-19 conspiratorial beliefs. BMJ Leader, leader-2022-000600. https://doi.org/10.1136/leader-2022-000600
Fisher, K. A., Bloomstone, S. J., Walder, J., Crawford, S., Fouayzi, H., & Mazor, K. M. (2020). Attitudes toward a potential SARS-CoV-2 vaccine: a survey of US adults. Annals of internal medicine, 173(12), 964-973.
Frenkel, S., Decker, B., & Alba, D. (2020). How the ‘Plandemic’ movie and its falsehoods spread widely online. The New York Times, 20.
Garry, J., Ford, R., & Johns, R. (2020). Coronavirus conspiracy beliefs, mistrust, and compliance: taking measurement seriously. Psychological medicine, 1-11.
Greene, J. (2014). Moral tribes: Emotion, reason, and the gap between us and them. Penguin.
Hofstede, G., Hofstede, G. J., & Minkov, M. (2005). Cultures and organizations: Software of the mind (Vol. 2). New York: Mcgraw-hill.
Hornsey, M. J., Chapman, C. M., Alvarez, B., Bentley, S., Salvador Casara, B. G., Crimston, C. R., Ionescu, O., Krug, H., Preya Selvanathan, H., Steffens, N. K., & Jetten, J. (2021). To what extent are conspiracy theorists concerned for self versus others? A COVID‐19 test case. European Journal of Social Psychology, 51(2), 285–293. https://doi.org/10.1002/ejsp.2737
Jolley, D., & Douglas, K. M. (2017). Prevention is better than cure: Addressing anti‐vaccine conspiracy theories. Journal of Applied Social Psychology, 47(8), 459-469.
Hudecek, M., Fischer, P., Gaube, S., & Lermer, E. (2022). Who Thinks COVID-19 is a Hoax? Psychological Correlates of Beliefs in Conspiracy Theories and Attitudes Towards Anti-Coronavirus Measures at the End of the First Lockdown in Germany. Journal of Open Psychology Data, 10, 9. https://doi.org/10.5334/jopd.64
Karafillakis, E., & Larson, H. J. (2017). The benefit of the doubt or doubts over benefits? A systematic literature review of perceived risks of vaccines in European populations. Vaccine, 35(37), 4840-4850.
Kużelewska, E., & Tomaszuk, M. (2022). Rise of Conspiracy Theories in the Pandemic Times. International Journal for the Semiotics of Law - Revue Internationale de Sémiotique Juridique. https://doi.org/10.1007/s11196-022-09910-9
Landau, M. J., Kay, A. C., & Whitson, J. A. (2015). Compensatory control and the appeal of a structured world. Psychological bulletin, 141(3), 694.
Loomba, S., de Figueiredo, A., Piatek, S. J., de Graaf, K., & Larson, H. J. (2021). Measuring the impact of COVID-19 vaccine misinformation on vaccination intent in the UK and USA. Nature human behaviour, 5(3), 337-348.
Lu, J. G. (2022). Two large-scale global studies on COVID-19 vaccine hesitancy over time: Culture, uncertainty avoidance, and vaccine side-effect concerns. Journal of Personality and Social Psychology. https://doi.org/10.1037/pspa0000320
Maguire, A., Persson, E., Västfjäll, D., & Tinghög, G. (2022). COVID-19 and Politically Motivated Reasoning. Medical Decision Making, 0272989X2211180. https://doi.org/10.1177/0272989X221118078
Ngai, C. S. B., Singh, R. G., & Yao, L. (2022). Impact of COVID-19 Vaccine Misinformation on Social Media Virality: Content Analysis of Message Themes and Writing Strategies. Journal of Medical Internet Research, 24(7), e37806. https://doi.org/10.2196/37806
Ohme, J. (2021). Algorithmic social media use and its relationship to attitude reinforcement and issue-specific political participation–The case of the 2015 European immigration movements. Journal of Information Technology & Politics, 18(1), 36-54.
Omidvar Tehrani, S., & Perkins, D. D. (2022). Community Health Resources, Globalization, Trust in Science, and Voting as Predictors of COVID-19 Vaccination Rates: A Global Study with Implications for Vaccine Adherence. Vaccines, 10(8), 1343. https://doi.org/10.3390/vaccines10081343
Paul, E., Steptoe, A., & Fancourt, D. (2021). Attitudes towards vaccines and intention to vaccinate against COVID-19: Implications for public health communications. The Lancet Regional Health-Europe, 1, 100012.
Pires, C. (2022). Global Predictors of COVID-19 Vaccine Hesitancy: A Systematic Review. Vaccines, 10(8), 1349. https://doi.org/10.3390/vaccines10081349
Robertson, E., Reeve, K. S., Niedzwiedz, C. L., Moore, J., Blake, M., Green, M., ... & Benzeval, M. J. (2021). Predictors of COVID-19 vaccine hesitancy in the UK household longitudinal study. Brain, behavior, and immunity, 94, 41-50.
Roozenbeek, J., & Van der Linden, S. (2019). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5(1), 1-10.
Schmidtke, K. A., Kudrna, L., Noufaily, A., Stallard, N., Skrybant, M., Russell, S., & Clarke, A. (2022). Evaluating the relationship between moral values and vaccine hesitancy in Great Britain during the COVID-19 pandemic: A cross-sectional survey. Social Science & Medicine, 308, 115218. https://doi.org/10.1016/j.socscimed.2022.115218
Schneider, C. R., Freeman, A. L. J., & Spiegelhalter, D. (2022). The effects of communicating scientific uncertainty on trust and decision making in a public health context. Judgment and Decision Making, 17(4), 34.
Schwarzinger, M., Watson, V., Arwidson, P., Alla, F., & Luchini, S. (2021). COVID-19 vaccine hesitancy in a representative working-age population in France: a survey experiment based on vaccine characteristics. The Lancet Public Health, 6(4), e210-e221.
Simione, L., Vagni, M., Gnagnarella, C., Bersani, G., & Pajardi, D. (2021). Mistrust and Beliefs in Conspiracy Theories Differently Mediate the Effects of Psychological Factors on Propensity for COVID-19 Vaccine. Frontiers in Psychology, 12, 683684. https://doi.org/10.3389/fpsyg.2021.683684
Ståhl, T., & Van Prooijen, J. W. (2018). Epistemic rationality: Skepticism toward unfounded beliefs requires sufficient cognitive ability and motivation to be rational. Personality and Individual Differences, 122, 155-163.
Swami, V., Barron, D., Weis, L., Voracek, M., Stieger, S., & Furnham, A. (2017). An examination of the factorial and convergent validity of four measures of conspiracist ideation, with recommendations for researchers. PloS one, 12(2), e0172617.
Viskupič, F., & Wiltse, D. L. (2022). Political Partisanship and Trust in Government Predict Popular Support for COVID-19 Vaccine Mandates for Various Professions and Demographic Groups: A Research Note. American Politics Research, 1532673X2211188. https://doi.org/10.1177/1532673X221118888
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151.
Weigmann, K. (2018). The genesis of a conspiracy theory: Why do people believe in scientific conspiracy theories and how do they spread?. EMBO reports, 19(4), e45935.