Making Sense of Conspiracy Theories
With Brexit, the US presidential election and the Covid pandemic, conspiracy theories now seem to be everywhere. It’s commonly argued that the internet has fuelled their popularity, leading to a loss of faith in mainstream media, science, democracy and even truth itself. But what if the rise of conspiracy theories is a symptom rather than the cause of a collapse of trust in civic institutions?
Professor Peter Knight
14 November 2024
Introduction
It seems that conspiracy theories are everywhere now, especially in the online world. The perpetrators of several recent mass shootings, for example, have cited the Great Replacement, the conspiracy theory that the government is deliberately increasing immigration to undermine white, Christian identity. Then there is QAnon, the speculation that there is a vast, Satanic cabal of paedophiles among the political and Hollywood elite. Many people claimed that the Covid pandemic had been secretly planned in advance to bring about population control or to implant microchips via the vaccine. Others are convinced that the World Economic Forum’s proposal for a Great Reset is actually a plan to implement totalitarian control of the world’s population through central bank digital currencies and digital IDs. And then, of course, there’s the fact that the majority of Trump supporters in the US are convinced that the 2020 presidential election was rigged, leading some of them to storm the Capitol—and they feel the same way about this year’s election (this text was prepared the day before the election).
There’s no denying that conspiracy theories are a very visible and troubling presence in many countries around the world today. Most commentators start from the assumption that the internet has created an unprecedented explosion of conspiracy theories and related forms of misinformation, disinformation, and fake news. The claim is that conspiracy theories threaten to undermine trust in impartial media, objective science and even democracy itself. However, I think there are many mistaken assumptions at work in much discussion of conspiracy theories. In this lecture I’m going to look at six common misperceptions. Correcting these misperceptions matters because, if we fail to understand how and why conspiracy theories appeal to so many people today, any efforts to combat them are likely to fail. In short—and in danger of sounding like a conspiracy theorist myself—I’m going to try and convince you that everything you thought you knew about conspiracy theories is a lie.
Before I begin, two quick clarifications. First, most of the examples I’m going to consider are from the US. That’s not because Americans are uniquely prone to believing in conspiracy theories. Far from it. Conspiracy theories are common in many Western democratic countries, including the UK; and it’s arguable that they are even more consequential in many authoritarian regimes around the world today (Walter and Drochon 2020). The reason for focusing on the US is simply because I’m a professor of American Studies, and this is the context I know best. Second, my focus is specifically on conspiracy theories, but much of the recent discussion is concerned more broadly with misinformation (false or misleading information that is spread unwittingly) and disinformation (false or misleading information that is spread deliberately by those who know it to be false). However, in most cases the examples of mis- and disinformation tend to be conspiracy theories, and often the kind of dramatic examples I outlined just now.
Myth #1
“This is the golden age of conspiracy theories. The internet has caused an unprecedented explosion in the creation, circulation and consumption of conspiracy thinking.”
In January 2024 the World Economic Forum (WEF) placed misinformation and disinformation at the top of its ranking of global risks over the next two years, suggesting that they pose more of a risk to global business than the threat of another pandemic, the expansion of the war in Ukraine, or the climate emergency. In their defence, the WEF would argue that each of these other crises will be made much worse by the lies and conspiracy theories surrounding them. Then again, a cynic might suggest that the reason the WEF treats misinformation as the number one threat is that the Great Reset—their much-vaunted blueprint for a post-pandemic rethinking of global capitalism—spectacularly misfired, becoming the focus of a great deal of current conspiracy talk.
But the WEF is not alone in claiming that our current moment is the Golden Age of conspiracy theories. Many commentators are convinced that there has been an explosion in misinformation and conspiracy theories in recent years, beginning with Brexit and Trump in 2016. But there are good reasons to take these claims with a pinch of salt. First, conspiracy theories have a long history. They can be found in ancient Greece and Rome, and a recognisably modern form of conspiracy thinking dates to the French Revolution, if not before (Butter and Knight 2020). Before the middle of the twentieth century, understanding historical events through the lens of a conspiracy theory was not only widespread but was a mark of sophisticated political analysis (Wood 1982). In the eighteenth and nineteenth centuries, many influential mainstream politicians in the US openly embraced conspiracy theory interpretations (Butter 2014). It was only in the middle of the twentieth century that conspiracy theories came to be stigmatised as a suspect form of knowledge (Thalmann 2019). Indeed, the very term “conspiracy theory” is a recent coinage (it was only added to the OED in the late 1990s). The idea of conspiracy theory as a mistaken and dangerous way of understanding the world was taken up by a number of American and European historians, sociologists and psychologists working in the 1950s and 1960s. They were trying to explain the rise of the mass political movements that had swept through Europe in the 1930s, and they were concerned that the “paranoid style” of politics might make a return in the postwar period (Hofstadter 1964). The term “conspiracy theory” was therefore, from the outset, always more than a neutral label. It was meant to be pejorative, designating a pathological and dangerous tendency in politics that needed to be contained. Our current moment of conspiracy-infused, right-wing populism might present an alarming threat to democratic institutions, but it’s not unprecedented—we’ve been here before.
Second, current levels of belief in conspiracy theories are not necessarily higher than in previous historical moments. During the pandemic, for example, as many as 30% of people in the US came to believe that it had all been planned in advance (Birchall and Knight 2022a). But in the 1990s, three quarters of Americans thought that there was a conspiracy involved in the assassination of President Kennedy (P. Knight 2007). Various detailed studies have failed to find evidence of any significant increase over time in levels of popular belief in conspiracy theories (Uscinski et al. 2022). Moreover, accurately measuring popular belief in conspiracy theories is not straightforward. This is in part because some surveys focus on specific theories whilst others focus on a more general disposition, and they are not always using the same scales over time. Compounding the problem is the suspicion on the part of some respondents that such polls are themselves part of a wider plot by experts and elites to ridicule some groups of voters by showing their tendency towards “irrational” beliefs.
Even if it turned out that there are more conspiracy theories in circulation and reported levels of belief in them are indeed higher, we still need to consider the connection with changes of behaviour, whether that is voting, vaccination or violence. Recent forensic research into social media use in the 2016 US election, for example, found that exposure to election misinformation and conspiracy theories online didn’t significantly change people’s voting behaviour (Guess et al. 2023). Of course, with many US elections so tight, it doesn’t take much to tip the balance one way or the other.
There has clearly been an increase in the availability of conspiracy narratives, and in the speed at which they circulate, especially in the online realm. However, it’s still not clear that conspiracy theories are in fact more pervasive or more persuasive than they have been in previous historical moments. What we can say is that they are more visible, and there is more concern about them (Butter 2020). They are also no longer stigmatised in the same way they were until recently—politicians like Trump are only too willing to weaponise them.
The circulation of lies, propaganda and conspiracy theories in the online world can indeed create enormous harm. But they’ve always been an issue. The internet hasn’t created the problem, but it has intensified it (Birchall and Knight 2022b).
Myth #2
“The recommendation algorithms of social media platforms are responsible for pushing people down the rabbit hole and making societies dangerously polarised. If social media companies can be forced to adjust their algorithms, we can fix the problem.”
Several important recent studies have documented the role that social media algorithms have played in spreading conspiracy theories and stoking polarised violence. With their apocalyptic urgency, scapegoating and sensationalism, conspiracy theories provide a perfect vehicle for stoking moral outrage. The claim is that the recommendation algorithms of social media platforms have hot-housed extremism and polarisation. Moral outrage increases engagement, and increased engagement grows advertising revenue.
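To make that claimed mechanism concrete, here is a deliberately simplified sketch, in Python, of an engagement-driven feed ranker. It is a toy illustration of the logic critics describe, not any platform’s actual code: the Post fields, the weights and the function names are all invented for the purpose.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float    # model-estimated probability of a click
    predicted_shares: float    # model-estimated probability of a share
    predicted_comments: float  # model-estimated probability of a comment

def engagement_score(post: Post) -> float:
    # Hypothetical weights tuned for time-on-site: shares and comments
    # (where moral outrage registers most strongly) count far more than clicks.
    return (1.0 * post.predicted_clicks
            + 5.0 * post.predicted_shares
            + 3.0 * post.predicted_comments)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Nothing here checks accuracy or harm: whatever provokes the
    # strongest reactions rises to the top by construction.
    return sorted(posts, key=engagement_score, reverse=True)

The point of the sketch is that no one needs to write “promote conspiracy theories” into the code; optimising purely for predicted engagement is enough to favour outrage-provoking content.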
Investigative journalism and whistle-blower reports have shown, for example, how Facebook’s newsfeed fuelled ethnic violence in Myanmar in 2013 and Sri Lanka in 2018 (Fisher 2022). Other studies have focused on the role of social media algorithms in the proliferation of misogyny, racism and extremist violence closer to home. The claim is that the recommendation engines and gamified engagement tools were designed to bring people together, but increasingly they work to nudge people down the “rabbit hole” to ever more extreme content (Roose 2019). There is indeed some truth to this idea: researchers found that if people watch an alternative health video on YouTube out of curiosity and follow the algorithm’s suggestion for what to watch next, within a few iterations they are being fed anti-vaxx conspiracy documentaries or neo-Nazi propaganda (Munn 2019; Ribeiro et al. 2019). However, in the wake of the Christchurch and other mass shootings around 2019, social media platforms responded to public criticism about the power of their algorithms. They tweaked their algorithms to stop promoting conspiracist and other extremist content in such an obvious fashion. Whilst in the most severe cases, such as QAnon and Covid, they engaged in mass deplatforming, for the most part they demoted and demonetised offending content, making it less likely to be encountered by unwitting viewers (Faddoul, Chaslot, and Farid 2020). Of course, in the last year or so we have seen how Elon Musk’s takeover of Twitter and Mark Zuckerberg’s reversal of some of Facebook’s content moderation policies have produced a resurgence in online hate speech and conspiracy theories.
However, it’s not necessarily the case that the recommendation engines are the most important factor in pushing people towards conspiracism and extremism. When researchers investigate how actual users come to particular extremist content, it is now rarely via the platform’s own search engine or its recommendation algorithm (Chen et al. 2023). Increasingly the evidence suggests that people come to a particular YouTube video from links included on, say, a Telegram group they already belong to. In effect, personal recommendations now seem to account for the pathways to extremist content more than the platform’s algorithmic recommendation. Even when potentially harmful content is deplatformed or demoted, people still seek it out (a lot of dubious content is still there on the mainstream platforms, and what has been deplatformed often reappears on alt-tech sites such as BitChute and Rumble).
Many data studies of social media have shown that misinformation travels in the online realm faster and further than accurate scientific information. However, most online users actually come across relatively little outright fabricated misinformation (which is usually measured by looking at an agreed list of low-quality news websites). One study, for example, found that “fake news comprises only 0.15% of Americans’ daily media diet” (Allen et al. 2020). Instead, a small minority are avid consumers of misinformation and conspiracy theories (Altay, Berriche, and Acerbi 2023; Budak et al. 2024). In most cases they are actively seeking it out because it fits with their existing world view—in particular, they already distrust mainstream media and politicians. For the most part, online misinformation and conspiracy theories preach to the choir; people are not necessarily being pushed down the rabbit hole by the algorithm. In short, we need to think about what’s driving the demand for conspiracy theories, rather than focusing solely on the supply pipeline or viewing online consumers of misinformation as brainwashed. In a recent book profiling half a dozen people who fell heavily for QAnon conspiracy theories, for example, many of the stories begin not with obsessive online research pushed by recommendation algorithms but with an emotional or medical tragedy that derailed their lives (Cook 2024). In addition, there has been too much focus on the mechanisms of online amplification, and not enough on the role that mainstream politicians, media outlets and conspiracy grifters play in spreading conspiracy speculations and other misleading stories.
If it did turn out that the recommendation algorithms are the key driver of conspiracy-fuelled violence and partisan polarisation, then the obvious solution would be to force platforms to change them or switch them off. This position is appealing, because it holds out the hope that the problem is comparatively simple to fix. Certainly, platforms need to be forced to be more accountable for the social harms they cause. But focusing solely on the algorithms assumes that the platforms would want to fix the problem. Testimony from whistle-blowers indicates that the platforms have long been aware of problems caused by their recommendation algorithms but have been reluctant to do much about it (Brill 2024). Provoking ever greater waves of moral outrage is not merely an unfortunate side effect but is hard-wired into the platforms’ recommendation algorithms. In short, amplifying conspiracy theories and other polarizing content is a feature not a bug of their business model.
Myth #3
“There is a clear psychological and demographic profile of a conspiracy theorist. They are wired differently to the rest of us.”
In the last decade there has been a great deal of fascinating empirical research by social psychologists and political scientists into who believes in conspiracy theories and why (Butter and Knight 2020). Although the usual picture of a conspiracy theorist is a loner and a loser, a white man feverishly tapping away at his keyboard in his parents’ basement, the reality is more complicated. Research shows that there are few significant differences in terms of gender, race, age, or political ideology when it comes to conspiracy thinking. The only demographic factor that does stand out is level of education (and, relatedly, socio-economic status): the less educated people are, the more likely they are to believe in conspiracy theories. However, there is considerable variation within these headline findings.
Psychologists have developed an ever more detailed picture of “the conspiratorial mind” (Bowes, Costello, and Tasimi 2023). They have found correlations between conspiracy thinking and personality traits such as paranoia, narcissism, suspiciousness, and cynicism, as well as feelings of alienation, uncertainty, powerlessness, anxiety and loss of control. They have studied how conspiracy theories speak to a need for closure, a need to feel superior, a need to feel safe and a need for community. Other psychologists have investigated the flawed habits of thinking involved in conspiracy theorising, such as “confirmation bias” (favouring evidence that confirms your prior beliefs) and the “conjunction fallacy” (a faulty estimation of probability). Some have even suggested that conspiracy theorising is hard-wired into our brains, part of an evolutionary adaptation—it’s better to connect the dots and spot a tiger where there isn’t one than to fail to spot an actual tiger (van Prooijen and van Vugt 2018).
The focus of much of this research (in the title of a popular book) is on “why people believe weird things” (Shermer 1997). However, focusing on why people believe weird things is the wrong question to ask. First, believing in conspiracy theories is widespread and quite normal: surveys have repeatedly shown that at least half of people in the UK and the US believe in at least one conspiracy theory. In fact, researchers are convinced that if they can add enough examples to their questionnaires, then it will turn out that pretty much everyone is a believer (Uscinski et al. 2022). I’m sceptical about the value of these surveys because they are often presented in an alarmist fashion, and they fail to capture the complexity of what it means to believe or not believe in something. However, they do point to something significant: all of us share, to a greater or lesser extent, the habits of thought that underpin conspiracy theorising. Favouring interpretations that fit cherry-picked facts is quite common, not just in politics but in all walks of life—you might well suspect that I’m doing it in this lecture. Using narratives and explanatory frameworks to selectively organise information is something we all do, and so it becomes virtually impossible to clearly demarcate misinformation and conspiracy theories as categories that can be easily identified, measured, and corrected.
Second, much of this research takes a universalising approach in its search for the common psychological drivers and demographic factors of conspiracism. But conspiracy theories can operate in unexpected ways in different cultures and historical moments (Butter and Knight 2016). Focusing on issues of individual personality difference means that we fail to consider the broader patterns; for example, the more unequal, corrupt and polarised a society, the higher the levels of belief in conspiracy theories (Hornsey and Pearson 2022).
Third, in many ways the psychology of individuals and their beliefs are irrelevant. Instead of wondering why people believe weird things, we need to focus more on the effects of those beliefs. At the end of the day, it doesn’t matter, for example, whether Trump really believes all the implausible conspiracy theories he spouts. The point is that conspiracy narratives are useful to Trump’s political project, firing up supporters by dividing the world into Us and Them. In short, conspiracy theories are increasingly being weaponised to advance nationalism and populism in many countries (Bergmann 2024). But it’s not merely a top-down phenomenon: conspiracy theories and disinformation are increasingly participatory and circular. On fringe parts of the internet, supporters float conspiracy speculations, which are picked up and amplified by political leaders, partisan media, opportunistic grifters and foreign agencies. These narratives are then in turn endorsed by the grassroots supporters, who find “confirmation” of their original speculations in these seemingly more legitimate sources (S. Knight, Birchall, and Knight 2024).
Myth #4
“Conspiracy theories are the result of a lack of accurate information. If we can get the correct facts out there, people will change their minds.”
A common misconception is that belief in conspiracy theories is the result of a lack of accurate information or the circulation of mistaken information, whether accidental or deliberate. The assumption is that no one in their right mind would believe in such bizarre claims unless they were the victims of a concerted campaign of deception and manipulation. The idea is that people are fundamentally rational, and that they will adjust their beliefs when new evidence comes to light. And the implication is that if only we can transmit the correct information, then the mistaken belief will disappear.
But the reality in many cases is that misinformation doesn’t turn people into conspiracy theorists. Instead, conspiracy theories often provide people with ready-made narrative justifications for identity positions they have already assumed (van Prooijen and Böhm 2023). Although the usual picture of a conspiracy theorist is a loner, the process of developing conspiracy interpretations in online communities can give people a sense of community, purpose and belonging (Grusauskaite et al. 2023). Conspiracy theories need to be understood as collective, sensemaking narratives that help bolster worldviews, rather than as pieces of misleading information that alter individual beliefs.
Often the focus is on why people who believe in such theories have such low levels of trust. Instead, we should be asking why so many key institutions seem so untrustworthy, especially to those who feel marginalised or ignored in society. Even if the literal claims are mistaken or exaggerated, often the underlying sense of resentment is based on reality. Conspiracy theories can function as a public performance of an aggrieved sense of victimhood, stoking a sense of moral outrage, often coupled with creating scapegoats to blame for the current situation. This can happen across the political spectrum. Many Covid-19 conspiracy theories, for example, tapped into histories of medical exploitation, particularly for minority communities. It’s also clear that many theories resonated with a sense that the US healthcare system, particularly the relationship between big pharma and health insurance, does not always operate in ways that put patient interests first. That’s not to say that the conspiracy theories are “correct,” merely that they are not completely unconnected with reality. Most people don’t fall for conspiracy theories because they are especially gullible or easily brainwashed. Instead, there’s a mixture of motivated reasoning and tribal loyalty. In surveys, most people agree that online misinformation is a grave threat to democracy. But research also shows that most people think misinformation is a threat to society because other people are gullible, never themselves (Altay, Berriche, and Acerbi 2023). All of us are susceptible to confirmation bias to some extent: we believe what we want to believe, and we find evidence that fits our existing intuitions. It’s often hard to change people’s worldview because it is tied up with their fundamental sense of identity. To challenge someone’s belief in particular facts can amount to challenging their sense of self.
Fact-checking is a necessary, thankless, and heroic task, but on its own it is unlikely to be sufficient. As I’ll explain in a moment, researchers have begun to move away from relying on debunking as a counter-disinformation strategy and instead have begun to propose new solutions such as media literacy education and “prebunking.” However, a high-profile study released this September (Costello, Pennycook, and Rand 2024) seems to suggest that current thinking on how to tackle conspiracy beliefs is wrong. Most other studies have found that fact-checking corrections have only limited success at changing people’s minds, and the effect of the intervention tends to wear off quite quickly. But in this new experiment the researchers used an AI chatbot to politely argue against volunteers who said they believed in particular conspiracy theories. Surprisingly, the participants’ self-reported beliefs were reduced by 20% on average, a larger effect than most other studies in this field have found; even more striking, the effect seemed to persist over time. Usually debunking has little effect on the committed believer. I’ll skip over the objection that a 20% reduction is perhaps not as dramatic as it might seem: if I was 100% sure that the moon landings were faked, and now I’m 80% sure, I still think the moon landings were faked. Instead, the more important point is that the DebunkBot study suffers from the same weakness as much counter-disinformation work. You might succeed in debunking a particular mistaken piece of information, but it’s much harder to change the underlying framing narrative. People often believe something not because of the particular misinformation they’ve heard. Rather, the stories they hear provide confirmation for what they’ve long suspected, even if it turns out that this particular factoid has been debunked.
For example, J.D. Vance’s recent remarks about Haitian immigrants in Ohio eating their neighbours’ dogs and cats demonstrate the logic of “it may as well be true” that drives much conspiracy speculation. When challenged by fact-checkers, Vance at first seemed to acknowledge that it might have been a rumour and not a proven fact. But then he doubled down on the idea, suggesting that, even if not literally true, the story revealed an important truth about migration that was supposedly being kept secret by the mainstream media (Halpert 2024).
The force of the “it may as well be true” defence should also make us sceptical about some of the current discussions about Generative AI, deep fakes and conspiracy theories. There are indeed cases of fairly convincing AI-generated fakes, such as a current one from the US election supposedly showing recently-arrived Haitians voting in Georgia (Sardarizadeh and Robinson 2024). However, the problem at the moment is less GenAI deep fakes than so-called cheap fakes. Online conspiracy theorising about the war in Ukraine, the war in Gaza and the US election often deploys crude fakes or uses genuine but unrelated images and footage taken from other times and places. Often the misleading images and videos continue to go viral even after they have been debunked. People circulate what they feel to be true, even when they find out that, in a literal sense, the image is fake. In most cases, the fact-checking corrections are quickly available online, but they don’t tend to change people’s minds. When challenged, people fall back on the get-out defence that, even if the particular fact-checked item is fake, it nevertheless reveals a deeper truth. Researchers have found that posts about vaccination on social media that are misleading but not blatantly false get more traction than posts that have been fact-checked and flagged, suggesting that fact-checking can miss some of the most influential material (Allen and Rand 2024). Finally, many people drawn to conspiracy theories now claim that fact-checkers, the mainstream media, and academic researchers are themselves part of the conspiracy, and so will happily ignore any corrections—or use them as evidence of a conspiracy of the liberal, media elite to suppress the truth.
Myth #5
“We are witnessing an infodemic of misinformation, disinformation and conspiracy theories. They spread virally online and infect people’s minds. The best cure is to eradicate the contagious information, and to provide inoculation for the vulnerable public.”
The threat of conspiracy theories—especially in the online environment—is often framed in terms of viral contagion. For example, in a speech in February 2020, the director-general of the WHO presented the emerging pandemic in these terms. “We are not just fighting an epidemic,” he explained. “We’re fighting an infodemic. Fake news spreads faster and more easily than this virus, and is just as dangerous” (Ghebreyesus 2020). Academic researchers, counter-disinformation organisations, and the platforms themselves have explored a variety of interventions to combat the “infodemic,” including fact-checking, automated content warnings and media literacy education. However, the intervention that seems to promise the most success is what is known as prebunking or inoculation—exposing people in advance, in a controlled fashion, to the kind of misinformation they are likely to encounter. Researchers have found some evidence of the success of this approach (van der Linden et al. 2017; Roozenbeek, van der Linden, and Nygren 2020). However, the effects tend to be comparatively short-lived.
Leaving aside the question of its effectiveness, the problem with prebunking is that it relies on a series of metaphors that are unexamined at best and misleading at worst. The research talks of strategies of “inoculation” that build up “resistance” and even “immunity” to misinformation and conspiracy thinking (van der Linden 2023). This approach holds out the unrealistic hope of a silver bullet cure for the “disease” of conspiracism. Sometimes medical metaphors such as “inoculation” and “viral contagion” are meant quite literally. For example, some studies of Covid conspiracy rumours attempted to “model the spread of information with epidemic models, characterizing for each platform its basic reproduction number (R₀)” (Cinelli et al. 2022). Other studies tried to calculate the “incubation period” and “vectors of transmission” of particular pieces of misinformation (Ligot et al. 2021). However, most commentators use these metaphors merely as a convenient shorthand to suggest parallels between the way a virus spreads and the way conspiracy theories spread online. But ideas do not literally infect people’s minds like a virus. Prebunking and information inoculation cannot confer immunity. Discussions of the “infodemic” usually imply that social media is a particularly dangerous space of transmission, with viral memes able to bypass a user’s rational defence mechanisms. While cells have no conscious ability to resist infection by a bacterium or a virus, people do have some choice in whether to accept and pass on a particular piece of online content. At the very least, it is not inevitable that an individual recipient of online disinformation will succumb to its truth-altering message.
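For readers unfamiliar with the epidemiological notation, what such studies borrow is essentially the textbook SIR (susceptible–infectious–recovered) model. The following is a generic sketch of that standard model, with “infection” reinterpreted as exposure to a piece of misinformation; it is not the specific formulation of any of the papers cited:

\[
\frac{dS}{dt} = -\beta \frac{SI}{N}, \qquad
\frac{dI}{dt} = \beta \frac{SI}{N} - \gamma I, \qquad
\frac{dR}{dt} = \gamma I, \qquad
R_0 = \frac{\beta}{\gamma}
\]

Here \(\beta\) is the transmission (sharing) rate, \(\gamma\) the recovery (loss-of-interest) rate, and \(N\) the population size; \(R_0 > 1\) implies the content keeps spreading, while \(R_0 < 1\) implies it dies out. The critique that follows does not depend on how these parameters are estimated.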
Although the idea that misinformation spreads virally makes intuitive sense, communications and cultural studies scholars have long since shown that the “hypodermic needle” model of media influence is not accurate. While some conspiracy entrepreneurs act as disinformation “superspreaders,” for example, their community of followers are not merely passive recipients of their messages. Conspiracy theories are created and consumed in complex networks that cannot be reduced to a story about individual control by sinister puppet masters or the impersonal logic of the technology.
Describing misinformation and disinformation as the result of “brainwashing” or a “mind virus” (even if that is meant metaphorically) suggests that no one in their right mind would believe in such bizarre claims, unless they were the victims of a concerted campaign of deception and manipulation. The implication, for example, is that Brexit wouldn’t have happened, and Trump wouldn’t have been elected if there hadn’t been online manipulation that brainwashed voters. There’s often an implicit refusal to take seriously the idea that people could choose Brexit or Trump willingly. The notion of an infodemic doesn’t allow the possibility that people can reach very different conclusions because they rely on different trusted sources of knowledge and interpretive worldviews. Nor does it address the possibility that some people are adopting those positions not because of a failure of rationality but because they are—cynically but rationally—engaging in propaganda and partisan cheerleading for the causes they believe in. Or, more troubling, they are adopting those positions semi-ironically to troll pious liberals.
The idea of “inoculation” relies on the assumption that people are the unwitting victims of sinister campaigns of online influence. While that is sometimes true, it can’t make sense of the possibility that some people actively seek out conspiracy theories. In promising a technological fix to a problem that seems technological in origin, prebunking doesn’t address the underlying social issues driving conspiracy belief. Although the metaphor of “inoculation” suggests a highly targeted and scientific counter-disinformation strategy, in some cases the actual interventions might be described more simply as critical thinking and media literacy education, both of which we desperately need more of in schools and communities.
The idea of inoculating people against conspiracy theories or misinformation also relies on there being a clear distinction between accurate and misleading information, enabling us to “quarantine” dangerously viral conspiracy theories. While in some cases that’s possible, in others it’s much harder to clearly demarcate some ideas as misinformation (Williams 2024). For example, Flat Earth speculation relies on crank science inspired by a nineteenth-century evangelical sect (Weill 2022). But conspiracy theories about the Great Replacement, the Great Reset or the Deep State can blur into a grey zone of more legitimate theories about how power operates in our current world.
Presenting online manipulation as a devastatingly effective form of targeted mind manipulation can also play into the hands of the social media platforms. The platforms have a vested interest in convincing potential clients that online advertising is much more targeted and efficient compared to traditional media. However, the reality is “that much of the attention for sale on the internet is haphazard, unmeasurable, or simply fraudulent” (Bernstein 2021). The uncomfortable irony is that counter-disinformation research can also contribute to the hype about the unique and unprecedented powers of persuasion of online media.
Myth #6
“Conspiracy theories and other forms of misinformation are causing a crisis of trust in science, the media and democracy.”
The final myth is really just the encapsulation of all the others. With events like the storming of the US Capitol in mind, political commentators and the wider public have worried that conspiracy-infused misinformation constitutes a grave threat to democracy. The common perception is that conspiracy theories are eroding trust in government agencies, the judiciary, scientists, and journalists. During the Covid pandemic, many public health authorities worried that anti-vaccination conspiracy theories would undermine their ability to manage the crisis. In a similar vein, NGOs worry that conspiracy theories harm efforts to combat climate change and stoke hostility towards immigrants. The idea that conspiracy theories are creating a crisis of democracy seems self-evident, but it betrays a fundamental misunderstanding. I’m not claiming that there isn’t a serious problem. In fact, the problem is even more troubling than most commentary assumes.
While there are indeed many reasons to fear for the future, most discussions of misinformation get the causal direction the wrong way round. Conspiracy theories circulated on social media are not the cause of a crisis of trust in government, scientific and media institutions. Rather, they are a symptom of populist distrust. Likewise, with its promise of removing gatekeepers and disrupting authorities, the technology of the internet in general lends itself to populist conspiracy theorising and meshes with it ideologically. But it isn’t the root cause of the problem.
Conspiracy theories—at least in their literal claims—are often rooted in mistaken or misleading information. However, as I’ve been arguing, most conspiracy theorists don’t come to distrust the mechanisms of democracy because of some particular piece of misinformation they’ve read online. Instead, they are attracted to conspiracy theories and seek them out precisely because they have already begun to distrust those people and institutions they view as elitist. Even if they are wide of the mark in literal terms, conspiracy theories can speak to a sense of resentment that many people feel—sometimes with good reason, but at other times for the pleasure of imagining yourself the victim of an evil cabal. Conspiracy theories not only make an emotional appeal to feelings of grievance, of being victimised, of being excluded, of not being listened to and of being condescended to. They also offer a compelling narrative explanation for why things seem so unfair. By identifying an imagined scapegoat, they can provide an emotionally satisfying channel for blaming others.
The reality in many cases is that mis- or disinformation doesn’t turn people into conspiracy theorists. Mere exposure to conspiracy theories online is not likely, by itself, to push an innocent victim down the rabbit hole. Instead, conspiracy theories often provide people with ready-made narrative justifications for identity positions they have already assumed. If conspiracy thinking is partly a result of social inequality, unacknowledged grievances, and distrust of institutions, then any real solution must start with those underlying causes, rather than engaging in whack-a-mole attempts to eradicate the never-ending torrent of online misinformation. This unfortunately means that there are unlikely to be any quick technical fixes to the problem of conspiracy theories in the online world. But by understanding the true nature of the problem, we can begin to work towards a better solution.
© Professor Peter Knight, 2024
Further Reading
Birchall, Clare, and Peter Knight. 2022a. Conspiracy Theories in the Time of Covid-19. London: Routledge.
———. 2022b. “Do Your Own Research: Conspiracy Theories and the Internet.” Social Research: An International Quarterly 89 (3): 579–605. https://doi.org/10.1353/sor.2022.0049.
Brill, Steven. 2024. The Death of Truth: How Social Media and the Internet Gave Snake Oil Salesmen and Demagogues the Weapons They Needed to Destroy Trust and Polarize the World—And What We Can Do. Knopf Doubleday.
Butter, Michael. 2020. The Nature of Conspiracy Theories. Cambridge: Polity Press.
Butter, Michael, and Peter Knight, eds. 2020. Routledge Handbook of Conspiracy Theories. Abingdon, UK: Routledge.
Gatehouse, Gabriel. 2024. The Coming Storm: A Journey into the Heart of the Conspiracy Machine. BBC Books.
Hofstadter, Richard. 1964. The Paranoid Style in American Politics, and Other Essays. Cambridge, MA: Harvard University Press.
Merlan, Anna. 2019. Republic of Lies: American Conspiracy Theorists and Their Surprising Rise to Power. New York: Random House.
Muirhead, Russell, and Nancy L. Rosenblum. 2019. A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy. Princeton, NJ: Princeton University Press.
Uscinski, Joseph E., and Joseph M. Parent. 2014. American Conspiracy Theories. New York, NY: Oxford University Press.
References
Allen, Jennifer, Baird Howland, Markus Mobius, David Rothschild, and Duncan J. Watts. 2020. “Evaluating the Fake News Problem at the Scale of the Information Ecosystem.” Science Advances 6 (14): 1–6. https://doi.org/10.1126/sciadv.aay3539.
Allen, Jennifer, and David Rand. 2024. “Combating Misinformation Runs Deeper Than Swatting Away ‘Fake News.’” Scientific American, September. https://www.scientificamerican.com/article/combating-misinformation-runs-deeper-than-swatting-away-fake-news/.
Altay, Sacha, Manon Berriche, and Alberto Acerbi. 2023. “Misinformation on Misinformation: Conceptual and Methodological Challenges.” Social Media + Society 9 (1): 20563051221150412. https://doi.org/10.1177/20563051221150412.
Bergmann, Eirikur. 2024. Weaponizing Conspiracy Theories. Routledge.
Bernstein, Joseph. 2021. “Bad News: Selling the Story of Disinformation.” Harper’s Magazine, August 9, 2021. https://harpers.org/archive/2021/09/bad-news-selling-the-story-of-disinformation/.
Birchall, Clare, and Peter Knight. 2022a. Conspiracy Theories in the Time of Covid-19. London: Routledge.
———. 2022b. “Do Your Own Research: Conspiracy Theories and the Internet.” Social Research: An International Quarterly 89 (3): 579–605. https://doi.org/10.1353/sor.2022.0049.
Bowes, Shauna M., Thomas H. Costello, and Arber Tasimi. 2023. “The Conspiratorial Mind: A Meta-Analytic Review of Motivational and Personological Correlates.” Psychological Bulletin 149 (5–6): 259–93. https://doi.org/10.1037/bul0000392.
Brill, Steven. 2024. The Death of Truth: How Social Media and the Internet Gave Snake Oil Salesmen and Demagogues the Weapons They Needed to Destroy Trust and Polarize the World—And What We Can Do. Knopf Doubleday.
Budak, Ceren, Brendan Nyhan, David M. Rothschild, Emily Thorson, and Duncan J. Watts. 2024. “Misunderstanding the Harms of Online Misinformation.” Nature 630 (8015): 45–53. https://doi.org/10.1038/s41586-024-07417-w.
Butter, Michael. 2014. Plots, Designs, and Schemes: American Conspiracy Theories from the Puritans to the Present. Berlin: Walter de Gruyter.
———. 2020. The Nature of Conspiracy Theories. Cambridge: Polity Press.
Butter, Michael, and Peter Knight. 2016. “Bridging the Great Divide: Conspiracy Theory Research for the 21st Century.” Diogenes, October. https://doi.org/10.1177/0392192116669289.
———, eds. 2020. Routledge Handbook of Conspiracy Theories. Abingdon, UK: Routledge.
Chen, Annie Y., Brendan Nyhan, Jason Reifler, Ronald E. Robertson, and Christo Wilson. 2023. “Subscriptions and External Links Help Drive Resentful Users to Alternative and Extremist YouTube Channels.” Science Advances 9 (35): eadd8080. https://doi.org/10.1126/sciadv.add8080.
Cinelli, Matteo, Gabriele Etta, Michele Avalle, Alessandro Quattrociocchi, Niccolò Di Marco, Carlo Valensise, Alessandro Galeazzi, and Walter Quattrociocchi. 2022. “Conspiracy Theories and Social Media Platforms.” Current Opinion in Psychology 47 (October):101407. https://doi.org/10.1016/j.copsyc.2022.101407.
Cook, Jesselyn. 2024. The Quiet Damage: QAnon and the Destruction of the American Family. New York: Crown.
Costello, Thomas H., Gordon Pennycook, and David G. Rand. 2024. “Durably Reducing Conspiracy Beliefs through Dialogues with AI.” Science 385 (6714): eadq1814. https://doi.org/10.1126/science.adq1814.
Faddoul, Marc, Guillaume Chaslot, and Hany Farid. 2020. “A Longitudinal Analysis of YouTube’s Promotion of Conspiracy Videos.” arXiv:2003.03318 [Cs], March. http://arxiv.org/abs/2003.03318.
Fisher, Max. 2022. The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. Hachette UK.
Ghebreyesus, Tedros Adhanom. 2020. “Munich Security Conference: Speech by the WHO Director-General.” February 15, 2020. https://www.who.int/director-general/speeches/detail/munich-security-conference.
Grusauskaite, Kamile, Luca Carbone, Jaron Harambam, and Stef Aupers. 2023. “Debating (in) Echo Chambers: How Culture Shapes Communication in Conspiracy Theory Networks on YouTube.” New Media & Society, April, 14614448231162585. https://doi.org/10.1177/14614448231162585.
Guess, Andrew M., Neil Malhotra, Jennifer Pan, Pablo Barberá, Hunt Allcott, Taylor Brown, Adriana Crespo-Tenorio, et al. 2023. “How Do Social Media Feed Algorithms Affect Attitudes and Behavior in an Election Campaign?” Science 381 (6656): 398–404. https://doi.org/10.1126/science.abp9364.
Halpert, Madeleine. 2024. “Trump’s Running Mate Vance Doubles Down on False ‘Pet-Eating’ Claims.” BBC News. September 15, 2024. https://www.bbc.com/news/articles/cgj447j5711o.
Hofstadter, Richard. 1964. The Paranoid Style in American Politics, and Other Essays. Cambridge, MA: Harvard University Press.
Hornsey, Matthew J., and Samuel Pearson. 2022. “Cross-National Differences in Willingness to Believe Conspiracy Theories.” Current Opinion in Psychology 47 (October):101391. https://doi.org/10.1016/j.copsyc.2022.101391.
Knight, Peter. 2007. The Kennedy Assassination. Edinburgh: Edinburgh University Press.
Knight, Sophia, Clare Birchall, and Peter Knight. 2024. “Conspiracy Loops: From Distrust to Conspiracy to Culture Wars.” Demos.
Ligot, Dominic, Frances Claire Tayco, Mark Toledo, Carlos Nazareno, and Denise Brennan-Rieder. 2021. “Infodemiology: Computational Methodologies for Quantifying and Visualizing Key Characteristics of the COVID-19 Infodemic.” SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3771695.
Linden, Sander van der. 2023. Foolproof: Why We Fall for Misinformation and How to Build Immunity. Fourth Estate.
Linden, Sander van der, Edward Maibach, John Cook, Anthony Leiserowitz, and Stephan Lewandowsky. 2017. “Inoculating against Misinformation.” Edited by Jennifer Sills. Science 358 (6367): 1141–42. https://doi.org/10.1126/science.aar4533.
Munn, Luke. 2019. “Alt-Right Pipeline: Individual Journeys to Extremism Online.” First Monday 24 (6). https://doi.org/10.5210/fm.v24i6.10108.
Prooijen, Jan-Willem van, and Nienke Böhm. 2023. “Do Conspiracy Theories Shape or Rationalize Vaccination Hesitancy Over Time?” Social Psychological and Personality Science, June, 19485506231181659. https://doi.org/10.1177/19485506231181659.
Prooijen, Jan-Willem van, and Mark van Vugt. 2018. “Conspiracy Theories: Evolved Functions and Psychological Mechanisms.” Perspectives on Psychological Science 13 (6): 770–88. https://doi.org/10.1177/1745691618774270.
Ribeiro, Manoel Horta, Raphael Ottoni, Robert West, Virgílio A. F. Almeida, and Wagner Meira. 2019. “Auditing Radicalization Pathways on YouTube.” arXiv:1908.08313 [Cs], December. http://arxiv.org/abs/1908.08313.
Roose, Kevin. 2019. “The Making of a YouTube Radical.” The New York Times, June 9, 2019.
Roozenbeek, Jon, Sander van der Linden, and Thomas Nygren. 2020. “Prebunking Interventions Based on ‘Inoculation’ Theory Can Reduce Susceptibility to Misinformation across Cultures.” Harvard Kennedy School Misinformation Review 1 (2). https://doi.org/10.37016/mr-2020-008.
Sardarizadeh, Shayan, and Olga Robinson. 2024. “US Officials Say Russians behind Fake ‘Haitian Voters’ Video.” BBC News. November 1, 2024. https://www.bbc.com/news/articles/c9vnyl2jnpjo.
Shermer, Michael. 1997. Why People Believe Weird Things: Pseudoscience, Superstition and Other Confusions of Our Time. New York: W.H. Freeman.
Thalmann, Katharina. 2019. The Stigmatization of Conspiracy Theory since the 1950s: “A Plot to Make Us Look Foolish.” London: Routledge.
Uscinski, Joseph, Adam Enders, Casey Klofstad, Michelle Seelig, Hugo Drochon, Kamal Premaratne, and Manohar Murthi. 2022. “Have Beliefs in Conspiracy Theories Increased over Time?” PLOS ONE 17 (7): e0270429. https://doi.org/10.1371/journal.pone.0270429.
Walter, Annemarie S, and Hugo Drochon. 2020. “Conspiracy Thinking in Europe and America: A Comparative Study.” Political Studies, December, 0032321720972616. https://doi.org/10.1177/0032321720972616.
Weill, Kelly. 2022. Off the Edge: Flat Earthers, Conspiracy Culture, and Why People Will Believe Anything. Chapel Hill, NC: Algonquin Books.
Williams, Dan. 2024. “Misinformation Researchers Are Wrong: There Can’t Be a Science of Misleading Content.” January 31, 2024. https://www.conspicuouscognition.com/p/misinformation-researchers-are-wrong.
Wood, Gordon S. 1982. “Conspiracy and the Paranoid Style: Causality and Deceit in the Eighteenth Century.” The William and Mary Quarterly 39 (3): 402–41. https://doi.org/10.2307/1919580.