
Misinformation really does spread like a virus, mathematical models from epidemiology suggest

We are increasingly aware of how misinformation can influence elections. About 73% of Americans say they see misleading election news, and about half have difficulty distinguishing what is true from false.

When it comes to misinformation, “going viral” seems to be more than a simple buzzword. Scientists have found a close analogy between the spread of misinformation and the spread of viruses. In fact, the spread of misinformation can be effectively described using mathematical models designed to simulate the spread of pathogens.

Concern about misinformation is widespread. A recent UN survey found that 85% of people worldwide are concerned about it.

These concerns are justified. Foreign disinformation has increased in sophistication and scope since the 2016 US election. The 2024 election cycle saw dangerous conspiracy theories about “weather manipulation” that undermined proper hurricane management, fake news about immigrants eating pets that incited violence against the Haitian community, and misleading election conspiracy theories amplified by the world’s richest man, Elon Musk.

Recent studies have applied mathematical models from epidemiology (the study of how and why diseases spread through populations) to this problem. These models were originally developed to study the spread of viruses, but they can be used just as effectively to study the spread of misinformation on social networks.

One class of epidemiological models that suits misinformation well is the Susceptible-Infectious-Recovered (SIR) model, which simulates the dynamics between susceptible (S), infected (I), and recovered or resistant (R) individuals.

These models are generated from a series of differential equations (which help mathematicians understand rates of change) and are easily applied to the spread of misinformation. On social media, for example, misinformation is passed from individual to individual: some become infected, others remain immune, and still others act as asymptomatic vectors, spreading misinformation without being aware of it or negatively affected by it.
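
To make this concrete, here is a minimal sketch of the standard SIR equations integrated numerically, with “infection” reinterpreted as believing and sharing a false story. The parameter values (the transmission and recovery rates, the cohort size) are illustrative placeholders, not figures from the research described here.

```python
# Minimal SIR sketch: "infection" = believing and sharing a false story.
# Parameter values are illustrative placeholders, not taken from the article.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    """Standard SIR rates of change (dS/dt, dI/dt, dR/dt)."""
    S, I, R = y
    N = S + I + R
    new_infections = beta * S * I / N   # susceptible users exposed by infected ones
    recoveries = gamma * I              # users who stop believing/sharing the story
    return -new_infections, new_infections - recoveries, recoveries

N = 10_000                    # cohort of users
y0 = (N - 10, 10, 0)          # start with 10 "infected" accounts
beta, gamma = 0.5, 0.2        # transmission and recovery rates (illustrative)
t = np.linspace(0, 60, 601)   # days

S, I, R = odeint(sir, y0, t, args=(beta, gamma)).T
print(f"Peak share of the cohort infected: {I.max() / N:.1%}")
```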

These models are incredibly useful because they allow us to predict and simulate population dynamics and determine measures such as the basic reproduction number (R0) – the average number of cases generated by an “infected” person.

As a result, there is growing interest in applying such epidemiological approaches to our information ecosystem. Most social media platforms have an estimated R0 greater than 1, suggesting that the platforms have the potential for the epidemic spread of misinformation.
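
For the basic SIR sketch above, this threshold has a simple textbook form: R0 is the transmission rate divided by the recovery rate, and a falsehood can only take off when it exceeds 1.

$$
R_0 = \frac{\beta}{\gamma}, \qquad R_0 > 1 \;\Rightarrow\; \text{epidemic growth}, \qquad R_0 < 1 \;\Rightarrow\; \text{the falsehood dies out}
$$

With the illustrative values used in the sketch (β = 0.5, γ = 0.2), R0 = 2.5, comfortably above the epidemic threshold.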

Looking for solutions

Mathematical modeling typically involves either so-called phenomenological research (in which researchers describe observed patterns) or mechanistic work (in which the aim is to make predictions based on known relationships). These models are particularly useful because they allow us to examine how potential interventions can help reduce the spread of misinformation on social networks.

We can demonstrate this basic process with a simple illustrative model, shown in the graphic below. It lets us examine how a system might evolve under various hypothetical assumptions, which can then be tested.

Prominent social media personalities with large followings can become “super-spreaders” of election disinformation, potentially spreading falsehoods to hundreds of millions of people. This reflects the current situation in which election officials report being outgunned in their attempts to fact-check misinformation.

If we conservatively assume in our model that the chance a person becomes infected after exposure is only 10%, studies show that debunking misinformation has little effect. Under this 10% infection-probability scenario, the population infected by election misinformation grows rapidly (orange line, left panel).

A “compartmental” model of disinformation spreading over a week in a cohort of users, where the probability that exposure infects a susceptible, unvaccinated person is 10%. Debunking is assumed to have an effectiveness of 5%. When prebunking is introduced and is roughly twice as effective as debunking, the dynamics of disinformation infection change significantly.
Sander van der Linden / Robert David Grimes

Psychological “vaccination”

The virus spread analogy for misinformation is apt precisely because it allows scientists to simulate ways to counter its spread. These interventions include an approach called “psychological inoculation,” also known as “prebunking.”

This is where researchers preemptively introduce a falsehood and then refute it so that people become immune to misinformation in the future. It is similar to a vaccination, in which people are given a (weakened) dose of the virus to prepare their immune systems for future challenges.

For example, a recent study used AI chatbots to develop counterarguments to common election fraud myths. This included warning people in advance that political actors could manipulate their opinions with sensational stories, such as the false claim that “massive vote losses overnight will skew the election,” and giving them tips on how to spot such misleading rumors. These “vaccinations” can then be incorporated into population models of the spread of misinformation.

You can see in our graph that it takes much longer for people to build immunity to misinformation when prebunking is not used (left panel, orange line). The right panel shows how the number of misinformed people can be curbed when prebunking is used on a large scale (orange line).
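
As a rough illustration of the dynamics in the figure, the SIR sketch above can be extended with a prebunked compartment and a debunking rate. The structure and most values below (the contact rate, the prebunking uptake rate) are our own assumptions for illustration rather than the authors’ exact model; only the 10% per-exposure infection probability, the 5% debunking effectiveness, and prebunking being roughly twice as effective come from the figure caption.

```python
# Illustrative extension of the SIR sketch with debunking and prebunking.
# Compartment structure and the contact/prebunking rates are assumptions,
# not the exact model behind the figure.
import numpy as np
from scipy.integrate import odeint

def sir_interventions(y, t, p_infect, contacts, debunk_eff, prebunk_rate):
    S, I, R, P = y                        # P = "prebunked" (inoculated) users
    N = S + I + R + P
    force = p_infect * contacts * I / N   # daily infection risk for a susceptible user
    dS = -force * S - prebunk_rate * S    # infection plus inoculation of susceptibles
    dI = force * S - debunk_eff * I       # debunking moves infected users to recovered
    dR = debunk_eff * I
    dP = prebunk_rate * S
    return dS, dI, dR, dP

N = 10_000
y0 = (N - 10, 10, 0, 0)
t = np.linspace(0, 7, 701)                # one week, as in the figure

# 10% infection chance per exposure, 5% debunking effectiveness (from the caption);
# a contact rate of 5 exposures/day and 10%/day prebunking uptake are assumed.
debunk_only  = odeint(sir_interventions, y0, t, args=(0.10, 5, 0.05, 0.00))
with_prebunk = odeint(sir_interventions, y0, t, args=(0.10, 5, 0.05, 0.10))

print(f"Infected after a week, debunking only:  {debunk_only[-1, 1]:.0f}")
print(f"Infected after a week, with prebunking: {with_prebunk[-1, 1]:.0f}")
```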

The purpose of these models is not to make the problem seem scary or to suggest that people are gullible vectors of disease. However, there is clear evidence that some fake news spreads like a simple contagion, infecting users instantly.

Other stories, however, are more akin to a complex contagion in which people must be repeatedly exposed to misleading sources of information before they become “infected.”
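
The difference can be sketched with a toy simulation (our illustration, not the studies’ models): in a simple contagion one exposure is enough, while in a complex contagion a user only becomes “infected” after several independent exposures.

```python
# Toy comparison of simple vs. complex contagion (illustrative assumption only):
# a user is "infected" after reaching a threshold number of exposures.
import random

def days_until_infected(threshold, p_exposure_per_day=0.3, max_days=30):
    """Days until a user accumulates `threshold` exposures, or None."""
    exposures = 0
    for day in range(1, max_days + 1):
        if random.random() < p_exposure_per_day:
            exposures += 1
        if exposures >= threshold:
            return day
    return None  # never infected within the window

random.seed(0)
simple_times  = [days_until_infected(threshold=1) for _ in range(1000)]
complex_times = [days_until_infected(threshold=4) for _ in range(1000)]

mean = lambda xs: sum(x for x in xs if x) / max(1, sum(1 for x in xs if x))
print(f"Mean days to infection, simple contagion (1 exposure):   {mean(simple_times):.1f}")
print(f"Mean days to infection, complex contagion (4 exposures): {mean(complex_times):.1f}")
```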

The fact that individual susceptibility to misinformation may vary does not affect the usefulness of epidemiological approaches. For example, the models can be adjusted depending on how easy or difficult it is for misinformation to “infect” different subpopulations.

Although it might be psychologically uncomfortable for some to think about people in this way, most misinformation is spread by a small number of influential superspreaders, just as viruses are.

An epidemiological approach to studying fake news allows us to predict its spread and model the effectiveness of interventions such as prebunking.

Some recent work confirmed the viral approach using the social media dynamics of the 2020 US presidential election. The study found that a combination of interventions can be effective in reducing the spread of misinformation.

Models are never perfect. However, if we want to stop the spread of misinformation, we need to understand it in order to effectively counteract its damage to society.