How Coronavirus Created The Perfect Conditions For Conspiracy Theories

An interview with misinformation expert Claire Wardle on why false information thrives in a pandemic.

The coronavirus pandemic has led to a flood of rumors and conspiracy theories across social media platforms and group chats — and they’re having consequences in the real world. People have drunk cleaning products, attempted to burn down cellphone towers and shown up at protests brandishing anti-vaccine signs. Although social media has struggled with misinformation from the start, rarely has it presented such an urgent threat to public safety on such a global scale.

Protesters hold banners and chant against state government measures intended to curb the spread of COVID-19 during an Open California rally on April 26, 2020, in San Diego.
Sean M. Haffey via Getty Images

HuffPost talked with Claire Wardle, co-founder of First Draft, a nonprofit organization that monitors misinformation, about why these falsehoods are proliferating during the pandemic and whether we can learn from their spread.

What is it about this particular event that’s made it so conducive to misinformation and disinformation?

There are a couple of factors. One is that we’ve never really had a global story like this, where what we know is changing as quickly as it is. It’s almost like we’re on quicksand, and the absence of information that we can really hang onto means we have these information voids. That’s where we’re seeing this misinformation circulate.

Every week we’ve seen a different type of rumor. The earliest ones were the kind of WhatsApp and text messages saying, “Oh my god, there’s going to be a lockdown.” The thing is that wasn’t that far from the truth, and every single week when we see these rumors circulating, it comes from a place of everybody being frightened. It doesn’t matter your education level, it doesn’t matter really where you are or what you believe ― everybody is looking at this unprecedented situation and is desperately seeking information that will reassure them. Because we don’t have much of that information, we’re seeing misinformation spread.

It’s also a moment when almost everybody has access to a phone or computer, and everybody is consuming and sharing more information around one single topic. It’s no surprise that in that environment we’re seeing a lot of rumors, falsehoods and misleading information.

You mentioned initially there were those text messages that circulated false information. Has the spread of misinformation changed during the outbreak?

You know, when we think back to the 2016 election, there were a lot of fears about traditional fake-news sites created with headlines like “Pope Endorses Donald Trump.” We now have less of that kind of content, and a lot of what we’re seeing is rumors circulating in closed Facebook groups or WhatsApp groups. Earlier on, there were rumors about lockdowns, rumors about “10 things you need to know” ― like if you can’t hold your breath for 10 seconds, that means you’ve got coronavirus. That was low-level misinformation; there was nothing that really felt like it was harmful to life.

In the last two to three weeks, however, we’ve seen a lot more traffic around conspiracy theories, which have been present since January. Whether it’s 5G phone masts causing coronavirus, Bill Gates trying to microchip everybody or the claim that the virus was created in a lab as a bioweapon, that stuff has really come to prominence. I would argue that has happened at a point where people are starting to feel really out of control and are looking for explanations for why their world has been turned upside down. People are turning to these simple narratives, and we’re seeing them take hold.

It’s gone from pretty low-level rumors about how to treat potential symptoms or how to prevent the disease to much more of “where does this come from and what does this mean.” We’re seeing an intersection with anti-vaxxer movements, because the next phase here is people starting to think about the vaccine. Last week in particular, we saw a lot more misinformation around Bill Gates. Even in the protest movements, we’re seeing intersections with anti-vax groups that already had online networks and have now become activated.

Have you seen a rise in people trying to profit off this virus and use it for personal amplification and gain?

I mean, certainly. On the financial side, we’ve seen a ton of people buying up coronavirus-related URLs and domains. We’ve seen people trying to sell testing kits that don’t actually exist. We’ve seen phishing attacks trying to steal people’s information. In many ways, the tactics we’re seeing here are the tactics we always see. It’s just that now we’re seeing them connected to coronavirus, because people are scared and they’re more likely to click on something coronavirus-related. It’s the most effective way to use those old-fashioned scam tactics to get people to give away their information or to click on sites that will drive financial gain.

In terms of how platforms are addressing this, Facebook issued a statement saying it put warning labels on 40 million posts in March alone, but there’s still clearly rampant misinformation on the platform. How are platforms adapting? Is the problem just too big to control?

It’s a huge question, because for the last three and a half years there have been a lot of conversations about whether platforms are doing enough. We also haven’t really figured out how we, as a society, feel about different types of speech.

I think the platforms, quite rightly, have done more on coronavirus than I’ve ever seen them do before. But at the same time, I worry that there is a complete absence of oversight around what they’re taking down. You can read the policies and say, “Oh great, Twitter now has much stronger positions around taking down tweets if somebody goes against scientific advice.” In theory, that makes perfect sense. But they’re taking down a ton of stuff globally, and I don’t know what they’re taking down and who’s making those decisions.

On one hand, I think the strongest steps they’ve taken should in some way be applauded. My concern is the absence of oversight and of independent researchers who can ascertain the impact of these takedowns. I really struggle with the level of conspiracy content that still exists on YouTube, but at the same time, who’s studying this and really understanding what the harm might be? What we don’t have is longitudinal data about these conspiracies. My fear is, if we’ve got 18 months of conspiracies about Bill Gates, where does that leave us as a society? We have to think about the longer term rather than whether one particular piece of content breaks the rules.

When this is all over, what are the key questions that you and other researchers are going to look back on about how misinformation spread and how platforms addressed this?

Well, we’re in the same position we were in after 2016: we just won’t have an archive of the information. Every day the team finds stuff, and if we don’t screen-grab it quickly enough, it gets taken down. Right now we’re in this huge natural experiment of a global story where misinformation is having serious impacts on real-world behavior. As researchers, we simply won’t be able to study what was there, how widely it spread and how audiences changed their behavior because of it.

It kind of breaks my heart that people have been shouting about most of the lessons from 2016, but nobody moved on them enough, and the platforms didn’t create environments where that kind of information sharing could happen. Facebook has set up its oversight committee, but I don’t actually think it’s started yet. We’ve got this huge infodemic, but nothing has really changed. What lessons will be learned in two years’ time? The answer is very few. A lot of this misinformation will have been taken down, and we won’t even know that it was ever there in the first place.

This interview has been edited and condensed for clarity.

