The Big Question: What Does COVID-19 Reveal About Mis- and Disinformation in Times of Crisis?


With the COVID-19 pandemic dominating headlines worldwide, the spread of the virus is mirrored by the spread of mis- and disinformation online. While major political events such as elections or violent conflicts are typical flashpoints for disinformation activities, the current global public health crisis can shed light on the ways actors in the information space—from clickbait page operators to tech platforms to authoritarian regimes—respond to moments of crisis. The International Forum for Democratic Studies asked six leading experts what insights about disinformation and the broader information space can be gleaned from the COVID-19 pandemic and the ensuing flood of misleading information.


Cédric Alviani is head of Reporters Without Borders East Asia Bureau in Taipei. A French national and a graduate of Strasbourg University’s journalism center, Cédric has worked in Asia since 1999, directing projects at the intersection of diplomacy, culture, and media.

The COVID-19 pandemic has provided new evidence of the relevance of the fight waged by Reporters Without Borders (RSF) for the free flow of news and information. In our globalized world, violations of this basic freedom are a matter for the entire international community, regardless of where they occur.

Without the censorship imposed by Beijing, the Chinese media would have informed the public about the seriousness of the epidemic much sooner, sparing thousands of lives and possibly averting the pandemic altogether. Withholding information on the grounds of “not spreading panic” is unacceptable and aggravates problems instead of helping to solve them.

Conversely, Taiwan’s example shows that the flow of information at a time of crisis does not undermine effectiveness. By opting for transparency, the Taiwanese authorities succeeded in getting the public to comply with the public health measures they were recommending, and many think this has helped them to rein in the epidemic. Like the coronavirus, censorship knows no borders and can wreak havoc. News and information manipulation is a major obstacle to human progress. It restricts free will and destroys the ability to discern. By distorting the parameters of the debate, it inevitably leads to bad decisions which, in turn, must be covered up by new lies.

Democratic countries should mobilize to rid the world of this anachronistic practice and to ensure that the post-coronavirus era is one in which information flows freely for the benefit of all. Only if we do this will humankind finally be able to resolve the political, social, and environmental challenges it is facing.

News and information manipulation is a major obstacle to human progress. It restricts free will and destroys the ability to discern. By distorting the parameters of the debate, it inevitably leads to bad decisions which, in turn, must be covered up by new lies.

—Cédric Alviani, Reporters Without Borders East Asia Bureau

Graham Brookie (@GrahamBrookie) is the Director and Managing Editor of the Atlantic Council’s Digital Forensic Research Lab (DFRLab), based in Washington, D.C. Prior to joining the DFRLab, Brookie served as an adviser on the National Security Council for strategic communications.

The world has overcome pandemics before, but humans today are more connected and have more access to information than at any point in history. By and large, this is a good thing. Scientists and medical experts can share information in real time. The latest guidance can reach the public in an instant. However, the sustained demand (and necessity) for updates creates an information environment vulnerable to manipulation—what the World Health Organization has called an infodemic.

We’re going to be building on our understanding of the infodemic for months and years to come. In the meantime, we’re going to continue to see passive and pervasive misinformation alongside disinformation spread for geopolitical gain by nation-states, for ideological gain by actors foreign and domestic, and for economic gain by everyone from small-time operators selling snake oil to large actors manipulating volatile market conditions.

Another challenge is information suppression: during a public health crisis that spans the world, withholding information can put lives at risk. For example, the initial number of cases in the Chinese city of Wuhan was reportedly far greater than disclosed. That information could have helped curb the spread of COVID-19 early on.

The focus on COVID-19 spans both global and hyper-local updates, and this is a distinct stressor for the information space. The latest information in your neighborhood or filter bubble may differ from what circulates across streets, borders, and oceans. The effect is cumulative and makes navigating the infodemic much harder for individuals and organizations alike.

Set against the noise of misinformation, the majority of social media platforms have prioritized promoting fact-based, publicly accountable sources, such as local or national health agencies, as a way of moderating false or misleading content about COVID-19. These steps are welcome and show what more platforms can do to curb misleading material given the urgency and the will to moderate, but they have also exposed the limits of moderating material for which there is demand and which spreads well beyond social media. It also raises the question of whether this action will be sustained beyond the current pandemic.

Set against the noise of misinformation, the majority of social media platforms have prioritized promoting fact-based, publicly accountable sources…these steps are welcome and show what more platforms can do to curb misleading material given the urgency and the will to moderate, but they have also exposed the limits of moderating material for which there is demand and which spreads well beyond social media.

—Graham Brookie, Atlantic Council Digital Forensic Research Lab

Sarah Cook (@Sarah_G_Cook) is a Senior Research Analyst for China, Hong Kong, and Taiwan at Freedom House. She directs the China Media Bulletin, a monthly digest in English and Chinese providing news and analysis on media freedom developments related to China. 

The Chinese party-state is relying on three key narratives to influence global understandings of the coronavirus pandemic: first, that the outbreak may not have originated in China; second, that China’s official data is accurate, the outbreak is under control, and the country’s response can serve as a model; and third, that China is now a global leader in medical aid, for which local populations are thankful.

Several things undermine the CCP’s intended message: early cover-ups which worsened the global outbreak, evidence of data inaccuracies, the CCP’s use of deliberate disinformation, the delivery of faulty medical products, and the revelation that “aid” was actually sold, not given freely.

As important as the message are the tactics. First, the Chinese influence apparatus mobilized existing channels to embed state media content into mainstream foreign media. This includes African news websites publishing Xinhua content based on long-standing exchanges and various Western news outlets running paid advertorials.

Second, recently created Twitter accounts belonging to foreign ministry spokespeople, Chinese diplomats, and entrepreneurs like Alibaba founder Jack Ma, a CCP member, are spreading these narratives alongside Chinese state media—despite Twitter being blocked in China. These accounts have promoted conspiracy theories claiming the US military brought the virus to China or proven falsehoods like the story that Italians called out from their windows to thank China for aid.

And third, several studies show how Twitter bots, hijacked accounts, and other inauthentic behaviors are being deployed to manipulate public perception. In one example, it was found that 37 percent of tweets with the hashtag #grazieCina (“Thank You China” in Italian) had been shared by such accounts.

While the first tactic is more typical of China’s efforts to influence international perception, the latter two are more closely associated with Russia. In some cases, Chinese messaging has piggybacked off existing Russian and Iranian networks on international social media. The combined use of existing and new channels matches trends in the evolution of Chinese foreign media influence since 2017 that Freedom House identified in a January 2020 study, “Beijing’s Global Megaphone.”

Several things undermine the CCP’s intended message: early cover-ups which worsened the global outbreak, evidence of data inaccuracies, the CCP’s use of deliberate disinformation, the delivery of faulty medical products, and the revelation that ‘aid’ was actually sold, not given freely.

—Sarah Cook, Freedom House

Joan Donovan (@BostonJoan) is Director of the Technology and Social Change (TaSC) Research Project at the Harvard Kennedy School’s Shorenstein Center, where she leads the field in internet and technology studies, examining online extremism, media manipulation, and disinformation campaigns.

Social media platforms and search engines are not optimized to handle global flows of knowledge, even when there is no pandemic. Platforms are primarily built to serve advertising to audiences and information to consumers. Scientific knowledge, though, is expensive to produce and is often behind paywalls. As a result, search engines and platforms primarily prioritize cheap and free information. This information curation strategy is antithetical to handling the knowledge needed to serve the globe during a pandemic. In an article about the design of technology and content moderation, I analyzed a different kind of crisis, in which white supremacists manipulated algorithms and other features of platforms and search engines to get their messages to the public. In that situation, tech companies had to augment their AI and remove offending accounts in order to tamp down their reach.

COVID-19 exposes and exacerbates problems already apparent in the design of Internet communication technologies, where media manipulators use low-tech tactics to appear in search results, send malware, hide their identities, and target vulnerable groups. It is going to take a massive overhaul of the entire Internet ecosystem to thwart these nefarious actors while at the same time serving the public what it needs: timely, relevant, and local information on demand.

Platforms are primarily built to serve advertising to audiences and information to consumers. Scientific knowledge, though, is expensive to produce and is often behind paywalls. As a result, search engines and platforms primarily prioritize cheap and free information.

—Joan Donovan, Harvard Kennedy School Shorenstein Center

Nina Jankowicz (@wiczipedia) is Disinformation Fellow at the Wilson Center, where she studies the intersection of democracy and technology in Central and Eastern Europe.

It’s no surprise that disinformers of all stripes—from domestic hucksters peddling snake oil miracle cures to hostile states like Russia and China—are using the coronavirus pandemic to push disinformation. Everyone is fearful for their health and the health of their loved ones, and because the virus is so new, we lack information about how to protect ourselves. We are eager for information about the virus’s spread, about public safety, and about the government’s handling of the crisis. In other words: the coronavirus pandemic is a gold mine for disinformers. The most convincing disinformation runs on emotion, and there’s plenty of it here.

The Kremlin never passes up a crisis like this, and so we’re observing Russian-backed purveyors of disinformation spreading narratives about what they describe as the West’s lackadaisical response. While these narratives exacerbate real problems and worries among Western populations, this information warfare also has another audience: Russians themselves. Pointing to Russia’s recent performative aid to Italy and, later, the United States, Putin can contrast the disjointed, sluggish coronavirus response in the West with Russia’s own response (likely paired with the suppression of true case statistics), making the argument that democratic systems cannot handle a crisis as well as authoritarian systems can.

We are eager for information about the virus’s spread, about public safety, and about the government’s handling of the crisis. In other words: the coronavirus pandemic is a gold mine for disinformers. The most convincing disinformation runs on emotion, and there’s plenty of it here.

—Nina Jankowicz, Wilson Center

Peter Kreko (@peterkreko) is director of the Political Capital Institute in Budapest, Hungary. He is also an associate professor in the social psychology department at ELTE University. He was formerly a Fulbright visiting professor at Indiana University. 

Whether we like it or not, we are living through history. The virus that surrounds us is changing institutions, political systems, and societies. This crisis reveals the limits of populism, but also gives autocrats opportunities to entrench their power. In times of crisis, voters usually stand behind authorities, and leaders today are receiving extra confidence from voters almost regardless of their handling of the crisis. This confidence can be used for good or ill.

In such times of crisis, disinformation and conspiracy theories spread virally. There is nothing new about this: in a classic book, Allport and Postman describe how high-stakes situations with high levels of informational ambiguity are the perfect breeding ground for rumors. The current pandemic unquestionably fits this category, and the spread of disinformation harms public understanding of the situation in at least four major ways: downplaying its importance, exaggerating it and stoking hysteria, shifting blame to other actors, or promoting pipe-dream cures that do not exist.

There are four steps that responsible stakeholders should take. First, they should call out sources of disinformation by name: authoritarian states such as Russia and China, clickbait sites, populist politicians, and even celebrities are sowing confusion and capitalizing on the crisis for power, money, and fame. Second, they should diminish the reach of disinformation. In times of information warfare, we cannot think only in terms of a “marketplace of ideas.” Misleading information can be deadly, such as the false claim that methanol can prevent infection. De-monetizing, burying, and removing such content is a must, and policymakers and tech platforms should embrace this approach without abusing it to silence critics. Third, democratic voices must debunk authoritarian myths. Russia, China, and other authoritarian countries are concealing their true infection rates and “helping” other countries to make their handling of the crisis look more successful, but this is nothing more than an unfortunately effective trick. And fourth, democratic governments should support academic research on disinformation. Science is the only thing that can push back the virus, and it is also effective at pushing back against disinformation about the virus.

This crisis reveals the limits of populism, but also gives autocrats opportunities to entrench their power. In times of crisis, voters usually stand behind authorities, and leaders today are receiving extra confidence from voters almost regardless of their handling of the crisis. This confidence can be used for good or ill.

—Peter Kreko, Political Capital Institute

Respondents’ answers have been edited for length and clarity, and do not necessarily reflect the views of the National Endowment for Democracy.



FOR MORE FROM NED’S INTERNATIONAL FORUM ON THIS TOPIC, SEE:

A series of three International Forum issue briefs focused on disinformation: “The ‘Demand Side’ of the Disinformation Crisis,” “Distinguishing Disinformation from Propaganda, ‘Fake News,’ and Misinformation,” and “How Disinformation Impacts Politics and Publics,” written by Dean Jackson.

“Firming up Democracy’s Soft Underbelly: Authoritarian Influence and Media Vulnerability,” a working paper published by Edward Lucas as part of the International Forum’s “Sharp Power and Democratic Resilience” series.

A Power 3.0 Blog Post on “Beijing’s Viral Disinformation Activities,” by Jessica Brandt of the Alliance for Securing Democracy.

“Demand for Deceit: How the Way We Think Drives Disinformation,” an International Forum working paper on the psychological drivers of disinformation, by Samuel Woolley and Katie Joseff.

 
