Digital Directions: March 3, 2022


Insights on the evolving relationships among digital technologies, information integrity, and democracy from the International Forum for Democratic Studies at the National Endowment for Democracy.

  • Kremlin disinformation operations are intensifying as Russia seeks to muddy the waters surrounding its aggression against Ukraine.
  • Platforms are taking bold new steps to limit the global reach of Russian state media.
  • Amid heightened geopolitical competition, governments are looking to shape new rules on cybercrime and new approaches to digital technical standards.


HOW IS RUSSIA USING DISINFORMATION TO MUDDY THE INFORMATION SPACE IN UKRAINE?

Russia’s use of disinformation to manipulate information ecosystems has been on full display during the country’s invasion of Ukraine. These efforts to spread misleading and/or demoralizing narratives about the war build on a long-term strategy developed not only to breed confusion around Russian military operations, but also to erode faith in democratic governance in Ukraine and beyond. Anne Applebaum and others have argued that Vladimir Putin fundamentally fears pro-democracy movements as a threat to authoritarian regimes. In this context, Russian disinformation seeks to halt the democratic trajectories of targeted countries, denigrating both democratic ideals and the role of Western countries—especially in what Moscow considers its rightful sphere of influence.

Moscow aims to depict democratic principles as destabilizing and democratic states as chaotic. In keeping with this overarching theme, it has repeatedly sought to undermine confidence in Ukraine’s democratic processes, leaders, and institutions. Distributed denial of service attacks and hacking attempts against Ukrainian election infrastructure as well as disinformation about Ukrainian presidential candidates all illustrate these efforts. Since 2014, Russia has used such disinformation “at the minimum, to demoralize; and at the maximum, to provoke a popular backlash against the Ukrainian government.” In the buildup to the current invasion, Russia has used cyber operations and disinformation narratives to “sow panic, to do everything to create a certain chaos in the actions of Ukrainians.”

Beyond seeking to spread confusion that saps faith in democratic governance, Moscow has peddled recurring propaganda narratives that actively deny Ukraine’s independent civic identity and stability. These tropes seek to de-legitimize Ukraine as a state, instead subsuming Ukrainian agency within a narrative of supposed Western aggression. Such efforts fit with propagandistic themes “positioning Russian Slavic Orthodox civilization in opposition to ‘decadent’ Europe.” Last year, pro-Kremlin outlets launched a flurry of disinformation aimed at “presenting Ukraine both as a relentless aggressor . . . and a hapless puppet of the US.” Pro-Russian information operations have spread false claims about Western democracies inciting conflict in Ukraine, invented charges of Ukrainian support for Nazism, and even assertions that U.S. military contractors were planning chemical attacks in the Donbas. These same themes came to the fore in Putin’s February 21 address, when he falsely claimed that Ukraine was developing nuclear weapons with Western support, and made sweeping ahistorical comments denying the legitimacy of modern Ukraine’s independent existence.

Beyond Russia, government-sponsored disinformation today makes up a central component of authoritarian efforts to undermine democratic norms and standards globally, subvert democratic societies, and obscure autocracies’ repressive and violent acts. Yet democratic governments are waking up to the threat these efforts pose, and launching initiatives to build societal resilience. To counteract the pernicious effects of authoritarian information manipulation, democracies need to capitalize on their own strengths. Whole-of-society approaches that leverage the tools, resources, and expertise of civil society organizations are crucial to spread accurate information, cultivate stronger information ecosystems, and counter the divisive and demoralizing tactics of autocrats and their supporters.

– Daniel Cebul and Elizabeth Kerley, International Forum for Democratic Studies



NOT ALL IS EQUAL IN INFLUENCE OPERATIONS INVESTIGATIONS: Building on a series of discussions with influence operation researchers, Carissa Goodwin and Dean Jackson’s new report for Carnegie’s Partnership for Countering Influence Operations highlights the disparities in access to resources, tools, and training that affect investigators in the Global South. Goodwin and Jackson highlight the need for “more financial support for actor-agnostic research” that would enable grassroots investigators to address influence operations by domestic actors. Discussion participants cited a “tendency by donors, governments, media, and other stakeholders to overemphasize the scale, reach, and impact of foreign (and especially Russian) influence operations in comparison to domestic operations.”

INFORMATION MANIPULATION AT THE 2022 WINTER OLYMPICS: Distributed online as well as through more traditional channels, the CCP’s Olympics-related propaganda efforts were employed in tandem with intensive censorship to obfuscate the party’s human rights abuses and undermine defenders of democracy. Before the games even began, Beijing intensified its well-resourced efforts to manipulate global public opinion by attacking foreign human rights advocates online, threatening governments participating in diplomatic boycotts of the Olympics, and claiming the U.S. plans to pay athletes to “sabotage” the games.



MOSCOW AND TECH PLATFORMS CLASH OVER MODERATION, STATE MEDIA: As the Kremlin further constricted Russia’s domestic information space in order to promote a sanitized version of its military assault on Ukraine, Moscow’s long-brewing conflict with Western social media platforms over content moderation practices rapidly escalated. Over the weekend, Russian authorities blocked Twitter and announced they would partly restrict Facebook due to a dispute over that platform’s downranking and labeling of state-owned Russian media. Meta and Google announced that they would stop accepting ads and de-monetize content from Russian state media, and Twitter expanded its labeling policy toward such outlets. As of February 28, Twitter had paused ads in Russia and Ukraine; YouTube and Facebook blocked some Russian state media in Ukraine; and Facebook announced it would also restrict RT and Sputnik in the EU, which declared its intention to officially ban the two Russian outlets.

A DISTURBING LOOK AT MODERATING FACEBOOK CONTENT IN AFRICA: A TIME investigation describes harsh working conditions for Facebook content moderators based in Kenya. Employees hired as moderators through Sama, an “ethical AI” outsourcing company based in California, told TIME that “the work that we do is a kind of mental torture.” Some reportedly resigned “after being diagnosed with post-traumatic stress disorder, anxiety, and depression.” Workers also complained of poor pay and denied breaks, and described pressure to make speedy decisions as well as the suppression of labor organizing (a charge Sama denies). Facebook’s persistent content moderation challenges are particularly acute in Africa.



RUSSIA, UKRAINE, AND CYBERSECURITY: Access Now, with two Ukrainian civil society groups, recently called for international support in the face of escalating cyberattacks on Ukraine’s public service infrastructure. Their recommendations include countering disinformation while keeping the internet accessible; providing cybersecurity assistance for Ukrainian journalists and activists; and emphasizing human rights in global cybercrime discussions. The appeal comes just before the opening of UN discussions on a new cybercrime treaty, a proposal spearheaded by Russia three years ago.

RAISING DEMOCRACIES’ GAME ON TECH STANDARDS: A recent Chatham House report discusses incentives for deeper cooperation among the U.S., U.K., and EU on digital trade and technical standards, despite differing regulatory philosophies. The report notes that these actors share an interest in ensuring a “free, open and global internet” and a democratic approach to emerging technologies. Moreover, geopolitical competition and alarm over China’s “New IP” proposal have prompted governments to pay more attention to industry-dominated standards bodies. In this context, the authors recommend supporting civil society and academic engagement on standards, as well as new coalitions with countries in the Global South.



MEDIA AND ENCRYPTED MESSAGING APPLICATIONS: “Private Gatekeepers: Encrypted Messaging Apps and News Audiences,” a new report from the Center for International Media Assistance, explores how media outlets are using EMAs to circumvent restrictions in repressive media environments and reach audiences in the face of technical challenges, resource demands, and sometimes, political pressure.

CENSORSHIP, SURVEILLANCE, HUMAN RIGHTS, AND THE 2022 BEIJING WINTER OLYMPICS: The International Forum for Democratic Studies has compiled a list of resources exploring how trends in censorship, surveillance, and human rights intersected at the 2022 Winter Olympics. The CCP leveraged technology and information controls to censor athletes and journalists, spread propaganda, and surveil Chinese citizens and foreign nationals.


Thanks for reading Digital Directions, a biweekly newsletter from the International Forum. 

Sign up for future editions here!


International Forum's Website