Forum Q&A: Ethan Tu and Billion Lee on How Taiwan’s Civil Society Collaborates to Address Information Integrity Challenges from Beijing

2024 has been called the “year of elections,” with over four billion people set to cast their ballots. The potential threat to the integrity of the information space is severe. For those concerned with how authoritarian powers might influence those elections, Taiwan is an illustrative case study. Earlier this year, Beijing launched sophisticated, AI-enabled information operations to undermine Taiwan’s elections. Taiwan’s civil society, tech community, and government responded with collaborative and innovative strategies that, in many ways, successfully countered this malign influence campaign.

To draw lessons from Taiwan’s experience, John K. Glenn and Kevin Sheives of the International Forum for Democratic Studies spoke with Ethan Tu, founder of Taiwan AI Labs, a Taipei-based research organization that uses AI to expose information operations, and Billion Lee, co-founder of Cofacts, a fact-checking initiative that uses collaborative crowdsourcing and an automated chatbot to combat information manipulation online, to unpack Beijing’s efforts to undermine Taiwan’s information space and better understand why civil society’s response was so effective.  


Ethan, your organization, Taiwan AI Labs, released a report on the use of AI-powered malign influence campaigns surrounding Taiwan’s election. We know China has long sought to undermine Taiwan and its information space as part of its goal of “national reunification.” What was unique about this election?  

Ethan Tu: Taiwanese civil society uncovered many troll account groups active on social media, spreading false and misleading information widely. This misleading content was favorable to the People’s Republic of China (PRC). They targeted issues of social and economic concern to Taiwan’s population to demonstrate the PRC model’s supposed superiority.  

During this election cycle, troll account groups also focused on certain issues, like Taiwan’s relationship with the United States. These online troll accounts disseminated false or misleading information about the United States’ “promise to support Taiwan in its development of biological weapons” during a recent visit by former Speaker of the House Nancy Pelosi.

These online actors also targeted Taiwan’s relations with its democratic allies. Whenever news outlets celebrate Taiwan and Japan’s close relationship, for example, troll accounts flood the online space with false or misleading information—like false claims that Japan dumps nuclear wastewater off Taiwan’s coast. These troll account networks release similarly false and inflammatory content when Taiwan’s relations with other states are highlighted.

Billion, you are a champion of collaboration with others outside of your organization. How have you operated Cofacts to ensure that information gets out to the public in Taiwan and elsewhere? What did these operations look like during Taiwan’s recent election season? 

Billion Lee: We provide a fact-checking chatbot service because it is the fastest tool we have to counter malign influence operations in real-time. People can access our information anytime and anywhere. We do not need our employees to work around the clock responding to information requests manually. This system makes our operations less expensive and less time-consuming. It also might be the best way to help people know the truth—we are transparent, and our information and source code are accessible to everyone. This is the only way to earn popular trust. Malign information operations will not be an issue if people trust government institutions and other official outlets. And while people may not trust the government, they might trust non-profits and civil society organizations.
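The core of such a chatbot can be sketched as a lookup against a crowdsourced database of previously checked claims. The sketch below is illustrative only: the data, function names, and matching threshold are hypothetical assumptions, not drawn from Cofacts' actual codebase.

```python
import difflib

# Hypothetical crowdsourced database of previously fact-checked claims.
# Each entry maps a claim to a volunteer-written verdict and explanation.
FACT_DB = {
    "japan dumps nuclear wastewater off taiwan's coast": (
        "false",
        "No evidence supports this claim; it originated with troll accounts.",
    ),
    "covid-19 came from the u.s. military": (
        "false",
        "This narrative has been debunked by multiple independent outlets.",
    ),
}

def check_message(message: str, threshold: float = 0.6):
    """Return the closest matching fact-check, or None if nothing is similar."""
    matches = difflib.get_close_matches(
        message.lower(), FACT_DB.keys(), n=1, cutoff=threshold
    )
    if not matches:
        return None  # Unseen claim: in practice, queue it for volunteers.
    verdict, explanation = FACT_DB[matches[0]]
    return {"verdict": verdict, "explanation": explanation}

reply = check_message("Japan dumps nuclear wastewater off Taiwan's coast")
```

The key design point is the fallback path: messages that match nothing in the database are routed to human fact-checkers, so the database grows as new rumors appear.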

During the elections, these online troll accounts also targeted Taiwan’s ruling party, accusing it of being untrustworthy, which further undermines the public’s trust in government. For instance, during the COVID-19 pandemic, there were posts spread online by these accounts that falsely claimed COVID-19 came from the U.S. military spreading disease-related toxic materials inside Wuhan. Perhaps as a result, last September, Taiwan’s mainstream media falsely reported that the U.S. government planned to invest in a lab in Taiwan to develop biological and nuclear weapons to use against the PRC. This story and the tactics underlying it are creative, but they are also harmful.

Despite its proximity to China, Taiwan’s relative governmental and societal stability makes it unique. Having just had elections, let me ask you, what might this collaboration look like if the situation changed and there was a new government in power?  

Billion: It does not matter which government is in power—civil society will continue its work. Cofacts does not accept funding from the government, so our operations will remain unchanged. We also hope that everyone can value openness and transparency. Although the governing party may change, anyone can monitor and fact-check it.

It does not matter which government is in power—civil society will continue its work.

Civil society is certainly one actor responding to these information manipulation operations, but one must consider others as well. Ethan, how do tech companies fit in this conversation? 

Ethan: In the United States, the public is well-informed about technology. Many work for Google, Microsoft, Apple, or other tech companies. The instinct to join a top company and create a good, if not perfect, product is strong. Then, if successful, companies commercialize these products. 

This dynamic is different in Taiwan. My organizations, the social media network PTT and Taiwan AI Labs, were both started as nonprofit, open-source projects whose work and products were shared with others. We gain and measure our success from our technology’s impact instead of our profit margins.

Big tech companies have codes of ethics. They can also build trustworthy AI systems. Yet ultimately, they are concerned primarily with profit, as is necessary in a private company. Unfortunately, their codes of ethics can be compromised by these profit-seeking priorities.

What do you think non-profits need to know so that there can be better collaboration and mutual understanding between the tech community and civil society organizations? 

Billion: Various civil society organizations have their own work and areas of expertise. People working together across multiple fields is the true definition of collaboration. We cannot just push for greater collaboration or connectivity; we must offer tangible solutions that permit such work. Having an open API is Cofacts’ contribution.  

We think of open source and open data as a religion. It is a social good, and we value it highly. We are committed to this principle. With our open application programming interface (API), a protocol for sharing data between systems that is easy to use and secure, other companies, organizations, or global partners do not need to develop their own API, nor do individuals need to incur the cost of hiring new engineers or purchasing new software to launch a new project on their own. They can use our source code and API.
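As an illustration, a third party could query such an open API with nothing beyond the standard library. Cofacts publishes a public GraphQL API, but the endpoint URL and field names below are assumptions and may differ from the current schema; consult Cofacts' own documentation before relying on them. The sketch builds the request without sending it.

```python
import json
import urllib.request

# Hypothetical endpoint; check Cofacts' documentation for the current URL.
API_URL = "https://api.cofacts.tw/graphql"

# A GraphQL query searching previously reported messages similar to `text`.
# Field names here are illustrative assumptions, not the exact schema.
QUERY = """
query ListArticles($text: String!) {
  ListArticles(filter: {moreLikeThis: {like: $text}}, first: 3) {
    edges { node { id text } }
  }
}
"""

def build_request(text: str) -> urllib.request.Request:
    """Build (but do not send) a POST request carrying the GraphQL query."""
    payload = json.dumps({"query": QUERY, "variables": {"text": text}})
    return urllib.request.Request(
        API_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Japan dumps nuclear wastewater off Taiwan's coast")
# Sending would be: urllib.request.urlopen(req) -- omitted in this sketch.
```

Because the query and response are plain JSON over HTTP, any chatbot, dashboard, or research project can reuse the same database without hiring engineers to rebuild it.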

We think of open source and open data as a religion.

Ethan: Civil society generally distrusts big tech companies, so they avoid engagement with them. Also, civil society does not know how dangerous the misuse of AI is or how convincing manipulated information can be. In the United States, there have been concerted efforts to craft AI legislation that exposes the problems in big tech companies and asks these companies to follow their codes of ethics more strictly; however, this issue has proven difficult to legislate.

Artificial intelligence is a black box, and since it is a black box, platforms claim they are not responsible for any of the issues that arise from its use. When pressed to explain how these technologies operate, they claim it is a confidential algorithmic secret. When a company is not responsible for its own algorithm or platform, freedom of speech is undermined. Although tech companies claim they protect free expression, they still fail to explain clearly how their technologies operate, and, thus, they end up protecting information manipulation instead.

Civil society generally distrusts big tech companies, so they avoid engagement with them. Also, civil society does not know how dangerous the misuse of AI is or how convincing manipulated information can be.

Puma Shen, one of the founders of Taiwan’s Doublethink Lab, once suggested: “[I]f you really want to persuade the public, I think the best way is that the government has to tell the public, hey, we’ve got a huge issue about fake news and disinformation, but then let the non-profit organizations take over.” Does that ring true to you? 

Billion: It does to a certain degree, but that does not mean only non-profits or civil society organizations are inherently trustworthy. We all understand that governments wield significant influence relative to civil society organizations and activists. They control tax systems. They can send someone to prison. They can run their own national media. So, compared to a government, civil society might not have much power. That is why democracy is important: Civic engagement and public discourse are vital to democracy because they buttress the credibility and reach of CSOs’ activity and messaging.

Ethan: Public-private partnerships are one area worthy of further consideration. Governments should acknowledge the importance of these relationships and enact policy and legal mechanisms to encourage these partnerships; however, government agencies should not be the principal evaluators. They should rely on third-party civil society organizations to conduct these evaluations and establish open-source agreements to maintain popular trust in these partnerships and uphold access to data that demonstrates their impact.

Billion, you work not only in the digital space, but you are also active in local communities offline. How do you connect the digital and real worlds?  

Billion: Taiwanese people communicate often in closed, online chatrooms. As such, it is difficult to monitor malign influence when it infiltrates these spaces. Most of the users are also over 50, and Cofacts wants to reach an even broader swath of society. We want to reduce the gap between those who can access and use our tools and those who cannot.

Our engagement efforts take place both on- and offline. Of course, we offer online courses for users to learn how to identify and counter false and misleading information. For our in-person meetings, we collect participant feedback on issues of concern and offer fact-checking training. We discuss what kinds of media we might choose to consume and what kinds of content are untrustworthy.

These workshops also target civic empowerment. We train open-source talent (i.e., people with experience accessing and using open-source data and tools) to find the data and information they need online, or to dive into their area of expertise. Finally, we offer media and digital literacy training. Considered together, we hope these efforts nurture a society of citizen fact-checkers.

When you both look ahead, what are you seeing on the horizon in terms of this challenge? 

Billion: TikTok, YouTube, and other platforms host content that is free for anyone to access. The advantage of this kind of free, widely accessible, and addictive content for malign actors is the ease with which they can manipulate it. These actors also disseminate their content in a variety of ways.

Propaganda on taxis and buses in Taipei, for instance, claimed that a vote for a certain party is a vote for war. As a society, we must be conscious of these methods and the vulnerability of our offline and online spaces to malign influence.

Ethan: We see the potential for greater manipulation of AI technologies to reinforce and amplify malign information campaigns. We cannot underestimate the destructive power of information manipulation, nor can we overlook its reach, especially when enhanced by AI. 

These tools are used widely in Taiwan, allowing malign actors to influence different target groups with specifically tailored narratives. Civil society organizations that work in this space worldwide should work with Taiwanese civic actors because we have extensive experience confronting online malign influence. 

There are also efforts around the world to legislate access to trustworthy information more consistently, but pressing questions remain. How can we build up societal resilience to malign influence? How can we rebuild trust? While there are important and noteworthy efforts to reinforce public awareness, more work is needed. 

Information manipulation campaigns are designed to change your views and influence election outcomes, and their success encourages autocracies around the world to continue supporting these malign influence initiatives. If we don’t recognize this dynamic, we won’t be able to counteract it.  

We see the potential for greater manipulation of AI technologies to reinforce and amplify malign information campaigns. We cannot underestimate the destructive power of information manipulation, nor can we overlook its reach, especially when enhanced by AI.

Ethan Tu is the founder of Taiwan AI Labs, a Taipei-based research organization that uses AI to expose information operations. Ethan previously served as the principal development manager in Microsoft’s AI & Research Group and is the founder of PTT, a Taiwanese social media network. He holds a degree in computer science from NTU.

Billion Lee is the co-founder of Cofacts, a fact-checking initiative that uses collaborative crowdsourcing and an AI-powered chatbot to combat information manipulation online. Since Cofacts’ establishment in 2016, she has organized over 2,000 contributors and built a database of over 100,000 entries tracking fake and misleading news. She earned her degree in political science from NTU.

This interview has been condensed and edited for clarity by John Engelken and Amaris Rancy. The views and opinions expressed here do not necessarily reflect those of the National Endowment for Democracy.



FOR MORE FROM NED’S INTERNATIONAL FORUM ON THIS TOPIC, SEE:

“Taiwan on the Frontline of China’s Information Operations,” a Power 3.0 blog post by Ko Shu-ling.

A Forum Big Question on “Borrowing Boats: How are Key Influencers in Latin America Amplifying CCP Narratives about Authoritarian Models?”

“Lessons from Ukraine: How AI is Accelerating the Response to Authoritarian Information Manipulation,” a Power 3.0 podcast episode featuring Ksenia Iliuk.
