Forum Q&A: Irene Mwendwa on Women, Data, and Democracy

As AI technologies assume increasingly prominent roles in everyday life and government decision-making, building representative tools that reflect democratic values is more important than ever. Doing so will require a multi-pronged approach that rethinks how we design AI systems from the bottom up. To better understand this challenge, Beth Kerley of the International Forum for Democratic Studies sat down with Irene Mwendwa, the executive director of the African feminist tech collective Pollicy, for a discussion on building more equitable digital ecosystems. The conversation touched on the importance of gender data, designing AI tools that enable civic engagement, and women’s participation in the development of new technologies.


Your organization, Pollicy, takes a unique approach to thinking about data and their impacts. How would you describe Pollicy’s role in working toward better data governance? 

Pollicy is an African feminist civic tech collective of technologists, data enthusiasts, and academics. Our name comes from a combination of opinion polling and policy making. We work primarily at the intersections of data, design, and technology, and our mission is to improve government service delivery. We seek to enhance our stakeholders’ data literacy, encourage them to use data to inform decision-making more effectively, and foster appropriate data governance practices.  

While many organizations work on data technology from an academic or policy angle, at Pollicy, we center our efforts on communities. We synthesize tech jargon, material on AI, and data science findings into simple language that people can understand. This enables our audiences to better understand the role data plays in their lives and to advocate for better data governance policies. 

You mentioned the role data plays in people’s lives. How would you characterize the relationship between data and public policy, and the significance of gender data in particular?  

Our work focuses on influencing a culture of responsible data use: We promote data governance practices and policies that support a democratic data ecosystem.  

While engaging with communities and listening to them, we found that public servants and community leaders have only a limited understanding of data. As one example, there are gaps in the uptake of gender data to influence and design policies that ensure gendered power dynamics are addressed. Apart from national statistics organizations, government agencies usually do not advance work using gender data—they just work with the general data that’s shared with them. Yet gender data really matters for ensuring that the needs of various communities are included at the level of decision making and policy formulation.  

How does representative data relate to the use and training of AI tools?  

We know that AI tools are not neutral and, so far, have replicated and magnified existing prejudices and biases that have long-lasting effects on our communities—particularly on women and other marginalized groups. If you look at how AI-generated synthetic media has been used in the wake of elections or during the recent protests in Kenya, it’s been used to abuse people as opposed to sharing knowledge.  

Beyond just the way these tools are being used, there are biases built into AI systems themselves. For example, if you prompt an image generation tool to depict an African country in the future with good roads, it will produce an image that may include a tarmac but lacks things like a pedestrian path or streetlights. But if one uses the same prompt for a city in the U.S., the tool creates an image of a perfect, almost utopian city. You can tell that the data feeding the algorithm still falls short of creating a full and accurate picture, and that localization and contextualization are needed. 

Ultimately, AI systems reflect the data on which they are trained, and biases permeate into these technologies because they are so heavily reliant on data to operate. Too often, we see designers or developers at big tech platforms create tools and apps without questioning whether these tools account for the needs of marginalized communities. Will potential solutions or new tools and platforms connect people in culturally sensitive and constructive ways? Or will they exclude certain groups, directly or indirectly? When it comes to AI tools in the public sphere, understanding cultural dynamics is critical, and gender data is one piece that can help to ensure equitable outcomes and feed into tools that account for local practices and customs. 

Could you elaborate on how a lack of representative AI could impact democracy? 

Going back to the example of an AI-generated image of a city, let’s say you ask for an image of a village in Kenya or a village in another African country. For certain models, one may find women depicted in these images or videos as people who do menial jobs. When African men are depicted in these videos or images, however, they are shown doing more high-status work. These results suggest that data has been fed into these models on the back end which is leading to a skewed depiction of African women. 

So let me take this to an election context: People are constantly creating content during election cycles. Some creators use these generative AI tools, for example, to make posters. When you rely on these tools for campaign posters, they reproduce the biases embedded in the models. So, you’ll sometimes look at AI-generated campaign posters and see they reduce women exclusively to the role of voters.  

In fact, there are some posters that will only show women in a crowd. The generated images never show a woman as an election official or political party candidate. They just show women at the end of the chain. That’s a problem. These AI-generated images may suggest this role is the only role for women in politics. And what does that communicate to young people who are maybe voting for the first time? It’s simple. If they aren’t used to seeing women depicted as candidates, they are less likely to think seriously about supporting any woman who does run for office. We are visual beings. What we see is what we end up using to make our decisions.

Even when regulators place restrictions on the use of generative AI during elections, people go outside these jurisdictions to hire service providers, who use these tools to depict women and youth in a negative light. And this is based on data that’s been taken from our region, taken into another market, used to feed tools that generate synthetic videos and fake images, and then shipped back online on apps, which candidates will use to intimidate, demoralize, and paint false pictures of their opponents—to limit their participation in public life. 

Given the challenges outlined above, what can be done to overcome barriers to representation in AI models? How should the technology sector adapt? 

I know many people will say that there’s a challenge in getting data in some contexts (like certain continents or countries, political jurisdictions, and markets). But at the same time, many widely used apps collect information that includes data from Africa. Companies have partnership deals in which they exchange data that is sourced from Africa, and the benefits to African societies are difficult to identify. Therefore, to be quite honest, many feel that we are being taken advantage of. There’s a lot of “data extractivism” that doesn’t benefit the source communities of that data. 

So, while the data exists, we lack the proper people in positions of power at tech companies to use this data in a constructive manner. And that’s why we’re always advocating for platforms to hire more Africans. Individuals with knowledge of local contexts can feed these models with more data that accurately reflects Africa (among other geographic contexts) and bring these technologies to the continent to build their own tools. There are more than 1,000 languages spoken across the continent and hiring by tech firms should reflect this diversity. 

In our work on Afro-feminist AI and data governance, Pollicy has advocated for principles that will help to safeguard values of agency, human dignity, privacy, and non-discrimination in the AI context. To achieve these goals, we need a fuller understanding by designers and policymakers of the actors, cultures, and networks that feed into data ecosystems, and more opportunities for diverse stakeholders—especially young people—to participate in data governance decisions. As technology advances, it needs to advance with us.  

What role can civil society play in promoting a more inclusive approach to data governance, data collection, and ultimately AI design? 

Regarding civil society engagement, research is number one. There is never—and will never be—enough research, because we always need to account for the lived realities of individuals and understand how they use these technologies.  

If we think about how fast-paced technological development is, the time to conduct research is now. It’s not later. It’s not when you have more money, or you have exactly the right skill set. You can start where you’re at. The notion that you need a technology or data science background to engage in this sector is harmful because it creates a bubble. Without such engagement, challenges to inclusion or shared ownership will persist.  

Second, civil society should promote awareness of these issues in local languages. More often than not, we all program in English, but there are indigenous languages that are underrepresented in this space. Civil society can create opportunities to engage in these languages to prevent their disappearance.  

Finally, in Africa (the continent with the youngest population globally), there are young people who still need to understand the power of tech and how it can serve the public interest. Civil society can bridge that gap so that people are thinking more consciously about how digital tools can serve the wider population. What do I mean by this? For instance, if you look at issues such as digital security, there are numerous tools that use generative AI to advise users on password protection and how to keep personal information more secure. These applications have not been adopted on the main platforms, but young people and innovators are finding ways to get the word out to encourage greater engagement on these issues.   

Throughout this discussion, you’ve highlighted how important representation is. How might fresh thinking about data and AI help reduce barriers to women’s participation in our digital ecosystem? How can we build AI tools that encourage civic engagement, particularly among women? 

For the communities with which we have engaged, representation, whether that be in government ministries or ICT companies or schools, is critical.  

Are women teaching STEM courses? Are people being taught how to prevent biases from permeating new technologies? We found that going back to the fundamentals, starting with education systems up to the government and tech company level, is really important to bridge that gap in terms of how women are perceived.  

We also need to encourage participatory approaches that ensure everyone’s views are taken into account. What do I mean by this? I don’t mean the simple box-ticking exercises where people say, “bring the women into the room.” We must actually listen to them when holding design meetings or governance discussions and consider their contributions carefully. Often the actors who make these decisions do not reflect society’s broader diversity, even though their decisions (or the tools they have created) will impact society writ large. So another approach we think is key is to encourage women to build and to use AI tools that promote civic engagement.

Lastly, we must ensure gender data uptake. We started with this idea, and I’ll close by saying that gender data uptake is invaluable because it draws out power dynamics. Designers, governments, and other stakeholders can use insights from gender data to resolve some of the persistent challenges facing women in the digital ecosystem, such as tech-facilitated gender-based violence, rather than having to go back to the women who have suffered and putting the burden on them to resolve the problem. Platforms can create tools that prevent abuse from reaching women in the first place. 

Irene Mwendwa is the executive director at Pollicy, where she was previously program manager for feminist movement building and advocacy. She is a legal professional with a strong track record in fostering innovative collaborations and spearheading projects that advance policy and legal frameworks, particularly in areas such as digital inclusion, elections, and technology. With extensive experience in human rights and tech policy, Irene is deeply committed to advocating for a feminist internet.

This interview has been condensed and edited for clarity by Maya Recanati and Beth Kerley. The views and opinions expressed within do not necessarily reflect those of the National Endowment for Democracy.



FOR MORE FROM NED’S INTERNATIONAL FORUM ON THIS TOPIC, SEE:

Leveraging AI for Democracy: Civic Innovation on the New Digital Playing Field, a report by Beth Kerley, Carl Miller, and Fernanda Campagnucci.

Digital Directions, a curated newsletter sharing insights on the evolving relationships among digital technologies, information integrity, and democracy.

Big Question: How Does Digital Privacy Matter for Democracy and Its Advocates? Hear answers from Adrian Shahbaz, Andrej Petrovski, Lindsay Gorman, Thobekile Matimbe, and Elizabeth Donkervoort on the following questions: How does digital privacy matter for democracy and its advocates? In what ways does the collection of digital data create risks to your own work or that of other democratic activists?
