What media and civil society leaders can do to mitigate technology-fueled misogyny in the 2024 elections

By Elayne Deelen and Katya Vogt

 

[AI-generated photo of a woman]

In 2024, elections will take place in multiple countries with a collective population of 4 billion people, nearly half the world’s population. The campaigns for these elections will unfold in digital spaces transformed by artificial intelligence (AI), which will amplify and accelerate the information threats that already abound during election cycles, given the volume of interests competing for voters’ attention. If left unchecked, AI can wreak havoc on elections by generating and mimicking voice, video, images, and text, and by rapidly spreading manipulative information. IREX is well positioned to address this threat through our media, technology, and gender equality and social inclusion work.

It is imperative that the threat and impact of technology-facilitated gender-based violence are not overlooked. The information threats that constitute this type of violence include gendered disinformation, cyber harassment, hate speech, stalking, doxing, and many others. Technology-facilitated gender-based violence is a pressing global threat to health, safety, and political and economic wellbeing—not just for those who experience it, but for society as a whole. It exacerbates other forms of harm directed at women, girls, and LGBTQIA+ persons based on racialized ethnicities, caste, [dis]ability, and other intersecting identities, and it is rapidly increasing in form, frequency, and sophistication, in no small part because of AI.

Nearly 40% of women globally have experienced technology-facilitated gender-based violence, and 85% of women have witnessed abuse directed at other women. Women in public-facing roles, including women in politics, women journalists, and women human rights defenders, are at higher risk of attack. The resulting systematic silencing of women in public spaces, whether through censorship and self-censorship, a reduction in civic and political participation, or forced or voluntary withdrawal from public leadership roles, is known informally as the “chilling effect.” Illustratively, one study found that over 80% of women parliamentarians surveyed had experienced abuse on social media, and nearly a third indicated that this had undermined their ability to effectively fulfill their mandates and express their opinions. In addition to the toll on those experiencing and witnessing this abuse, the silencing of women’s voices during election cycles undermines pluralism and democracy and exacerbates polarization and digital authoritarianism.

Thus, technology-facilitated gender-based violence in elections is more than violence against women. It is an early warning of instability, violence, and democratic backsliding, and a red flag for radicalization. Unethical political players and malign regimes can use AI to target their opponents, alter the course of elections and democracies, push authoritarian agendas, and fuel divisive public debate. As the volume of toxic gender narratives and abuse increases, these narratives risk becoming normalized, making it more acceptable for bystanders to share them.

AI has known weaknesses stemming from underrepresentation in data sets and bias in design and training data. Because open-source AI models can produce fast, human-like fakes at negligible cost and at scale, they can be used to create false content and manipulate elections. Here are some ways we can reduce AI’s harmful impact on women around the world:

Equip organizations and institutions to support women targeted online

We should be prepared to mitigate coordinated attacks on women candidates, including AI-augmented cybermob harassment enabled by armies of automated accounts and “swarming,” whereby positive or constructive comments on social media are drowned out by deceptive demands from fabricated constituents promoting radical agendas. We may also see increased doxing enabled by advancements in malware creation and hacking that can threaten women’s psychosocial and physical security. To prepare for these attacks, civil society can support institutions in establishing funds and developing legal, digital, psychosocial, and physical safety resources for women candidates targeted with abuse.

Raise awareness of toxic campaigns that discredit women candidates and enforce accountability for those who perpetrate them

More effective micro-targeting and personalization of campaign narratives aimed at women candidates will seek to manipulate voters and push women out of politics. These campaigns will contain an increasing volume of misogynistic content targeting women, much of it sexualized, with women holding intersecting identities targeted at higher rates. To address this issue, civil society should create initiatives that raise voter awareness, including prebunking toxic narratives and building demand for campaigns conducted with civility and respect. We should also incentivize political parties to sign commitments to ethical campaigning, with a specific code of conduct toward women candidates, and advocate for commitments to labeling AI-generated content in campaigns and election coverage. To increase election security and protect democracy, we should equip election officials with the skills to recognize gendered disinformation and deepfakes. IREX’s USAID-funded Transform Digital Spaces to Reflect Feminist Democratic Principles (Transform) program supports local civil society in strengthening competencies to address technology-facilitated gender-based violence, including through awareness raising, digital security, and lexicon development to improve content moderation.

Increase and improve content moderation efforts to expose AI manipulations 

Media organizations and fact-checkers will have to contend with more sophisticated manipulative content, including photorealistic images and interactive deepfakes that are increasingly difficult to quickly debunk, as well as with a proliferation of generative AI-created fake news sites providing fictitious election coverage. The media and fact-checking community can address these threats by exposing AI-based manipulations and verifying election officials’ accounts and sources of election information. IREX’s flagship media and information literacy approach, Learn to Discern, helps build critical skills—including information verification and discernment—that enable individuals to safely navigate the information ecosystem and to recognize and reject manipulation and hate speech.

Collaborate with civic technology organizations to develop tools that support survivors

In-model multilingual capabilities will make it easier to quickly create and disseminate hate speech in many languages, while text-based content moderation will become increasingly insufficient given the growing volume of video- and image-based content. A lack of contextualized moderation will also result in the takedown of survivors’ content, silencing them instead of the perpetrators. Civil society can help create lexicons of gendered and coded abuse and educate content moderators on how to use them in a survivor-centered way. Civil society can also collaborate with civic tech organizations on tools that detect AI-generated activity and flag it in election contexts, including in languages other than English.

Importantly, we must also work to broaden and mainstream our understanding of AI (what it does, how it works, and the threats and opportunities it presents) in order to demystify AI and support the application of AI tools to redress technology-facilitated gender-based violence and safeguard election integrity. This year, we can advance the ethical use of AI in democracy and governance by articulating needs and by prioritizing the training of AI with inclusive data and tools in underrepresented languages. This is also an opportunity to develop education and career opportunities for women in AI: today, only 6% of software developers and 12% of AI researchers are women. Successful project models include IREX’s U.S. State Department-funded SHE’s GREAT! program, which equips girls and youth with gender equality and social inclusion skills, as well as science, technology, engineering, arts/design, and math skills, to transform gender stereotypes and develop young women’s digital leadership.

As AI becomes more advanced and accessible, attacks against women candidates, journalists, and others in the public sphere are likely to increase in frequency, form, and sophistication. However, we should not treat this outcome as acceptable or inevitable. We can invest in preventing and mitigating the effects of technology-facilitated gender-based violence to make elections more pluralistic and democratic in 2024 and in the years to come.