When social media first appeared, many embraced it as a great equaliser, but it soon became apparent that the technology, far from being neutral, contained and even magnified the biases of its creators, particularly regarding gender bias and misogyny. The 2014 Gamergate scandal was a missed signal that much of the technology industry prioritised engagement over ethics. The current controversy around the Scarlett Johansson soundalike AI chatbot perfectly illustrates how AI can amplify existing gender bias. As AI spreads rapidly, we must be clear that we are not simply dealing with technology but with a complex instrument that can shape society – and that critical thinking and the humanities can provide a vital counterbalance.
The rapid expansion of social media has transformed how we interact publicly, opening up vast new avenues for engagement while also revealing deep-seated inequities across those same platforms. Gender-based online violence has become pervasive over the past decade, seen most clearly in the large-scale harassment of women in public roles, particularly those in journalism or politics. The widespread normalisation of misogyny and abuse mirrors and perpetuates systemic inequalities, and it prompts questions about the obligations of social media platforms to address such gender-based violence, as they – and not the individuals being targeted – are best positioned to prevent and mitigate abuse across their platforms.
The controversy over the Scarlett Johansson soundalike AI chatbot highlights a broader issue of gender bias, in which female voices and identities are imitated and exploited without consent, and underscores a systemic failure to adequately protect the rights and permissions of individuals, particularly women. As new technologies such as generative artificial intelligence become more commonplace, there are pressing questions as to how technology companies can address the systemic inequalities they reflect and reinforce, and what we can learn about doing more to tackle gender issues within technology.
Looking back now, the 2014 online abuse campaign known as Gamergate highlighted the extent of entrenched misogyny online and served as a crucial missed signal about the inherent biases and gendered nature of social media spaces, while also illustrating the critical need to include the humanities in the development of new media technologies. As we face a world shaped more and more by technology, it is imperative that we learn from the mistakes of the past and integrate media ethics into new technologies, so that we can build more equitable and thoughtful systems that consider the diverse needs and safety of all users.
Working at the Irish Independent in Dublin some 20-plus years ago, I briefly assumed the position of letters editor, a gatekeeping role which included sorting through the sacks of letters delivered to the offices at Middle Abbey Street and deciding which missives were fit to print. Back then, those sealed letters were the only vehicle for readers to make contact with the newspaper or talk about its journalism. Fast forward 25 years and readers can – and do – use all sorts of public social and digital tools to interact with journalists, thus bypassing any human editorial control.
The ability to bypass editors was initially perceived as a net good, as researchers suggested that the addition of newer voices, particularly those from traditionally marginalised communities, could enhance the quality of public discourse. This early idealism about rational users operating in a media technology utopia would in fact shape much of the understanding of social media platforms, creating an optimistic vision that dominated the academic literature over the past 15 years.
However, that optimism and idealism are in short supply now, as those same social media platforms have become increasingly unsafe for women and people from under-represented communities, particularly those working in politics and journalism. A 2020 study from UNESCO reported that 73% of female journalists surveyed had been targets of online abuse, while a 2024 Irish government report found that “abuse in political life is prevalent, problematic and is targeted disproportionately at women and minority groups. Online abuse is intensifying and becoming normalised, fuelled by the anonymity provided by digital platforms, and often driven by misogyny, sexism, racism and intolerance.”
Indeed, as Ireland prepares to vote in local elections on 7 June, the AILG (Association of Irish Local Government) has reported that social media is the main source of safety concerns for elected councillors, with female councillors eight times more likely to be targeted for abuse than their male counterparts. While these figures are clearly shocking, they also highlight the dual burden on women and members of under-represented communities, who are compelled to maintain public profiles in these hostile environments if they want to run for public office. Moreover, new research suggests that these high levels of toxic discourse create lucrative revenue streams for the social media platforms, which underscores the danger of so much of the so-called public sphere taking place in spaces that are neither regulated nor properly critiqued.
Perhaps we should have paid more attention to Gamergate, the 2014 campaign of harassment against women in the predominantly male online gaming community, which now looks like a crucial missed signal in understanding the influence of technological design on gender norms and values. The campaign of online hate, driven under the guise of “just asking questions” by people who purported to be protesting so-called unethical games journalism – though the ethics claims were spurious at best and most of the women targeted were not even working in journalism – underscored the entrenched belief that gaming was a male preserve and that women should not challenge this traditional space. The harassment, which included threats, heckling and doxxing (where personal documents are released online in an effort to discredit and humiliate), spread rapidly across Twitter and Reddit, where the algorithms escalated the harassment, boosting sensational content to maximise engagement and sharing, all in the absence of human moderators or indeed any ethical safeguards.
The way in which the algorithms promoted this inherently biased content also recalls technological rationality – a 1941 concept from Marcuse, which holds that technology embeds and perpetuates the ideological values of its creators. At a minimum, Gamergate showed how deeply we had been misled by enthusiastic research that had wildly misread the potential for online harm and failed to understand how technological structures, rather than being neutral, often reinforce societal biases, particularly gender biases. The male-centric biases in the gaming community were not only reflected but amplified by technological designs that favoured engagement over ethical considerations, promoting and amplifying misogynistic content.
Gamergate could have served as a clear indicator of the urgent need for a deeper examination of the role of social media platforms in shaping cultural and social dynamics, particularly around gender and power. However, the media’s inconsistent coverage of the harassment versus the ethics debate muddied the public understanding of these issues, leading to a missed opportunity to address these biases at the time.
The issues are now pressing. The problems associated with toxic users and “dark participation” are such that they are now having a chilling effect on public debate and jeopardising the prospects for gender equality and democratic ideals. Recognising that social media spaces are inherently gendered, we need to consider issues of power, gender and media if we are to address and transform the inequalities they create.
While media and technology have been intertwined since the emergence of moveable type, the scale and pace of the change wrought by these newer media technologies have been “nothing short of earth-shattering”, forcing all who work in media technology to think far beyond the practicalities of “just setting up shop”.
Therefore, studying past events like Gamergate is about understanding and learning from history to inform future technology development, such as generative AI, which looms ever larger. Perhaps by looking back to Gamergate we can look forward, ensuring an approach that includes the humanities to better guide AI development and avoid the mistakes of the past. Gamergate should have been a watershed moment for re-evaluating the impact of gender on digital interactions and the responsibilities of those who oversee and participate in these spaces. Instead, it stands as a lesson on the consequences of underestimating the deeply ingrained biases that media technologies can perpetuate and amplify. In learning from Gamergate, we should treat AI not just as a technological innovation but as a complex social instrument that holds the power to shape societal values and norms, for better or worse. Moreover, those of us who teach and influence the next generation within the media technology space carry a significant obligation to impart these lessons.
While we will never return to the days of the human at the heart of the publishing process, it is imperative that we keep the humanities at the heart of media technologies. The letters editor may be a thing of the past, but critical thinking around new media technologies is more crucial than ever. The recent controversy surrounding the Scarlett Johansson soundalike AI chatbot highlights the urgent need for ethical oversight and human-centred considerations in the deployment of these technologies.
Profile
Kelly Fincham is a distinguished journalist, academic, and media expert. She has a PhD in Media and Communications (2022) and a PgCert in Higher Education Teaching (2023). With an impressive career spanning more than two decades, Kelly has become a leading authority on the impact of digital technologies on media practices and the evolving landscape of the hybrid media system.
Recently, Kelly collaborated with See Her Elected and the Association of Irish Local Government ahead of the 2024 local elections to share actionable insights from her research on strategies to mitigate unwanted online interactions. This work builds on her research presented at the biennial Future of Journalism conference at Cardiff University in 2023 and the annual conference of the Political Studies Association of Ireland in Belfast in 2023.
Currently, Kelly is delivering workshops and seminars on online safety practices for women working in journalism, academia, and politics.