‘Deepfakes, doxxing, death threats’: The cost women pay to be online

Julie Inman Grant

It’s been almost nine years since I took the helm at eSafety, a role that’s required me to walk the precarious line between hope and harm, visibility and vulnerability. For coming up to three thousand days, I’ve had a front-row seat to how technology is harnessed both to transform and to traumatise.

The role of eSafety Commissioner might be described by some as one of the most energising and taxing in the Australian Public Service. Energising, because this is an extraordinary moment in human history, when digital tools connect us, inform us and empower us in ways previous generations could only dream of. Taxing, because the architects of this hyper-connected world remain, too often, wilfully blind to the harms they unleash through rushed and poor design choices made for the sake of profit.

This year, UN Women is shining a light on digital violence for ‘16 Days of Activism against Gender-Based Violence’. As we mark the start of this global campaign, I want to reiterate something simple but vital: online harms are real harms.

The stories my investigators and I hear every day from women and girls are sobering proof that technology-facilitated gender-based violence is taking an immense toll on our daughters, sisters, wives, mothers and friends. From ‘nudify’ sites that can undress you in an instant to faceless mobs calling for your death or rape, no woman is immune.

As eSafety Commissioner, I’ve experienced the three Ds of acute online abuse myself: deepfakes, doxxing and death threats. The harassment directed at women is more sexualised, more personal and more derogatory than abuse directed at men. Its intent isn’t simply to criticise or disagree; it is to diminish, humiliate and silence, whether you’re a prime minister, parliamentarian or member of the press.

This isn’t good enough. While we may have regulatory tools to remediate harm after the damage has been done, what we really need is for this next technological chapter to be characterised by Safety by Design.

Seeing the gendered patterns before they hit

I’ve sought to take a proactive and anticipatory stance to forecast how technologies might be weaponised against women. In this role, I’ve had the dubious privilege of spotting patterns of gender-based violence online before they reached public awareness.

For example, more than five years ago, we warned about the current wave of deepfake image-based abuse. We predicted the surging tide of technology-enabled coercive control: a creeping, insidious form of harm designed to ensnare its victims without detection. We also cautioned that generative AI would be used, as it now is, to produce so-called synthetic child sexual abuse material by harvesting images of real children from across the internet.

And last, but far from least, we’ve long voiced our concerns about the volume of high-impact online pornography and its relationship to sexual violence, including child-on-child sexual harms. Research suggests that pornography and other online media are contributing to greater awareness of, and willingness to engage in, strangulation. One 2024 Australian study found that 61.3 per cent of 18- to 35-year-olds surveyed had viewed strangulation pornography online.

At this point, my attention is back on AI and its convergence with immersive worlds. While public debate has rightly revolved around the many harms of generative AI, the sensory, embodied experiences of extended, virtual and augmented reality deserve urgent scrutiny. They’re already being exploited to simulate sexual assault — even ‘meta rape’.

Beyond AI: The immersive frontier

Today, we published our latest blog in the Tech Trends Converge series exploring the intersection between immersive environments and technology-facilitated gender-based violence. On these platforms, interactions blend movement, voice, gesture, text and images — a multi-sensory mix that reporting and moderation systems used on 2D platforms aren’t designed to handle.

Research shows that when a person’s avatar is touched or crowded, their body can register the sensation as if it were happening to their physical self. As with real-world trauma, the amygdala-hippocampus circuit is activated in the metaverse, triggering the body’s threat response and encoding the event as real trauma, not mere ‘digital play’. In other words, the psychological and physiological impacts are real.

Encouragingly, some platforms are beginning to act. We’re seeing gesture-based blocking, which lets users mute or block content without having to navigate to a menu; in-platform recording that captures the last few moments of a user’s experience in case evidence is needed; and personal boundary settings that automatically create space between avatars.

But this must go further. We need more scientific research and testing to genuinely embed Safety by Design in immersive environments, with particular focus on reducing acute forms of tech-facilitated abuse. These hyper-realistic and hyper-sensory environments need to build upon models of affirmative consent and allow for decisions that are discrete, granular and layered.

Without a serious design response, even an overhaul, tech companies risk recreating online the exclusions women and girls have long faced offline: a world that is hostile to them, even closed off to them.

Big Tech and the ‘tech bro’ culture

Sadly, it’s no surprise that AR, VR and XR were not designed to anticipate the unique forms of abuse targeting women and girls. Our own research revealed that just over 1 in 10 young gamers (11 per cent) had seen or heard other players expressing or sharing misogynistic ideas about men’s superiority over women.

The technology sector remains one of the most gender-imbalanced in the world, and gaming has long been known for its corrosive invective (remember 2014’s ‘Gamergate’?). Most game developers may not be able to tap into the lived experience of gender-based violence.

Recognising and addressing the gendered manifestations of abuse will not only benefit women and girls but can also lift safety standards and well-being for all segments of the community. Part of the equation of online gender-based violence is that society continues to peddle narrow and outdated ideas of what it means to be a man, with a dearth of positive role models for young men to look up to (issues extensively explored in our conversations with young men and youth-serving practitioners in our ‘Young men online’ qualitative research series).

A call for design leadership

As we enter 16 Days of Activism, one truth is clear: while technology may be neutral, the human forces weaponising it are not. The design of these services reflects the values and blind spots of those who build them, leaving them vulnerable to bad actors motivated to exploit technologies for their own ends. The task of all tech enthusiasts is not simply to contain harm but to re-engineer systems that prioritise safety, balanced with a broader range of human rights, from the start.

We can’t regulate our way to safety alone (though eSafety is doing its best with the world-leading regulatory tools we have). We must design for it, embedding digital safeguards from inception through every phase of the AI and metaverse design cycle.

Visit the eSafety website to find more advice and webinars to help protect women and girls online.

Feature image: eSafety Commissioner Julie Inman Grant.
