Victorian police have arrested a teenage boy from elite private school Bacchus Marsh Grammar, after deepfake sexually explicit images of about 50 female classmates were produced and circulated online.
As reported by the ABC, last Friday police were made aware of “a number of images” that were sent to a person in the regional Victorian area of Melton, close to Bacchus Marsh Grammar’s campus in Maddingley, west of Melbourne.
The images were allegedly AI-generated deepfakes that manipulated photos of female students at Bacchus Marsh Grammar to appear sexually explicit. About 50 girls from years 9-12 were victims of the alleged crime.
Speaking to the ABC, the school’s principal, Andrew Neal, described the images as “obscene”, condemning the “appalling” incident.
“(The female students) should be able to learn and go about their business without this kind of nonsense,” Neal told the ABC.
“These things are not funny… they are basically vicious and therefore they should be dealt with appropriately.
“It’s behaviour that needs to be dealt with in as firm a way as possible.”
Police have today revealed a teenage male student was arrested over the incident. The boy has since been released as “investigations remain ongoing”.
In a statement today, Bacchus Marsh Grammar said the school is “taking this matter very seriously” and has contacted the police.
“The wellbeing of Bacchus Marsh Grammar students and their families is of paramount importance to the school and is being addressed,” the statement reads.
A parent of a female student at Bacchus Marsh Grammar, who also works as a trauma therapist, said she viewed the images when she picked her 16-year-old daughter up from a sleepover on the weekend.
The parent, Emily, said she had a bucket in the car for her daughter, who was “sick to her stomach” from seeing the deepfake images.
“She was very upset, and she was throwing up. It was incredibly graphic,” Emily told the ABC.
“I mean they are children … The photos were mutilated, and so graphic. I almost threw up when I saw it.
“Fifty girls is a lot. It is really disturbing.”
In May this year, a 15-year-old student from Salesian College, a Catholic boys’ high school in Melbourne’s south-east, used AI to generate sexually explicit images of a female teacher, as reported by The Herald Sun. The boy was recently expelled from the school.
Deepfake pornography
According to research from 2021, 96 per cent of deepfakes created and circulated online are pornographic in nature. A report from the eSafety Commissioner in 2017 found one in 10 Australians has been a victim of image-based abuse, which includes the non-consensual creation of digitally altered sexually explicit material.
Victoria is the only state in Australia to criminalise the sharing of deepfake pornography. In 2022, the government made several significant reforms to the state’s sexual consent laws, including criminalising AI-generated sexually explicit imagery. Under those laws, the offence of digitally superimposing a person’s face onto an intimate image without their consent is punishable by up to three years in jail.
Last week, Attorney-General Mark Dreyfus introduced new federal legislation criminalising the sharing of deepfake pornography.
“We know it overwhelmingly affects women and girls who are the target of this kind of deeply offensive and harmful behaviour. It can inflict deep, long-lasting harm on victims,” Dreyfus said.
“The Albanese Government has no tolerance for this sort of insidious criminal behaviour.”
Under the changes, sharing non-consensual deepfake sexually explicit material will carry a criminal penalty of up to six years in jail. If the person who shared the image also created it without consent, an aggravated offence applies, with a higher maximum penalty of seven years in jail.
The new laws only cover deepfake images that depict adults. There are separate laws that cover the possession of sexually explicit images of children.