‘No tolerance’: Sharing deepfake pornography will be criminalised under new laws

Mark Dreyfus, Attorney-General of Australia

People who share deepfake pornography will soon face serious criminal penalties, with new legislation on the issue set to be introduced into federal parliament.

Attorney-General Mark Dreyfus will introduce new legislation on Wednesday, acknowledging that the non-consensual sharing of digitally altered sexually explicit material is a “damaging and deeply distressing” form of gender-based abuse.

“We know it overwhelmingly affects women and girls who are the target of this kind of deeply offensive and harmful behaviour. It can inflict deep, long-lasting harm on victims,” Dreyfus said.

“The Albanese Government has no tolerance for this sort of insidious criminal behaviour.”

Under the changes, the sharing of non-consensual deepfake sexually explicit material will carry a criminal penalty of six years in jail. If the person also created the image without consent, there will be an aggravated offence with a higher penalty of seven years in jail.

The new laws only cover deepfake images that depict adults. There are separate laws that cover the possession of sexually explicit images of children.

“The Government’s reforms will make clear that those who share sexually explicit material without consent, using technology like artificial intelligence, will be subject to serious criminal penalties,” said Dreyfus, who expects the legislation to be supported by the entire parliament.

Dr Asher Flynn, Chief Investigator and Deputy Lead at the Australian Research Council Centre of Excellence for the Elimination of Violence Against Women, said the reform would address a major legal gap in Australia.

“The laws may also go some way towards curbing the accessibility of sexualised deepfake technologies. If it is illegal to create or produce non-consensual deepfake imagery, then it would likely reduce the capacity for the technologies, like free nudify apps, to be advertised,” Dr Flynn said.

“It is important that these proposed laws are introduced alongside other responses which incorporate regulatory and corporate responsibility, education and prevention campaigns, training for those tasked with investigating and responding to sexualised deepfake abuse, and technical solutions that seek to disrupt and prevent the abuse.”

Dr Flynn also pointed to a survey from 2019 that found 14 per cent of respondents aged between 16 and 84 in Australia, the UK and New Zealand had experienced someone creating, distributing or threatening to distribute a digitally altered image representing them in a sexualised way. 

“People with disabilities, Indigenous people and LGBTIQA+ respondents, as well as younger people between 16 and 29, were among the most victimised,” Dr Flynn said. 

Dr Flynn noted that monitoring by Sensity AI has found around 90 per cent of non-consensual deepfake video content created since 2018 has featured women.

“This year, we have had numerous reports of sexualised deepfakes being created and shared involving women celebrities (including Taylor Swift), young women and teenage girls.”

The announcement follows a crisis meeting of federal, state and territory leaders in May amid national outrage over gender-based violence and domestic violence. Following the meeting, Prime Minister Anthony Albanese committed to introducing legislation to criminalise the creation and distribution of deepfake pornography.

Other measures announced by the Prime Minister include a review of the Online Safety Act and proposed changes to address doxxing.
