Anthony Rotondo fined $343,500 for deepfakes of six prominent Australian women

Anthony (Antonio) Rotondo has been fined $343,500 by the Federal Court for posting deepfake images of several high-profile Australian women.

The civil action was launched by the eSafety Commissioner, Julie Inman Grant, with the court also ordering Rotondo to pay her legal fees on top of the fine. 

The first case of its kind in Australia, the civil judgement was handed down by Justice Erin Longbottom on Friday afternoon in the Federal Court of Australia in Brisbane. 

Rotondo was found to have violated the Online Safety Act when he posted deepfake images of six different women to the website MrDeepFakes.com, which has since been shut down.

“I accept that [Rotondo’s] contraventions of the act were serious, deliberate and sustained,” said Justice Longbottom.

“They involved the respondent posting 12 non-consensual deepfake intimate images, depicting six individuals, across a six-month period to an online platform that was accessible worldwide.”

Commissioner Julie Inman Grant launched the civil action against Rotondo in 2023 after he replied to a removal notice, saying it meant nothing to him as he wasn’t an Australian resident. 

In her judgement, Justice Longbottom said Rotondo had failed to comply with the removal notice, meaning two images had remained on the website for almost a year. She noted he had “admitted liability” to the court earlier this year and quoted him describing the production of the deepfakes as “fun”.

Days after he received the removal orders, he forwarded the documents to eSafety staff and 49 other email addresses, including media outlets.

Later, Rotondo provided his password to investigators, who removed the images from the website.

The court has suppressed the names of the women targeted by Rotondo to protect their privacy.

A strong message

The eSafety Commissioner has said the civil penalty against Rotondo sends a strong message about consequences for anyone who perpetrates deepfake image-based abuse, which can cause significant psychological and emotional distress for the victims. 

In June 2021, Parliament passed the Online Safety Act 2021 to keep Australians safe online, including a new scheme to address serious online abuse of adults. The Act commenced on 23 January 2022.

Earlier this month, eSafety launched enforcement action against a tech company responsible for AI-generated ‘nudify’ services used to create deepfake sexualised images of Australian school children.

This month, eSafety also issued a formal warning to a UK-based tech company for enabling the creation of child sexual exploitation material, including explicit deepfake images of students in Australian schools. 

Australians who have experienced image-based abuse (the non-consensual sharing online of intimate images, including deepfakes) are encouraged to report it. According to eSafety, allegations of a criminal nature should be reported to local police first, and then to eSafety.gov.au.
