Non-Consensual AI Porn Skirts Privacy Rules, Still Wrong
This piece, originally published by The Conversation and republished by Mirage News, argues that new media technologies are quickly put to pornographic use and that generative AI has intensified the creation and spread of deepfake porn, making it far easier to fabricate explicit images or videos of real people. It notes that the harm is not limited to celebrities, pointing to reports of deepfake sexual content targeting classmates and teachers, including incidents involving children in schools.
The authors then examine what Australian law covers. They note that Australia amended its Criminal Code in 2024 so that AI-generated porn falls within the framework banning the non-consensual distribution of sexual material depicting others. But they highlight gaps: the key offence focuses on transmitting or sharing, so creation itself is not clearly a standalone offence. They also describe legal uncertainty over whether using online AI services to generate such content counts as transmission in the required legal sense, and they question how courts will interpret the mental element of recklessness as to consent.
The article argues that even if private creation is not clearly illegal, legality is not the same as ethics. It suggests that law alone rarely changes online behaviour, and observes that many people still feel privately making deepfake porn is wrong, even if it is hard to explain how it differs from having private sexual fantasies.
A major part of the piece critiques the common claim that deepfake porn is primarily a privacy violation. The authors acknowledge that deepfakes can look convincing, which is why victims feel exposed, but argue that these outputs do not actually reveal personal, target-specific sexual information. Instead, they are largely generic bodies combined with a recognisable face, and the authors caution against reinforcing the mistaken idea that such tools give genuine access to someone's real body.
Even so, the authors conclude the practice remains morally wrong. They argue that people have a strong interest in how they are depicted, that convincing sexual depictions can distort how others see a person, and that many targets suffer severe psychological and emotional harm. On that basis, they condemn even the private use of these tools as disrespectful and unjustifiable, regardless of whether it fits neatly into privacy-law categories.





