Staff writer Mila Stricevic on the disturbing rise of deepfake pornography and its harmful impact on women’s safety.
One of the internet’s first mainstream encounters with ‘deepfake’ technology was a seemingly harmless viral video of Tom Cruise on social media. Despite the trivial nature of the video, serious ethical concerns were raised even then about the implications of unleashing this technology into the world. Just two years later, those concerns have been legitimised by the rapid circulation of deepfake pornography.
Deepfake technology allows people to generate pornographic material using nothing more than images of a person’s face. Victims of this technology are most often women and girls, whose faces are inserted into violent, graphic porn so realistic that it can be difficult to tell it has been computer-generated. Deepfake pornography is a freely accessible and increasingly popular avenue of violence toward women.
While it is illegal in Scotland to distribute intimate videos, images, or other content without consent under the Abusive Behaviour and Sexual Harm Act 2016, deepfake technology uses ‘likeness’ rather than featuring the victims themselves, making it difficult to legislate against. Furthermore, those intent on creating and distributing deepfake pornography often operate anonymously and are therefore hard to catch.
The harm caused by unregulated deepfake pornography is immeasurable. Victims suffer life-changing consequences including psychological distress, sexual objectification, and reputational damage. In 2021, one deepfake pornography website received over 38 million visits in eight months. This is clearly not a hidden market.
As Megan Farokhmanesh pointed out in Wired magazine, much of the discussion around deepfake pornography deliberately downplays the serious harm it can cause. Whether or not the content is real makes little difference to the impact on the victim. Earlier this year, the subject came to light again after Brandon Ewing, a streamer on Twitch, was caught viewing deepfake pornography featuring fellow female streamers. Among the victims, streamer QTCinderella said: “Even though it’s not my body, it might as well be. It was the same feeling – a violation that comes with seeing a body that’s not yours being represented as yours.”
Last month, a new report from the UK’s all-party parliamentary group (APPG) on commercial sexual exploitation demonstrated an inextricable link between the consumption of pornography and sexual violence against women. The report concluded: “What has become apparent during the course of this inquiry is that we cannot end the epidemic of male violence against women and girls without confronting and combating the contributory role that pornography plays in fuelling sexual objectification and sexual violence.”
The fact that sexual violence in pornography carries over into our streets is distressing, but it should hardly come as a surprise. Figures released in 2023 by the non-profit organisation Common Sense found that 8 in 10 teenagers who watch pornography do so for educational purposes. Of those, over 50% said they had viewed graphic porn depicting rape or people in physical distress. How can we expect young men to value respect and consent when mainstream porn epitomises violence and objectification?
Violence against women is increasing. It will continue to increase for as long as young people’s de facto sex education, pornography consumed in the absence of anything better, promotes aggression, sexual violence and rape.
But the online pornography landscape is changing. The APPG has recommended that all pornography websites verify consumers’ ages, and demanded that companies ensure anyone featured in pornographic content is of legal age and has consented. Yet glossy policy recommendations alone will not protect women and girls from the distressing effects of deepfake pornography.
And deepfake technology doesn’t just threaten women’s safety. There are legitimate concerns that it could be used for political gain, from interfering with elections to starting wars. This technology threatens the very notion of discernible reality and truth. Deepfake content is making it virtually impossible for any media consumer, from private citizen to government official, to tell fact from fiction.
Legislation passed by the European Parliament in 2022 focused on regulating consensual deepfake content. The UK parliament has announced its intention to criminalise non-consensual deepfake pornography through the Online Safety Bill. But sentiment is simply not enough.
With more and more of our lives sacrificed to the internet, we must continue to demand action from our government to tackle the dangers of unregulated, non-consensual deepfake technology. If we don’t, we are allowing our politicians to put the rights of artificial intelligence above those of their own citizens.