Demand for deepfake pornography is exploding. We aren’t ready for this assault on consent
With cheap apps proliferating, how long until our likeness appears in a nonconsensual deepfake porn video?
In the ad, a woman in a white lace dress makes suggestive faces at the camera, and then kneels. There’s something a bit uncanny about her: a quiver at the side of her temple, a peculiar stillness of her lip. But if you saw the video in the wild, you might not know that it’s a deepfake fabrication. It would just look like a video, like the opening shots of some cheesy, low-budget internet porn.
In the top right corner, as the video loops, there is a still image of the actor Emma Watson, taken when she was a teenager, from a promotional shoot for the Harry Potter movies. It’s her face that has been pasted on to the porn performer’s. Suddenly, a woman who has never performed in pornography is featured in it.
The ads, which directed users to an app that makes deepfake videos, were discovered in more than 230 iterations across Facebook, Instagram and Meta’s Messenger, according to an NBC News investigation by Kat Tenbarge. Most of the ads featured Watson’s image; some others used the face of Scarlett Johansson. The same ads appeared in photo-editing and gaming apps available in the Apple App Store. Lest the message be lost on viewers, the ads make explicit that they are intended to help users create nonconsensual porn of any women they like. “Swap ANY FACE in the video!” the ads read. “Replace face with anyone. Enjoy yourself with AI face swap technology.”
Similar ads for deepfake services appear directly next to explicit videos on Pornhub. Though deepfake technology can theoretically be used for any kind of content – anything from joking satire to malicious political disinformation campaigns – overwhelmingly, the tech is being used to create nonconsensual porn. According to a 2019 report, 96% of deepfake material online is pornographic.
That figure might well be increasing. The ads on Meta and Apple platforms appeared as consumer demand for deepfake pornography is exploding. The surge comes on the heels of a controversy that rocked online video game communities in January, when a popular streamer, Brandon Ewing – who calls himself “Atrioc” – displayed deepfake pornography of several popular women streamers in one of his online broadcasts. He later admitted to having paid for the artificial porn of the women, who were his colleagues and friends, after seeing an ad similar to those that appeared on Meta and Apple platforms. The women whose images were commandeered for Ewing’s pornography issued angry and hurt responses; Ewing himself apologized. But the controversy seems to have only made the streamer’s overwhelmingly young and male follower base more aware of the availability of deepfake content – and eager to use it themselves.
Genevieve Oh, a researcher who studies livestreaming, told NBC that after Ewing’s apology, web traffic to the top deepfake porn sites exploded. That rapid increase over the past few weeks has followed a slower, but still alarming, growth of the deepfake revenge porn sector over the past several years. In 2018, fewer than 2,000 videos had been uploaded to the best-known deepfake streaming site; by 2022, that number had ballooned to 13,000, with a monthly view count of 16m. As deepfake revenge porn becomes more popular, the barrier to access is quite low: the app that misused Watson’s face in its ads charges just $8 per week.
The rapid increase in the number and availability of nonconsensual deepfake porn videos raises alarming questions about privacy and consent in the digital future. How will the huge numbers of women – and the smaller, but significant, numbers of men – who are affected by this new AI-enabled revenge porn manage their reputations and lives? As the technology improves, how will viewers know the difference between fact and AI-generated fiction? How can nonconsensual material be removed when the internet moves so much faster than regulation?
But the example of these apps – and of the men, like Ewing and his fans, who use them – also illuminates something older, and more uncomfortable, about the nature of porn: that men often use it as an expression of their contempt for women, and feel that the sexual depiction of women degrades and violates them. This is, in fact, much of mainstream porn’s appeal, at least according to the sentiments of many of the men who consume it: that it enables men to imagine themselves in control of women, and inflicting pain and degradation on them. Deepfake revenge porn, then, merely fulfills with technology what mainstream porn has offered men in fantasy: the assurance that any woman can be made lesser, degraded and humiliated, through sexual force. The nonconsent is the point; the humiliation is the point; the cruelty is the point.
There is no other way, really, to understand deepfake pornography’s appeal: it is not as if the internet lacks sexual content depicting real and consenting adults. What these apps offer their users is specifically and explicitly the opportunity to hurt women by forcing them into pornography against their will. After Ewing exposed his deepfake pornography to his streaming audience in January, one of the women depicted issued her own tearful video, describing how the malice and violation of the deepfake had wounded her. In response, a man sent her a picture of her own crying face appearing on his tablet. The screen was covered in semen.
For now, the women and others who are targeted by deepfake revenge porn have few avenues of legal recourse. Most states have laws punishing revenge porn, but only four – California, New York, Georgia and Virginia – ban nonconsensual deepfakes. Companies hosting the apps are often based overseas, mostly beyond the reach of legal enforcement – the company whose app was advertised on Meta appears to be owned by a parent company based in China. Meanwhile, more and more men will begin to use the technology against more and more women. “I was on fucking Pornhub … and there was an ad [for the deepfake site],” Ewing said in his apology video, by way of explaining how he discovered the AI revenge porn site. “There’s an ad on every fucking video for this so I know other people must be clicking it.”
Moira Donegan is a Guardian US columnist