A new type of invasive porn is spreading on the internet. Using controversial technology, celebrities like Daisy Ridley, Natalie Dormer, and Taylor Swift have become unwitting porn stars, their faces realistically rendered into lewd poses and sex scenes. And you could be next.
Deepfakes, named after the Reddit user who started the trend, use machine learning to convincingly paste one person’s face onto another person’s body in GIFs and videos. Anyone with a sufficiently powerful graphics processor and a few hours to kill (or a whole day, for better quality) can make a deepfake porn video. FakeApp, the desktop tool that automates the process, has already reached 100,000 downloads, and there’s a growing audience for fake celebrity porn on Reddit and 4chan.
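Technically, the trick is usually described as a pair of convolutional autoencoders that share a single encoder: one decoder learns to reconstruct person A’s face, the other person B’s, and feeding A’s encoded face into B’s decoder produces B’s face in A’s pose and expression. The PyTorch sketch below is purely illustrative; the layer sizes, learning rate, and training loop are assumptions for the sake of example, not code from FakeApp or any actual deepfake tool.

```python
import torch
import torch.nn as nn

# Minimal sketch of the shared-encoder face-swap idea behind deepfakes.
# Assumes 64x64 RGB face crops, already detected and aligned; real tools
# add alignment, masking, blending, and far deeper networks.

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),    # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.1),  # 32 -> 16
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.LeakyReLU(0.1), # 16 -> 8
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, 512),  # shared latent "face code"
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.1),  # 8 -> 16
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),   # 16 -> 32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),          # 32 -> 64
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 256, 8, 8)
        return self.net(x)

encoder = Encoder()
decoder_a = Decoder()  # learns to reconstruct person A's faces
decoder_b = Decoder()  # learns to reconstruct person B's faces
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=5e-5,
)

def train_step(faces_a, faces_b):
    """One optimization step: each decoder reconstructs its own person's
    faces through the shared encoder."""
    opt.zero_grad()
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()
    return loss.item()

# The swap: encode a frame of person A, then decode with B's decoder,
# producing B's face in A's pose and expression.
with torch.no_grad():
    frame_a = torch.rand(1, 3, 64, 64)  # stand-in for a real video frame
    swapped = decoder_b(encoder(frame_a))
```

In practice, face detection, alignment, and blending steps sit on either side of this model, and the hours (or a full day) of GPU training mentioned above are what separate an obvious fake from a convincing one.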
The ability to turn anyone into a porn star, as long as you have enough high-quality images of their face, raises serious ethical and practical questions. It’s hard to say whether deepfakes are legal, considering they touch unsettled areas of intellectual property law, privacy law, and brand-new revenge porn statutes that vary from state to state. Regardless, who’s willing to host this stuff? And is it possible to stop it?
Reddit’s r/deepfakes community has received the most attention thus far, thanks to the initial report by Motherboard on Jan. 24 and other press coverage, but posters there are also conflicted about the NSFW nature of their work. The outside attention has been mostly critical, calling out deepfakes as a heinous privacy violation and the beginning of a slippery slope toward fake porn of non-famous people. Revenge porn is already ruining lives and tying the legal system in knots, and porn-making neural networks could severely compound the issue.
Some on r/deepfakes have proposed splitting into two or more subreddits to create a division between those who want to advance face-swapping as a consumer technology and those who just want to jerk off to fake videos of their favorite Game of Thrones actresses.
“This should be someplace that you can show your classroom, or parents, or friends, for a tangible example of what machine learning is capable of, but instead it’s just a particularly creepy kind of porn,” wrote one poster.
“And I say that as someone who has had a kink for fake celebrity porn since 2001,” he added.
These rare flashes of conscience are the reason that anonymous posters on 4chan have argued that Reddit shouldn’t be the home of deepfakes. They feel it’s too liberal—too feminist and “social justice warrior,” as the troll parlance goes—to reliably keep the porn coming.
“[T]here needs to be a proper website to host this stuff regardless, reddit is full of sjws,” wrote one anonymous user.
Some of the earliest deepfake porn posts have already been removed from Reddit, forcing frantic users to break out their backup copies. (There are always backup copies.)
Complicating matters even further, the fakes are mostly hosted outside of Reddit itself. They were initially uploaded to the popular GIF-hosting site Gfycat, but the site took swift action to remove them: deepfakes violate its terms of service. “Our terms of service allow us to remove content we find objectionable,” Gfycat told the Daily Dot via email. “We find this content objectionable and are actively removing it from our platform.”
Deepfake posters then took to Pornhub, which is one of the largest streaming porn providers online and also allows community uploads. One Reddit poster, who claims to be based in Ukraine, uploaded 27 deepfake porn videos, including some that had been deleted from Reddit.
But while Pornhub has a low tolerance for “nonconsensual” porn like these celebrity deepfakes, enforcement is a game of Whack-A-Mole, with new videos continuing to crop up.
“Regarding deepfakes, users have started to flag content like this and we are taking it down as soon as we encounter the flags,” a Pornhub spokesperson told the Daily Dot. “We encourage anyone who encounters this issue to visit our content removal page so they can officially make a request.”
Redditors have started to post their deepfakes on another file-sharing site, SendVid, which keeps links private unless the uploader decides to pass them around. SendVid responds to copyright takedown notices, though, so it may only be a stopgap for Reddit’s porn fiends.
That gives 4chan, where porn of every imaginable stripe has thrived for years, the advantage in the bid to become the internet’s deepfake porn clearinghouse. The site hosts its own GIFs and video files, and its famously permissive content policies mean they’re unlikely to be taken down without a legal threat. It’s not yet clear whether such threats are coming. Gfycat declined to say whether it had received any takedown notices for celebrity porn deepfakes.
The main challenge of hosting porn on 4chan is that threads there expire after a time limit. Several deepfakes threads have come and gone, but anyone looking for specific content will have to post a request and hope someone saved it. Right now, it looks like deepfakes are destined for collectors’ hard drives. From there, they might be periodically reposted to 4chan or packaged as ZIP and RAR files on sharing sites like Mega and Mediafire, or shared via BitTorrent trackers hosted outside the U.S.
When it comes to ill-gotten nude photos, though, 4chan is relatively tame compared to forums like anon-ib, the image board connected to the 2014 “Fappening” celebrity hacking incident. The site is dedicated to scoring “wins”—a.k.a. nudes—of everyone from celebrities to cam performers to cosplayers. Users can post requests for “wins” or revenge porn of any woman, and other posters often deliver. It seems like a natural home for deepfake collectors.
Another alternative is private rooms on Discord, the Slack-like group chat app favored by gamers and various organized troll groups. Motherboard reports that some of the most disturbing deepfake experiments—starring people’s friends and classmates—are already being shared in Discord chatrooms.
Like the real celebrity nudes that leaked during the Fappening, deepfakes might end up relegated to the seediest, most pop-up-ad-ridden corners of the internet, but it’ll be extremely difficult to get rid of them altogether.