The furor around deepfakes, porn videos that use machine learning to convincingly swap celebrities' faces into sex scenes, has largely died down since many hosting sites banned the clips months ago. But deepfakes are still out there, even on sites where they’re not technically allowed. Popular streaming site PornHub, which classifies deepfakes as nonconsensual content and theoretically doesn’t permit them, still hosts dozens of the videos.
BuzzFeed’s Charlie Warzel wrote on Wednesday that he’d found more than 100 deepfake videos on PornHub, and they weren’t particularly well-hidden. Searches like “deepfake” and “fake deeps” brought up dozens of clips. Several of the deepfakes he highlighted have since been taken down.
When deepfakes first made the news back in February, a PornHub spokesperson told the Daily Dot that the videos were against the site’s terms of service.
“Regarding deepfakes, users have started to flag content like this and we are taking it down as soon as we encounter the flags,” he said.
It appears users haven’t been all that vigilant about flagging the videos, especially now that deepfakes have been banned from Reddit and have faded from the headlines. PornHub could take a more proactive approach to combing the site for deepfakes, but that comes with costs: it would take additional employee time and money, and could also create legal complications if a celebrity ever decided to sue over a video.
PornHub VP Corey Price told Engadget the same thing PornHub has been saying all along: It will continue to remove any deepfake video “as soon as we are made aware of it.”
And it does seem to be acting on that promise: after BuzzFeed’s article was published, searches for “deepfakes” on PornHub returned no results. However, fake celebrity porn (featuring Mila Kunis and Jennifer Lawrence, among others) was still available under other search terms.