Reddit bans pornography using the faces of celebrities
Reddit this week banned “deepfake” pornography — fake celebrity porn videos created with face-swapping technologies — and shut down the community of the same name, becoming the third major internet platform to crack down on the increasingly popular clips.
“This community has been banned,” the former deepfakes subreddit page reads. “This subreddit was banned due to a violation of our content policy, specifically our policy against involuntary pornography.”
The site also updated its rules to prohibit sexually explicit photos and images “that have been faked.”
The move comes after Pornhub prohibited deepfakes, saying the videos amount to nonconsensual content or revenge porn that violates the porn megasite’s terms of service. Twitter followed suit, vowing to suspend any accounts posting such content. The gaming site Discord and the GIF creator Gfycat banned deepfakes in January.
It doesn’t take a sophisticated production studio to produce face-swap porn, which is a major reason deepfakes have proliferated online in recent months.
Deepfake makers create their videos using a patchwork of readily available technologies, as Vice’s Motherboard has explained in detail. Often, they use open-source social media tools to download photos of victims en masse. Once they have enough face pictures of the targeted celebrity to work with — it typically takes a few hundred — they look for a suitable porn performer’s body to graft them onto. In this sense, performers are victimized, too.
The vast majority of deepfakes available online feature celebrities. But the implications for private individuals are the same. In a few clicks, all the photos in a person’s Instagram account could be downloaded, fed to the right software and turned into fake porn — or something nonpornographic but equally embarrassing or incriminating.