It’s time to address online degradation of women, girls
Alarms are blaring about artificial intelligence deepfakes that manipulate voters, like the robocall imitating President Joe Biden's voice that went to New Hampshire households, or the fake video of Taylor Swift endorsing Donald Trump.
Yet there’s a far bigger problem with deepfakes that we haven’t paid enough attention to: deepfake nude videos and photos that humiliate celebrities and unknown children alike. One recent study found that 98% of deepfake videos online were pornographic and that 99% of those targeted were women or girls.
Francesca Mani, a 14-year-old high school sophomore in New Jersey, told me she was in class in October when the loudspeaker summoned her to the school office. There the assistant principal and a counselor told her that one or more male classmates had used a “nudify” program to take a clothed picture of her and generate a fake naked image. The boys had made images of a number of other sophomore girls.
Fighting tears, feeling violated and humiliated, Francesca stumbled back to class. In the hallway, she said, she passed another group of girls crying for the same reason — and a cluster of boys mocking them.
Francesca started a website about the problem — aiheeelp.com — and met state legislators and members of Congress to call attention to the issue.
With just a single image of a person’s face, it is possible to make a 60-second sex video of that person’s likeness in just 30 minutes. That video can be posted on pornographic websites or on specialized deepfake sites.
In addition, there are “nudify” or “undressing” websites and apps of the kind that targeted Francesca. “Undress on a click!” one urges. These overwhelmingly target women and girls; some are not even capable of generating a naked male. A British study of child sexual images produced by AI found 99.6% were of girls, most commonly between ages 7 and 13.
When Francesca was targeted, her family consulted police and lawyers but found no remedy. “The police say, ‘Sorry, we can’t do anything,’” said her mother.
The problem is that there isn’t a law that has been clearly broken. “We just continue to be unable to have a legal framework that can be nimble enough to address the tech,” said Yiota Souras, chief legal officer for the National Center for Missing & Exploited Children.
The impunity reflects a blasé attitude toward the humiliation of victims. One survey found that 74% of deepfake pornography users reported not feeling guilty about watching the videos.
We have established a hard-fought consensus that unwanted kissing, groping and demeaning comments are unacceptable, so how is this form of violation given a pass? How can we care so little about protecting women and girls from online degradation?
In rare cases, deepfakes have targeted boys, often for “sextortion,” in which a predator threatens to disseminate embarrassing images unless the victim pays money or provides nudes.

As I see it, Google and other search engines are recklessly directing traffic to porn sites with nonconsensual deepfakes. Google is essential to the business model of these malicious companies.
In one search I did on Google, seven of the top 10 video results were explicit sex videos involving female celebrities. Using the same search terms on Microsoft’s Bing search engine, all 10 were. But this isn’t inevitable: At Yahoo, none were.
The greatest obstacles to regulating deepfakes, I’ve come to believe, aren’t technical or legal — although those are real — but simply our collective complacency.
Society was also once complacent about domestic violence and sexual harassment. In recent decades, we’ve gained empathy for victims and built systems of accountability that, while imperfect, have fostered a more civilized society.
It’s time for similar accountability in the digital space. New technologies are arriving, yes, but we needn’t bow to them. Instead, we should stand with victims and crack down on deepfakes that allow companies to profit from sexual degradation, humiliation and misogyny.