Dayton Daily News

It’s time to address online degradation of women, girls

- Nicholas Kristof is a columnist for The New York Times.

Alarms are blaring about artificial intelligence deepfakes that manipulate voters, like the robocall sounding like President Joe Biden that went to New Hampshire households, or the fake video of Taylor Swift endorsing Donald Trump.

Yet there’s a far bigger problem with deepfakes that we haven’t paid enough attention to: deepfake nude videos and photos that humiliate celebrities and unknown children alike. One recent study found that 98% of deepfake videos online were pornographic and that 99% of those targeted were women or girls.

Francesca Mani, a 14-year-old high school sophomore in New Jersey, told me she was in class in October when the loudspeaker summoned her to the school office. There the assistant principal and a counselor told her that one or more male classmates had used a “nudify” program to take a clothed picture of her and generate a fake naked image. The boys had made images of a number of other sophomore girls.

Fighting tears, feeling violated and humiliated, Francesca stumbled back to class. In the hallway, she said, she passed another group of girls crying for the same reason — and a cluster of boys mocking them.

Francesca started a website about the problem — aiheeelp.com — and met state legislators and members of Congress to call attention to the issue.

With just a single image of a person’s face, it is possible to make a 60-second sex video of that person’s likeness in just 30 minutes. That video can be posted on pornographic websites or on specialized deepfake sites.

In addition, there are “nudify” or “undressing” websites and apps of the kind that targeted Francesca. “Undress on a click!” one urges. These overwhelmingly target women and girls; some are not even capable of generating a naked male. A British study of child sexual images produced by AI found 99.6% were of girls, most commonly between ages 7 and 13.

When Francesca was targeted, her family consulted police and lawyers but found no remedy. “The police say, ‘Sorry, we can’t do anything,’” said her mother.

The problem is that there isn’t a law that has been clearly broken. “We just continue to be unable to have a legal framework that can be nimble enough to address the tech,” said Yiota Souras, chief legal officer for the National Center for Missing & Exploited Children.

The impunity reflects a blasé attitude toward the humiliation of victims. One survey found that 74% of deepfake pornography users reported not feeling guilty about watching the videos.

We have a hard-fought consensus today that unwanted kissing, groping and demeaning comments are unacceptable, so how is this form of violation given a pass? How can we care so little about protecting women and girls from online degradation?

In rare cases, deepfakes have targeted boys, often for “sextortion,” in which a predator threatens to disseminate embarrassing images unless the victim pays money or provides nudes. As I see it, Google and other search engines are recklessly directing traffic to porn sites with nonconsensual deepfakes. Google is essential to the business model of these malicious companies.

In one search I did on Google, seven of the top 10 video results were explicit sex videos involving female celebrities. Using the same search terms on Microsoft’s Bing search engine, all 10 were. But this isn’t inevitable: At Yahoo, none were.

The greatest obstacles to regulating deepfakes, I’ve come to believe, aren’t technical or legal — although those are real — but simply our collective complacency.

Society was also once complacent about domestic violence and sexual harassment. In recent decades, we’ve gained empathy for victims and built systems of accountability that, while imperfect, have fostered a more civilized society.

It’s time for similar accountability in the digital space. New technologies are arriving, yes, but we needn’t bow to them. Instead, we should stand with victims and crack down on deepfakes that allow companies to profit from sexual degradation, humiliation and misogyny.
