East Bay Times

Deepfake nude pics on hit list of states

Legislatures are on front lines of new form of sexual exploitation in schools

- By Natasha Singer

Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.

A few weeks later, she and other female students learned that a male classmate was circulating fake nude images of girls who had attended the dance, sexually explicit pictures that he had fabricated using an artificial intelligence app designed to automatically “strip” clothed photos of real girls and women.

Mullet, 15, alerted her father, Mark, a Democratic Washington state senator. Although she was not among the girls in the pictures, she asked if something could be done to help her friends, who felt “extremely uncomfortable” that male classmates had seen simulated nude images of them. Soon, Mark Mullet and a colleague in the state House proposed legislation to prohibit the sharing of AI-generated sexually explicit depictions of real minors.

“I hate the idea that I should have to worry about this happening again to any of my female friends, my sisters or even myself,” Caroline Mullet told state lawmakers during a hearing on the bill in January.

The state Legislature passed the bill without opposition. Gov. Jay Inslee, a Democrat, signed it last month.

States are on the front lines of a rapidly spreading new form of peer sexual exploitation and harassment in schools. Boys across the United States have used widely available “nudification” apps to surreptitiously concoct sexually explicit images of their female classmates and then circulated the simulated nudes via group chats on apps like Snapchat and Instagram.

Now, spurred in part by troubling accounts from teenage girls like Mullet, federal and state lawmakers are rushing to enact protections in an effort to keep pace with exploitative AI apps.

Since early last year, at least two dozen states have introduced bills to combat AI-generated sexually explicit images — known as deepfakes — of people under 18, according to data compiled by the National Center for Missing & Exploited Children, a nonprofit organization. And several states have enacted the measures.
