The Mercury News

Tech companies must be held accountable for abuse of kids

By Teresa Huizar. Teresa Huizar is CEO of National Children's Alliance, America's largest network of care centers for child abuse victims.

Meta CEO Mark Zuckerberg, during a recent congressional hearing on internet child safety, apologized to families of children harmed by social media platforms.

But apologies aren't enough. Social media companies have had ample opportunity to halt online child abuse and sexual exploitation. For more than a decade, they've refused to take decisive action. It's time for Congress to force their hand — by holding them legally liable for hosting images and videos of child abuse.

Social media and artificial intelligence have created a dangerous world for our youth. Since smartphones first became commonplace, the number of kids sexually exploited or harmed online has hit shocking new levels year after year.

In 2013, the CyberTipline operated by the National Center for Missing and Exploited Children received 1,380 reports per day of suspected child sexual exploitation. Today, the number is 100,000 per day. More than 99% of those reports involve online child sexual abuse material.

We've seen a staggering rise in “sextortion” — when an adult poses as a child or teen to solicit explicit photos and then blackmails the victim. In 2023, the CyberTipline received more than 186,000 reports of online enticement, more than a fourfold increase from 2021.

AI has opened up frightening new avenues for the creation and distribution of child sexual abuse material (CSAM). New software can draw on pictures already online, creating new material from old CSAM and re-victimizing exploited children. According to Stanford researchers, one popular database used to train AI contained more than 1,000 images of CSAM.

Self-policing hasn't worked. Internal Meta documents showed that Zuckerberg rejected specific child safety proposals, including the hiring of 45 new staff members dedicated to children's well-being.

Elon Musk gutted Twitter/X's advisory council focused on child sexual exploitation, online safety and harassment. YouTube and TikTok are under investigation in the European Union for their failure to protect minors.

This follows a familiar pattern of tech companies failing to adopt even basic child safety rules. Existing U.S. law prohibits companies from collecting personal information from anyone under the age of 13 without parental consent.

Social media platforms nominally comply with this law, but they make little effort to verify whether a 16-year-old user is actually a teenager or a 50-year-old predator masquerading as one.

In the early days of Myspace and Facebook, we failed to put protections in place. We can't turn back the clock. But we can create a strong federal approach today to ensure that more kids aren't victimized tomorrow.

That starts with reform of Section 230 of the Communications Decency Act, a provision tech companies use to shield themselves from legal responsibility for child exploitation on their platforms.

Any reform to Section 230 must remove the blanket immunity from liability that tech companies enjoy. If social media platforms are held accountable for harmful content, they will police it. When a 2018 carve-out to Section 230 made it illegal to facilitate prostitution online, Craigslist quickly removed its “personals” section — a popular venue for sex workers to solicit clients. Congress could devise similar carve-outs for child exploitation material.

The Kids Online Safety Act is a bipartisan bill languishing in Congress that would impose a duty of care on tech companies to prevent and mitigate harm to minors who use their platforms. It would require platforms to publish annual, independent audits examining risks to children.

We don't need another apology. Now is the time for lawmakers to enact meaningful safeguards to protect our kids.
