The Boston Globe

Lawsuits over social media harm face tough legal road

Schools worry about students’ mental health

- By Gene Johnson

SEATTLE — Like the tobacco, oil, gun, opioid, and vaping industries before them, the big US social media companies are now facing lawsuits brought by public entities that seek to hold them accountable for a huge societal problem — in their case, the mental health crisis among youth.

But the new lawsuits — one by the public school district in Seattle last week, with a second filed by a suburban district Monday and almost certainly more to come — face an uncertain legal road.

The US Supreme Court is scheduled to hear arguments next month over the extent to which federal law protects the tech industry from such claims when social media algorithms push potentially harmful content.

Even if the high court were to clear the way for lawsuits like Seattle’s, the district has a daunting challenge in proving the industry’s liability.

And the tech industry insists there are many ways social media’s effects on teen mental health differ from, say, big pharma’s role in pushing opioid addiction.

“The underlying argument is that the tech industry is to blame for the emotional state of teenagers, because they made recommendations on content that has caused emotional harm,” said Carl Szabo, vice president and general counsel of the tech industry trade association NetChoice. “It would be absurd to sue Barnes & Noble because an employee recommended a book that caused emotional harm or made a teenager feel bad. But that’s exactly what this lawsuit is doing.”

Seattle Public Schools on Friday sued the tech giants behind TikTok, Instagram, Facebook, YouTube, and Snapchat, alleging they have created a public nuisance by targeting their products to children. The Kent School District south of Seattle followed suit Monday.

The districts blame the companies for worsening mental health and behavioral disorders including anxiety, depression, disordered eating, and cyberbullying; making it more difficult to educate students; and forcing schools to take steps such as hiring additional mental health professionals, developing lesson plans about the effects of social media, and providing additional training to teachers.

“Our students — and young people everywhere — face unprecedented learning and life struggles that are amplified by the negative impacts of increased screen time, unfiltered content, and potentially addictive properties of social media,” Seattle Superintendent Brent Jones said in an e-mailed statement Tuesday. “We are confident and hopeful that this lawsuit is a significant step toward reversing this trend for our students.”

Federal law — Section 230 of the Communications Decency Act of 1996 — helps protect online companies from liability arising from what third-party users post on their platforms. But the lawsuits argue the provision, which predates all the social media platforms, does not protect the tech giants’ behavior in this case, where their own algorithms promote harmful content.

That’s also the issue in Gonzalez v. Google LLC, the parent company of YouTube, set for argument at the Supreme Court on Feb. 21. In that case, the family of an American woman killed in an Islamic State group attack in Paris in 2015 alleges YouTube’s algorithms aided the terror group’s recruitment.

If the high court’s decision makes clear that tech companies can be held liable in such cases, the school districts will still have to show that social media was in fact to blame. Seattle’s lawsuit says that from 2009 to 2019, there was on average a 30 percent increase in the number of its students who reported feeling “so sad or hopeless almost every day for two weeks or more in a row” that they stopped doing some typical activities.

But Szabo pointed out that Seattle’s graduation rates have been on the rise since 2019, during a time when many kids relied on social media to keep in touch with their friends throughout the pandemic. If social media were truly so harmful to the district’s educational efforts, the graduation rate wouldn’t be rising, he suggested.

The companies have insisted that they take the safety of their users, especially kids, seriously, and they have introduced tools to make it easier for parents to know whom their children are contacting; made mental health resources, including the new 988 crisis hot line, more prominent; and improved age verification and screen time limits.

“We automatically set teens’ accounts to private when they join Instagram, and we send notifications encouraging them to take regular breaks,” Antigone Davis, Meta’s global head of safety, said in an e-mailed statement. “We don’t allow content that promotes suicide, self-harm or eating disorders.”

Facebook whistle-blower Frances Haugen revealed internal studies in 2021 showing the company knew Instagram negatively affected teenagers by harming their body images and worsening eating disorders and suicidal thoughts. She alleged the platform prioritized profits over safety and hid its research from investors and the public.

The schools’ suits say social media companies have created a public nuisance by targeting their products to children. (Richard Drew/Associated Press)
