Los Angeles Times

It’s time to set up guardrails on kids’ social media use

Big tech companies refuse to make their platforms safer for children. It’s up to lawmakers to do it now.


From state capitols to Washington, D.C., lawmakers are scrambling to come up with regulations that can protect kids from the potential harms of social media, since the platforms have been unwilling to adopt reasonable safeguards themselves.

In just the last few months, Florida passed a law banning children under age 14 from having a social media account, Iowa legislators backed a bill that would require children younger than 18 to get parental permission to set up and use a social media account, and Colorado legislators passed a bill that would require platforms to display pop-up warnings on kids’ accounts after an hour of use.

A dozen other states, including California, are considering or have passed laws that would force companies to design their platforms to be safer for kids. Changes could include stricter privacy settings, limiting data collection and targeted ads, and removing features that encourage kids to stay online longer, such as infinite scroll and autoplay, which automatically launches a new video when one ends.

Congress is also working on bipartisan legislation that would similarly require social media companies to enact safeguards to protect children.

This legislation is driven by a growing understanding that social media apps can be addictive and dangerous to children’s mental health. The American Psychological Assn. this month again urged policymakers to require that tech companies reduce the risks embedded in their platforms.

Yet the drive for regulation is facing stiff pushback from the tech industry, which has lobbied against the bills and filed lawsuits to block new legislation from taking effect, arguing the laws are unconstitutional. California’s first-in-the-nation law to require that social media platforms be designed to protect children was blocked in the fall by a federal judge who said the law probably violates the 1st Amendment rights of the tech companies that it seeks to regulate.

With California’s first attempt held up in court, lawmakers are trying again this year.

Senate Bill 976 by Sen. Nancy Skinner (D-Berkeley) would require that social media platforms essentially turn off their algorithms for users younger than 18 and instead serve them content through a chronological feed of posts from people they follow and information that they’ve searched for.

The algorithms are designed to feed users a steady stream of content they didn’t necessarily ask for, keeping them on the app; that is why the algorithms have been called addictive. That content may be more extreme or dangerous than what the user initially searched for, or may not be age-appropriate.

The bill is sponsored by Atty. Gen. Rob Bonta, who sued Meta last year alleging the company used harmful and “psychologically manipulative product features,” such as “likes,” infinite scroll and constant alerts, to hook young people on Instagram and Facebook and keep them engaged for as much time as possible in order to boost profits.

SB 976 attempts to curtail some of those features. It would bar platforms from sending notifications to children between midnight and 6 a.m. and require that social media platforms give parents the ability to change the settings on their kids’ accounts, such as turning off notifications and setting time limits on usage. Parents could also allow kids to opt in to the algorithmic feed or turn off the restrictions.

These are reasonable safeguards and much less restrictive than proposals in other states, yet tech industry groups have opposed the bill. They argue that a chronological feed is no safer for children than an algorithmic feed; bad actors could flood a chronological feed with low-quality or harmful content that buries posts from family or friends. They also said the bill would run up against the same 1st Amendment challenges as other laws because, along with limiting minors’ ability to access and share information, it would impede adults’ access to lawful content: adults would have to prove their age to use the less restricted algorithmic feed and settings.

It’s likely that any law attempting to put guardrails on social media platforms will face legal challenges. This is complex legal and regulatory terrain, but that’s exactly why California lawmakers should keep pushing ahead with SB 976 and similar efforts. The tech industry has been unwilling to voluntarily change its practices to protect children. Lawmakers have to do it for them.

Photo (NurPhoto): State and federal lawmakers are working on regulations designed to protect kids on social media, which can be addictive and harmful to their mental health.
