Senators grill social media execs on children’s safety
Blumenthal to YouTube, TikTok, Snap: ‘For Big Tech a Big Tobacco moment’
WASHINGTON — Senators put executives from YouTube, TikTok and Snapchat on the defensive Tuesday, questioning them about what they’re doing to ensure young users’ safety on their platforms.
Citing the harm that can come to vulnerable young people from the sites — ranging from eating disorders to exposure to sexually explicit content and material promoting addictive drugs — the lawmakers also sought the executives’ support for legislation bolstering protection of children on social media. But they received little firm commitment.
“The problem is clear: Big Tech preys on children and teens to make more money,” Sen. Edward Markey, D-Mass., said at a hearing by the Senate Commerce subcommittee on consumer protection.
The panel recently took testimony from a former Facebook data scientist, who laid out internal company research showing that the company’s Instagram photo-sharing service appears to harm some teens.
“We’re hearing the same stories of harm” caused by YouTube, TikTok and Snapchat, said Sen. Richard Blumenthal, D-Conn., the panel’s chairman.
“This is for Big Tech a big tobacco moment ... It is a moment of reckoning,” he said. “There will be accountability. This time is different.”
Markey asked the executives — Michael Beckerman, a TikTok vice president and head of public policy for the Americas; Leslie Miller, vice president for government affairs and public policy of YouTube’s owner Google; and Jennifer Stout, vice president for global public policy of Snapchat parent Snap Inc. — if they would support his bipartisan legislation that would give new privacy rights to children, and ban targeted ads and video autoplay for kids.
Markey tried to draw out a commitment of support, but the executives avoided providing a direct endorsement, insisting that their platforms are complying with the proposed restrictions. They said they’re seeking a dialogue with lawmakers as the legislation is crafted.
That wasn’t good enough for Markey and Blumenthal, who saw in the responses a classic Washington lobbying tactic.
“This is the talk that we’ve seen again and again and again and again,” Blumenthal told them. Applauding legislative goals in a general way is “meaningless” unless backed up by specific support, he said.
“Sex and drugs are violations of our community standards; they have no place on TikTok,” Beckerman said.
This year, after federal regulators ordered TikTok to disclose how its practices affect children and teenagers, the platform tightened its privacy practices for users under 18.
Stout made the case that Snapchat’s platform relies on humans, not artificial intelligence, for moderating content.
Miller said YouTube has worked to provide children and families with protections and parental controls, such as time limits, to keep viewing limited to age-appropriate content.
“We do not prioritize profits over safety. We do not wait to act,” she said.