TikTok: ‘We remove 80m underage profiles a year’
TIKTOK removes 80 million accounts of underage users a year, an Oireachtas committee has been told.
The revelation came during a Children’s Committee hearing on child protection in the context of artificial intelligence.
The platform is designed for people aged 13 and over.
Representatives from Meta, TikTok, and X (formerly Twitter) were told by Irish parliamentarians that ‘social media is a cesspit’ and their companies were not doing enough to protect our children.
One of the issues discussed at committee was age verification of users on apps to protect youngsters. Meta’s head of public policy in Ireland, Dualta Ó Broin, suggested verification could be done at App Store level, taking the burden off individual apps – particularly newer companies that see rapid rises in users.
‘That would be a step forward,’ he said. ‘It would be a resolution of the age-verification question. We would still have huge responsibilities to ensure that all of these users are then placed into an age-appropriate experience.’
He said that other solutions included the process being done by telecommunications companies or by device.
The social media giant, which owns Facebook, WhatsApp and Instagram, said it dismantled 27 abusive networks and banned almost half a million accounts for child safety violations between 2020 and 2022.
Fine Gael senator Mary Seery Kearney raised concern about social media platforms’ ‘deliberate manipulation’ of users and the resultant ‘behaviour modification’.
Ms Seery Kearney said she wanted to see more time limits on app use, adding: ‘Social media needs to come with a mental health warning.’
TikTok’s public policy lead for child safety, Chloe Setter, said she ‘totally appreciates’ the senator’s concerns, but added there is no agreement among experts on what amount of time is considered ‘good’.
She said TikTok had take-a-break reminders, usage limits and push-alert cut-offs associated with age.
Meta’s director of safety policy, David Miles, told the politicians their concerns were justified and the company was working with safety experts.
Susan Moss, head of public policy at TikTok, replied: ‘I agree with you. Schools are a place for education. They’re not a place for smartphones and the internet.’
Claire Dile, X’s director for government affairs in Europe, said the company could make more effort to detect and remove harmful content as quickly as possible.
She said the company had launched a new moderation centre and had begun to rehire moderators as part of a variety of enforcement actions, including AI processes.
The committee was told that more than two million people use the platform here every month.