EU law forces social media firms to remove child abuse
ALL social media companies will be required by law to proactively remove child abuse content, including from private encrypted messaging, under new duty of care laws.
EU legislation published yesterday has gone further than the UK in compelling social media platforms to use proactive technology, such as PhotoDNA scanning, to find abuse and remove it.
Companies that fail to do so face fines of up to 6 per cent of their global turnover, and could have their services closed down for at least four weeks if persistent breaches are found.
Companies, including Apple, Google and Meta, that run app stores will also, for the first time, be required to vet apps for child abuse risks. They will have to use age assurance measures to stop children from downloading apps that could expose them to a “high risk” of grooming.
Campaigners, including the NSPCC, see the moves as significant because the companies will have no way to escape the legislation.
The US federal government and individual states are also preparing to introduce similar online safety laws.
The EU rules are likely to benefit UK users as the companies are unlikely to drop standards for Britain alone. Andy Burrows, head of child safety online at the NSPCC, said: “This is an impressively bold and ambitious proposal to systemically prevent avoidable child abuse and grooming, which is taking place at record levels.
“If approved, it will place a clear requirement on platforms to combat abuse wherever it takes place, including in private messaging, where children are at greatest risk.
“Putting a duty on app stores to identify children and prevent them from downloading apps where there is a high risk of grooming will focus company minds on ensuring the problem is tackled on their platforms.
“This groundbreaking proposal could set the standard for regulation that balances the fundamental rights of all internet users while prioritising child protection.”
As it stands, the UK’s Online Safety Bill will not allow Ofcom to create a code of practice that requires companies to use proactive technology to find abuse in private messaging.
However, the UK is proposing tougher penalties for social media firms that breach their duty of care, including fines worth up to 10 per cent of their global turnover, blocks on access to UK users and criminal prosecution if they fail to cooperate with Ofcom.