The Daily Telegraph

How the White Paper has all tech firms, from Facebook to Fortnite, in its sights

Social media, games and messaging apps face tough sanctions drawn up to protect users from harm

- By Charles Hymas HOME AFFAIRS EDITOR and Mike Wright SOCIAL MEDIA CORRESPONDENT

TENS of thousands of tech companies are to be subject to a statutory duty of care to protect their users. The new rules will cover media giants such as Facebook, search engines like Google, interactive games such as Fortnite, and messaging apps and online forums such as Snapchat and Tripadvisor.

A White Paper, published today, says any tech company that allows the sharing of content or user interaction will be bound by the duty of care and will face draconian sanctions for breaching it. The sanctions range from fines based on a company's size to criminal prosecution of directors, removal of sites from search engines and blocks on access to users.

The regulator

The duty of care, which requires firms to take "all reasonable steps" to keep users safe and tackle illegal and harmful content, will be policed by an independent regulator.

The regulator will be funded by the tech industry in the medium term, with the prospect of a levy on firms to ensure long-term sustainability.

The regulator will be able to demand information from the companies, including details of controversial algorithms that have been blamed for driving content to users, as is thought to have happened with Molly Russell, 14, who took her own life after viewing self-harm material on Instagram.

It will also draw up legally enforced codes of practice spelling out what tech companies must do to protect users from harms such as terrorist content, child sexual abuse, illegal drugs or weapons sales, cyber bullying, self-harm, harassment, disinformation, violence and pornography.

Enforcement

The regulator will serve notices on errant firms to force them to put right breaches. It will also have "name and shame" powers, requiring companies to publish public notices on their sites about how they have failed to comply with standards. Every company will have to produce annual reports for the regulator on the scale of harms and what they are doing to combat them.

The fines are expected to mirror those under data protection laws, which run up to 4 per cent of global turnover. Ministers are consulting on additional powers, including disrupting businesses by removing them from search results and app stores and removing links to their sites.

There could be civil or criminal liability for senior staff over breaches and, as a last resort, the power to require internet service providers to block firms' access to UK users.

Codes of Practice

Each code, covering one of three areas of concern, says firms' platforms must be "safe by design": explaining clearly to users what counts as harmful content, how they can complain, how quickly they will get a response, and what action they can take, such as blocking, muting or staying hidden from other users.

1. Terror and child abuse: The regulator will decide the time frame within which platforms should remove content, such as the live-streamed New Zealand mosque massacre, and require them to identify and act on paedophiles' grooming of children. NSPCC statistics show there were 5,161 offences of grooming children online in 18 months. Breck Bednar, 14, was murdered after being groomed by a man he met via an online forum.

The two codes – which have to be signed off by the Home Secretary – will require tech firms to develop technology to stop terrorist or child abuse content getting online, and to prevent potential extremists or paedophile­s from searching their sites.

Although Facebook says it took down 14 million pieces of terrorist content last year, the Government says the Islamic State of Iraq and the Levant used more than 100 platforms to spread terrorist propaganda.

2. Disinformation: Firms will be required to have dedicated fact-checking, especially during elections, and to promote authoritative news sources. Platforms will be required to ensure their algorithms do not drive "extreme and unreliable" information to users to keep them online, which boosts advertising revenues.

3. Self-harm and suicide: Algorithms that promote self-harm or suicide to users will be banned, while firms will have to remove "illegal" or "unacceptable" content "rapidly". Other harms include hate crime, harassment and serious violence.


Age verification

Tech firms will be under a legal duty to protect children from inappropriate content and to enforce their terms and conditions, which set out age limits. This means they could face sanctions if they lack robust age verification. With up to 40 per cent of under-13s on social media, it will mean millions of children being removed or barred.

Platforms may have to impose different settings for children and filters to block inappropriate content.

Complaints

Users will have a legal right to an "effective and easy-to-access" complaints function, with a set timescale for responses and an internal appeals system. It will be backed by either an independent redress or super-complaints system.

Molly Russell, 14, who is thought to have taken her own life after viewing disturbing material on Instagram

Breck Bednar, 14, was murdered by a predator who groomed him online in 2014
