The Daily Telegraph

Social networks cannot shirk duty of care

It really isn’t beyond the ingenuity of social media companies to protect children online

WILLIAM PERRIN is a former adviser to Tony Blair and a trustee of the Carnegie UK Trust

When bringing up children in Britain we are lucky to be able to take so many things for granted. We can justly assume that the equipment in a playground is safe, that the water in a swimming pool has been tested, that the staff at a school are qualified and vetted. Our children can still take risks – which are essential to healthy growth – but the risks can be known and managed. This situation did not happen by accident: it is the result of judicious regulation.

But in the online world we cannot make these assumptions. We cannot manage the risks, which are more complex, sometimes unknown, often invisible to even the most attentive parent. That is why I agree with this newspaper’s call to place social media companies under a duty of care towards children who use their services, backed by a competent regulator like Ofcom – which, as a civil servant, I helped create.

How would this work? As the internet law expert Prof Lorna Woods and I have set out, the advantage of a statutory duty of care is that it does not seek to strangle companies by specifying exactly how they must manage risk. It simply requires them to reduce harm and demonstrate to the regulator how well they are doing so. Regulated companies that failed could, for example, be fined, forced to put warning notices on their products and, in the most extreme cases, banned or blocked by internet service providers.

There are two basic steps social networks would need to take to begin to demonstrate a duty of care towards children. First, they would need to establish how old the child is. Age-checking is managed reasonably well by pubs, clubs, banks, bookies and theme parks. It’s trickier online, but should not be beyond the skills of social media companies. In the ongoing debate about preventing access to online pornography by under-18s, one proposal is to ask users to buy a voucher from a newsagent – reliable enough age verification, but without creating a huge database of porn users’ passports. Some firms might try to dodge the problem by setting a higher age limit, but a regulator would see through that quickly and require them to prove they are properly checking.

The second step companies would have to take is to deliver tools for parents to control their children’s usage. Some services, remarkably, don’t have any – recklessly assuming that a 13-year-old is sophisticated enough to manage all that social media can throw at them. Under a duty of care to deliver such tools, we might see a common standard emerge: for instance, services for under-16s could have controls set to maximum by default, and parents could change them one at a time. We can already see hints of this as YouTube prepares to let parents hand-pick the videos their children can see.

After taking these steps, companies under a duty of care would need to demonstrate that they are reducing harm within their services. A regulator could ask them to divulge how their services control children’s behaviours and independently research any harm caused. The charity group 5Rights, which says “persuasive design” is driving children to compulsive use, recommends that all alerts, beeps and reminders are set to off by default, that apps offer time-outs, and that it is as easy to log out as it is to log in without being pestered. It wants companies to inform people when they use these design features to drive behaviour. Apple is about to offer time monitoring and limits in its phones even though it does not provide any social media services. If Apple can do it, others can.

Companies would also need to shield children from abusive conversations, whether with other children or with adults. The tools are there to tackle it effectively; recently Twitter has finally begun proactively blocking accounts that seem to be a nuisance, while Twitch, the online gaming network, will now consider banning you from its platform for actions taken elsewhere.

These are all tough problems, but social media companies are ingenious and capable of managing them. They simply don’t face sufficient incentives right now. They often say they want to use their services to nudge us into better behaviour. Let’s return the favour, and pass laws that help them take responsibility for themselves.
