The National - News

CAUGHT IN THE WORLD WIDE WEB

▶ With growing concern over the safety of children online, companies such as Facebook are making tweaks, but is it enough? Rhodri Marsden investigates


While sociologists busy themselves pondering the effects of technology on humankind, parents across the globe worry about it on a daily basis. Grown adults can – at least in theory – look after themselves when they’re gadding about online, but who’s protecting children from the supposed evils of technology? How can we be sure of the motivations of the companies that make online apps and services for children? Has their privacy been properly considered? Have these services been designed for their benefit, or are they being drawn into a long-term marketing ruse? Does it put them in danger? Might it be skewing their social landscape in a way that makes them unhappy?

All these questions were raised last week when Facebook launched Messenger Kids, an app aimed at children aged between 6 and 12. Surveys indicate that millions of children under 13 are already using online services aimed at adults (including Facebook), which tend not to offer adequate protection against, say, receiving messages from strangers. Messenger Kids was built in response, in consultation with the National Parent Teacher Association in the United States, and Facebook has made efforts to address concerns. Only parents can approve and control the addition of contacts to their kids’ app through their Facebook account; automated filters prevent the sharing of sexual or violent content; special support teams promise to respond rapidly to any flagged content; and the GIF engine, Giphy, has been incorporated into the service with all the questionable images removed.

Facebook had no choice but to jump through these hoops. The US federal law known as Coppa (the Children’s Online Privacy Protection Act) contains strict rules regarding parental consent, and a number of companies have already been fined by the Federal Trade Commission after failing to meet its standards. Indeed, so stringent are the rules that it’s easier for new services to insist that under-13s don’t use them, but Facebook has met the challenge head on. The company line is that Messenger Kids brings safe practice to a behaviour that already exists; Facebook’s David Marcus stressed to The New York Times that the launch was part of a strategy to “solve real problems in people’s lives.” But detractors claim that it’s simply a way of instilling loyalty in children towards Facebook and, by design, forcing parents to use it too. One quirk of the system is that in order to approve your child’s friend as a contact, you have to be “friends” with that child’s parent too. This leads to further concerns surrounding Facebook’s accumulation of data to build up a picture of who your children are, what they like and, crucially, what they might want to buy.

With peer pressure such a dominant force in children’s social groups, it’s not surprising that each new, cool service is viewed by many parents as a potential threat. Video messaging apps such as Marco Polo, anonymous confessional apps such as After School, apps like Live.ly where messages self-destruct and leave no trace – they’re all hugely popular with young people, and by their very nature, user behaviour is hard to monitor. When bullying can result from something as simple as the absence of a child from a class WhatsApp group, it’s little wonder that this multilayered social media complexity invites a certain level of panic.

This complexity becomes deeper every week as the competition for kids’ attention increases, and while much of it could be considered benign, some of it certainly isn’t. Back in September, more than 450 British gambling sites were ordered to remove games targeted at under-18s. Those sites believed that games where no real money is wagered could be played by children without breaking the law, but the United Kingdom’s Gambling Commission, mindful of the way gambling behaviour could be instilled in kids, disagreed.

New concerns crop up constantly. In Norway, the Consumer Council recently commissioned an audit of the security of kids’ smart watches, and discovered a number of problems inherent in their design: after all, if parents can use smart watches to monitor their kids, lax security can allow third parties to do so too. This mirrors the recent furore over connected toys such as CloudPets; the consumer group Which? revealed how these toys could be hacked over Bluetooth to allow a range of operations by third parties, from simple movements to sending messages that the toy could speak to the child.

The act of pointing out these potential problems can invite excessive hysteria; after all, the likelihood of someone standing outside your home and attempting to communicate with your child through their toy is incredibly small. Earlier this year, a group of scientists wrote an open letter to The Guardian urging caution over the “moral panic” prompted by new technological developments and their effects on children. But there’s little doubt that more safeguards can be put in place, particularly by the big companies who act as a gateway to our entertainment sources and social interactions.

YouTube recently recognised this when it closed 50 channels and removed content that “attempts to pass as family-friendly, but is clearly not”. Samsung has introduced Marshmallow, a smartphone management system that incorporates a reward system to encourage kids to use their phones responsibly. But there’s also a drive from smaller companies who recognise parents’ need for reassurance. Recent apps to receive publicity for their kid-friendly stance include Jellies (a video entertainment app), ReplyASAP (a messaging app for parents and kids) and Kudos (an app to help kids learn how to use social media).

For their part, children’s charities have stressed the importance of parents talking to their children about the way they’re using technology. Amanda Azeez, from the UK’s National Society for the Prevention of Cruelty to Children (NSPCC), told The Guardian this summer that they “really want to help parents and carers to feel more confident and to talk to their children at least every two weeks, if not more regularly.” Perhaps, through those conversations, parents might gain some insight into their own online behaviour, because we’re all capable of making mistakes. Indeed, addressing the needs of children might allow all of us to become more aware of our vulnerabilities when communicating online.


[Photo: Getty Images] Various children’s charities have stressed the importance of parents regularly talking to their children about the way they’re using technology
