Social media is hard enough on consenting adults. It’s no place for kids
• The star-studded milieu of Instagram seems finely tuned to destabilise the teenage mind
Social media is a minefield of adolescent anxieties, as any parent can attest. Numerous studies have suggested a connection between excessive use of online platforms (and the devices used to access them) and worrying trends in teenage mental health, including higher rates of depressive symptoms, reduced happiness and an increase in suicidal thoughts.
Even in this grim context, Instagram, the wildly popular photo-sharing app owned by Facebook, stands out. Its star-studded milieu (glossy, hedonistic, sexualised) seems finely tuned to destabilise the teenage mind. Studies have linked the service to eating disorders, reduced self-esteem and more.
So perhaps it isn’t surprising that an internal research effort at the company found that teens associate the service with a host of mental-health problems.
“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” said one slide. “Teens blame Instagram for increases in anxiety and depression,” said another. “This reaction was unprompted and consistent across all groups.”
If Facebook was concerned about these findings before they became public, it didn't show it. In July, Instagram rolled out policy changes intended to protect teens, such as limiting how advertisers can target them and setting their accounts to private by default.
“Instagram has been on a journey to really think thoughtfully about the experience that young people have,” a company rep said at the time.
Unfortunately, all that thoughtful thinking yielded an incoherent result. In the same post in which Facebook announced the changes, it conceded that it was moving ahead with a new version of Instagram intended for children under 13. Dubbed Instagram Youth, the concept was so distasteful that it earned the opprobrium of health experts and consumer advocates, legislators of both parties in the US, and nearly every state attorney-general in the country.
A letter from health experts could hardly have been blunter. “The platform’s relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing,” it said. “Younger children are even less equipped to deal with these challenges, as they are learning to navigate social interactions, friendships, and their inner sense of strengths and challenges during this crucial window of development.”
Facebook justifies this plan on the theory that, since it has largely failed to keep children off of adult Instagram, the kids’ version will “reduce the incentive for people under the age of 13 to lie about their age”.
One might ascribe all this to Facebook’s standard-issue tactlessness. Yet its treatment of young people has been especially irresponsible. For years, it refused to make changes that would prevent children from running up credit-card bills on its platform.
In 2016, it started paying young people — including minors — $20 a month to use an app that gave the company total access to their web and phone activity. Its Messenger Kids app is targeted at users as young as six, even though experts have warned that it’s likely to “undermine children’s healthy development”. That these schemes keep going horribly awry doesn’t seem to be much of a deterrent.
One wonders what would be. As a start, legislators should put pressure on Facebook to scrap Instagram Youth and make an earnest effort to protect teenagers across its services. Congress should extend existing online protections for children to all users up to age 15 and create a legal expectation that platforms do more to prevent minors from lying about their ages.
Down the road, more stringent regulations — perhaps modelled on the UK’s age-appropriate design code — may be needed if platform companies refuse to take this problem more seriously.