‘Insta’nt pain – & profit
Meta chooses extra revenue over teen health
INSTAGRAM has devastated youth mental health — and, according to newly unsealed court filings, its parent company Meta knew exactly what it was doing. A lawsuit brought by 42 state attorneys general alleges Meta was aware of Instagram users as young as 6, realized there was rampant self-harm content on the platform, and knew the deleterious effects its platform had on teen self-image.
In fact, the company has even put a price on the head of every teen user: $270 in “lifetime” profits as long as they stay on the platform, according to the lawsuit.
Teens a $4B market
The evidence cited in the lawsuit makes clear that my generation wasn’t just collateral damage in a tech innovation.
Rather, we were targeted and manipulated for profit — and sent spiraling into a mental health crisis by greedy Big Tech giants.
The lawsuit suggests that Meta was well aware its platform is addictive for teens, whom its own researchers described as “insatiable when it comes to ‘feel good’ dopamine effects.”
Instagram has a “hold on the serendipitous aspect of discovery,” an internal 2020 presentation explained, and “every time one of our teen users finds something unexpected their brains deliver them a dopamine hit.”
Even though the company claims users must be 13 to sign up for Instagram, Meta’s own survey data allegedly found that 22% of children aged 6 to 9 and 35% of those aged 10 to 12 had used the platform.
But the company has scrambled to hide the evidence.
Polling data from underage users was apparently obscured by a Meta researcher who said she was “concerned about risks of disclosure since they aren’t supposed to be on IG at all.”
The state AGs’ lawsuit alleges the company failed to crack down on underage users because it knew just how much money was on the line. Meta researchers estimated a $270 “lifetime value” of profits from a 13-year-old user.
If that number seems low, it’s because it is a global average — and American teens, the lawsuit says, are 10 times more valuable.
To put that in perspective, an estimated 1.4 million New York teens are currently active on the platform. At roughly $2,700 apiece, that translates to nearly $4 billion in profit from teens in this state alone.
According to the suit, Meta also appears to have known full well that young users are being pummeled with dangerous and harmful content.
An internal investigation into eating disorder content found that girls could easily be sent down rabbit holes.
It found Instagram suggests vulnerable girls follow accounts like @skinny._.binge and @_skinandbones_.
No wonder Meta’s own data also revealed that a third of teen girls said Instagram makes their body image worse, and 17% said it worsened eating issues.
Morals or money?
Meta is defending itself, of course. “We want teens to have safe, age-appropriate experiences online, and we have over 30 tools to support them and their parents,” a spokesperson told The Post.
“We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online. The complaint mischaracterizes our work using selective quotes and cherry-picked documents,” the Meta spokesperson said.
As a solution, the company said it supports federal legislation that would require app stores to get parents’ approval before teens under 16 download an app.
But in spite of the data and its own statements, Meta allegedly scrapped efforts to relieve Instagram’s deleterious effects on self-image for the sake of profit.
The lawsuit alleges that Meta considered hiding “like” counts to reduce negative self-comparison — an initiative dubbed “Project Daisy.”
Even though its researchers, according to the lawsuit, found indications that hiding likes could reduce Instagram’s harmful effects on young girls, the company decided against making hidden likes a default setting because doing so would “perpetuate the perception that ‘likes’ are bad for young people” — and would trigger an estimated 1% decline in advertising revenue.
Meta made this decision despite the fact that its own polling data found 8.4% of 13- to 15-year-old users had been shown self-harm-related content within the past week, and 13.5% of teen girls on Instagram said it worsened thoughts of self-harm and suicide.
Would Meta rather risk 1% revenue or risk perpetuating a mental health crisis?