A group of 33 states, including California and New York, is suing Meta Platforms Inc. for harming young people’s mental health and contributing to the youth mental health crisis by knowingly designing features on Instagram and Facebook that addict children to its platforms.
The lawsuit, filed in federal court in California, also claims that Meta routinely collects data on children under 13 without their parents’ consent, in violation of federal law.
“Kids and teenagers are suffering from record levels of poor mental health and social media companies like Meta are to blame,” said New York Attorney General Letitia James. “Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.”
The broad-ranging suit is the result of an investigation led by a bipartisan coalition of attorneys general from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee, and Vermont. It follows damning newspaper reports, first by The Wall Street Journal in the fall of 2021, based on Meta’s own research, which found that the company knew about the harms Instagram can cause teenagers, especially teen girls, when it comes to mental health and body image issues. One internal study cited 13.5% of teen girls saying Instagram makes thoughts of suicide worse and 17% of teen girls saying it makes eating disorders worse.
Following the first reports, a consortium of news organizations, including The Associated Press, published its own findings based on leaked documents from whistleblower Frances Haugen, who has testified before Congress and a British parliamentary committee about what she found.
The use of social media among teens is nearly universal in the U.S. and many other parts of the world. Up to 95% of youth ages 13 to 17 in the U.S. report using a social media platform, with more than a third saying they use social media “almost constantly,” according to the Pew Research Center.
To comply with federal regulation, social media companies ban kids under 13 from signing up for their platforms, but children have been shown to easily get around the bans, both with and without their parents’ consent, and many younger kids have social media accounts.
Other measures social platforms have taken to address concerns about children’s mental health are also easily circumvented. For instance, TikTok recently introduced a default 60-minute time limit for users under 18. But once the limit is reached, minors can simply enter a passcode to keep watching.
In May, U.S. Surgeon General Dr. Vivek Murthy called on tech companies, parents and caregivers to take “immediate action to protect kids now” from the harms of social media.