Instagram’s algorithms are pushing pro-eating disorder content to millions of users, many of whom are minors as young as nine and 10 years old, according to a new report.
The report, released Thursday by the children’s advocacy group Fairplay, found that up to 20 million users are being fed content by just 90,000 accounts that promote restrictive diets and extreme weight loss. About one-third of those accounts are run by underage users.
“This is a world that Instagram creates by recommending who to follow and recommending people follow each other,” said Rhys Farthing, the report’s lead author who works as the data policy director at online safety advocate Reset Australia.
“What struck me the most is that this algorithm was created by humans. It didn’t arise fully formed like Aphrodite. And it would be so simple to just change the algorithm and pop this bubble … but we’re not seeing that happen, and it’s incredibly harmful to young people.”
According to the report, Instagram’s parent company Meta derives an estimated $2 million in annual revenue from pro-eating disorder accounts, and nearly $228 million from those accounts’ followers, whose average age is 19 years old.
Farthing and her research team identified 153 public “seed accounts” that each have over 1,000 followers and explicitly advocate for restrictive diets and extreme weight loss regimens.
The researchers calculated that approximately 1.6 million Instagram users followed at least one of those seed accounts, including 88,600 who followed three or more. Across the broader pro-eating disorder network, up to 20 million users followed at least one account, many of whom were prompted to follow due to a mutual connection.
Test accounts set up by Farthing’s team received several recommendations to follow pro-eating disorder accounts simply by expressing an interest in such content. Those accounts also saw their follower counts skyrocket despite being inactive for weeks.
“It was so incredibly easy to identify this community,” Farthing told Global News. “We used a super simple technique, and it wasn’t hard at all to figure out this bubble.
“It also wasn’t hard for us to find how many of these people were under 18 and, more terrifyingly, how many said they were under 13.”
The report includes a first-hand account from a 17-year-old user named Kelsey, who says she used social media to fuel her obsession with weight loss. She says she only managed to clear pro-eating disorder content from her feed after recovering from an eating disorder herself and actively telling Instagram to stop promoting those accounts.
“I felt like my feed was always pushed towards this sort of content from the moment I opened my account,” Kelsey says.
Years of concerns about the impact Instagram has on young women and girls came to a head last year, when Facebook whistleblower Frances Haugen leaked internal research showing that Instagram makes teen girls feel worse about their bodies.
Global News subsequently spoke to women who shared their own struggles with self-image and mental health, which they said was made worse by the pressures of Instagram and other social media apps.
Research on the issue goes back even further. A 2019 peer-reviewed study by York University researchers published in the Body Image journal showed that young adult women who actively engaged with the social media of attractive peers experienced worsened body image.
Meta spokesperson Alex Kucharski said in a statement that “reports like this often misunderstand that completely removing content related to people’s journeys with or recovery from eating disorders can exacerbate difficult moments and cut people off from community.”
“Experts and safety organizations have told us it’s important to strike a balance and allow people to share their personal stories while removing any content that encourages or promotes eating disorders.”
In the wake of Haugen coming forward, Instagram has promoted new tools it says help users, particularly minors, manage their use of the app. The tools include a “Take a Break” notification that reminds users to disengage after a set length of time.
But Farthing says her research proves putting the onus on users to protect themselves from harmful content is the wrong approach.
“What we need is a complete pivot,” she said. “You still have these companies that are optimizing their social media platforms for maximum engagement.
“These attempts to give users back control completely miss the fact that they have built these platforms in ways that are absolutely harmful and risky.”
Farthing says she’s optimistic about legislation being introduced in several countries, including the U.K.’s Age Appropriate Design Code and the Kids Online Safety Act in the U.S., that seeks to ensure social media companies consider children’s “best interests” when designing their services, including algorithms.
California is considering its own version of such a law, modelled after the U.K. legislation.
While she believes “the tide has turned” with more decision-makers taking research like hers seriously, Farthing says she’s concerned about the current generation that has already been harmed.
“All of these young people have had this massive experience of the digital world,” she said. “It’s a huge part of growing up now. But this huge part of growing up has happened in a space that was not designed for them.”
— with files from Saba Aziz and Leslie Young