Meta, formerly known as Facebook, announced new tools on Tuesday that it says will help teenage Instagram users limit their time on the platform while allowing parents and guardians to oversee their children’s experience.
The tools, which include a previously announced “Take a Break” notification feature that officially launched in Canada and other countries Tuesday, come in the wake of damning internal research released by a whistleblower earlier this year showing negative mental health impacts on teenage users, particularly girls.
“I’m proud that our platform is a place where teens can spend time with the people they care about, explore their interests and explore who they are,” said Adam Mosseri, head of Instagram, in a statement.
“I want to make sure that it stays that way, which means above all keeping them safe on Instagram.”
The Take a Break feature will send notifications after a certain amount of time reminding users to set the app aside, while also showing “expert-backed tips to help them reflect and reset,” the company says.
Users in Canada, the United States, the United Kingdom, Ireland and Australia will begin receiving suggestions to turn the feature on as part of an initial launch Tuesday. Instagram says the rollout will expand worldwide by early 2022.
Instagram also plans to launch a new feature for parents and guardians in March of next year that will allow them to not only view how much time their children are spending on the platform, but also set time limits. A new “educational hub” with resources to help adults discuss social media use with their kids is also in development, with no launch date announced yet.
Other features coming to Instagram early next year include allowing teens to inform their parents if they have reported a user for inappropriate behaviour, and switching off all users’ ability to tag or mention teens who don’t follow them.
Mosseri says Instagram is exploring further changes, including expanding its Sensitive Content Control feature that allows users to further limit their exposure to sensitive content beyond the main Explore feed. The company is also looking at ways to “nudge” users to different topics if they’ve been focused on one for too long, as well as further strengthening its age verification systems.
“We’ll continue doing research, consulting with experts and testing new concepts to better serve teens,” Mosseri said, calling the announcement a “snapshot” of its work.
In September, the Wall Street Journal published internal research leaked by former Facebook employee Frances Haugen showing that teenage girls who used Instagram suffered from body image issues and blamed the platform for their increased depression and anxiety.
Haugen later testified to U.S. lawmakers that the company ignored that research in the interest of profits. Facebook — which rebranded as Meta in October — has disputed both Haugen’s claims and the internal research report.
Plans for a youth-oriented version of the platform called Instagram Kids were also put on hold after outcry from the public, youth advocates and governments worldwide, although Instagram has said it plans to go ahead with the proposed new app after further consultation.
Experts who study youth mental health say the changes announced Tuesday are long overdue after years of public pressure on Instagram, Facebook and other social media platforms to tighten their controls for teen users — though their optimism is cautious.
“My reaction is that it’s good to have anything that’s better than nothing,” said Ashley Miller, a youth psychiatrist at BC Children’s Hospital in Vancouver.
“The worst thing you could say, though, is that it can give a false sense of security that everything’s OK now. I think we have to see the impact these changes have, and I would encourage more movement towards what (Instagram is) trying to do here.”
But other experts who study social media are more cynical about Instagram’s motivations.
“It’s clear these updates are in reaction to the Facebook Papers (released by Haugen) … and it reeks to me as a kind of reputation laundering,” said Eric Meyers, an associate professor in the School of Information at the University of British Columbia.
Meyers says his key concern is that the changes put the onus for changing online behaviour on the teenage users themselves, as well as their parents, rather than Instagram retooling anything about the way its design can sometimes lead to negative impacts and behaviours.
“They said they’re just going to shift your attention around, but you’re not going to make it any less engaging for you,” he said.
“It’s kind of like, you’re walking through the casino and someone who works for the casino comes up and says, ‘Hey, you’ve spent a long time at the blackjack table. Have you tried roulette?'”
Natasha Parent, a doctoral student at UBC who studies the intersection of human development and technology, has worked with young social media users. She says while she welcomes the additional tools geared toward teens, she’s concerned the parental controls will backfire on Instagram.
“(Social media) is a way to get away from your parents and connect with other kids,” Parent said. “So as soon as you’re giving parents and guardians too much access into that adolescent social world, then teens are going to want to leave the app … and go somewhere else.”
Parent has emphasized that social media can act as an overall positive force in young people’s lives despite the sometimes negative impacts, allowing them to create communities and retain social connections, particularly amid the COVID-19 pandemic. She also stresses that terms like “social media addiction” are not formally recognized by psychiatrists and psychologists.
Other social media apps have recently introduced new policies designed to protect teen users’ privacy and cut down on overexposure.
TikTok this year reinforced certain limits on video downloads and direct messaging for underage users, while push notifications are also paused during night hours for those users.
Snapchat’s CEO Evan Spiegel said in October the company is exploring an in-house parental control system called the “Family Center.” The system will give parents “better insights to help protect their kids, in ways that don’t compromise their privacy or data security,” according to a company statement.