TikTok is vowing to step up its resources for users who get caught up in frightening hoaxes on the app, after a new report found fewer than a third of teens recognize these hoaxes as “clearly fake.”
The report, which TikTok commissioned and supported, also found that while many teens are distressed by the scary hoaxes they see on the app, fewer than half seek out help afterward.
The hoaxes vary, but a common one involves seemingly baseless warnings about a wide-eyed, dark-haired woman known as “Momo” who threatens users who don’t complete the violent tasks she demands of them. Another is based on a rumour about a 50-step challenge that starts innocuously but escalates to a final task: challenging users to commit suicide.
In response, TikTok says it plans to ramp up its monitoring efforts, Safety Centre resources, and its warning prompts for users.
What TikTok found
The report found that of the teens who were exposed to hoax challenges, 31 per cent believed the hoax had a negative impact on them. Of those who reported a negative impact, 63 per cent said the hoax affected their mental health.
Still, just 46 per cent of teens have sought support and advice afterward, according to the new report.
The findings were released on Wednesday in a report titled “Exploring effective prevention education responses to dangerous online challenges.” For the report, TikTok hired brand consultancy firm The Value Engineers (TVE) to conduct an online survey of 10,000 teens (aged 13 to 19), parents and teachers from around the world about their experiences with online challenges and hoaxes.
No margin of error was provided for the survey results.
TikTok then hired Praesidio Safeguarding to compile key findings and issue recommendations in the form of this new report.
“The fact that less than half of teens are thinking about support and advice is perhaps something we need to address,” said Zoe Hilton, the director and founder of Praesidio Safeguarding.
“Hoax challenges” are defined by the report as a “specific subcategory of dangerous challenges where the element of challenge is fake, but they are designed to be frightening and traumatic and thus have a negative impact on mental health.”
Take the “Momo” challenge as an example. In this “hoax challenge,” the rumour is that a frightening woman with bulging eyes will pop up on users’ screens while they watch something harmless, like a cartoon.
The woman, in reality an image of a statue from Japan rather than a real person, is rumoured to tell users that something bad will happen if they don’t complete a challenge. The request is said to potentially involve self-harm or even suicide, according to the stories.
There’s no evidence that any teens have actually participated in this challenge, according to multiple reports.
Instead, it’s the hoax itself — the rumour that the challenge could pop up on your screen at any time — that can cause anxiety for teens, the report found.
“Everyone growing up has that sort of thing like Slenderman or any of those other ideas,” said Carmen Celestini, a professor at the University of Waterloo and a fellow working with The Disinformation Project at Simon Fraser University.
“But now, just because the medium has changed, it can become much more frightening.”
Safety changes
TikTok says it plans to go a step beyond removing the hoax videos themselves and will begin to remove “alarmist warnings” about the hoaxes that spread misinformation by “treating the self-harm hoax as real.”
“We will continue to allow conversations to take place that seek to dispel panic and promote accurate information,” said Alexandra Evans, TikTok’s head of safety public policy in Europe, in an emailed statement.
Evans added that TikTok has built technology that alerts its safety teams when there is a sudden increase in rule-breaking content, whether hoaxes or dangerous challenges, linked to a specific hashtag.
“For example, a hashtag such as #FoodChallenge is commonly used to share food recipes and cooking inspiration, so if we were to notice a spike in content tied to that hashtag that violated our policies, our team would be alerted to look for the causes of this and be better equipped to take steps to guard against potentially harmful trends or behaviour,” she said.
The report also found that teens, parents and educators need better information about these challenges and hoaxes. To that end, Evans said TikTok has developed a new resource in its “Safety Centre” that is “dedicated to challenges and hoaxes.”
“This includes advice for caregivers that we hope can address the uncertainty they expressed about discussing this topic with their teens,” Evans said.
TikTok already has warning labels that pop up when users search for something harmful. Now, those attempting to search for a harmful challenge or hoax will see “a new prompt” that will “encourage community members to visit our Safety Centre to learn more.”
“Should people search for hoaxes linked to suicide or self-harm, we will now display additional resources in search,” Evans said.
Experts worry it's not enough
While Celestini said she does sense “goodwill” from TikTok, she warned that the platform still needs to step up its efforts to stop the spread of hoaxes and conspiracy theories on the app.
“I think that they’re doing an OK job, but they really have to pay attention. There’s a lot of things that seep in, and the way the algorithms work, if you click one or two TikTok videos that you didn’t expect … the things that start coming up next can really change your trajectory and your adventure on TikTok,” Celestini said.
“They have to look at what is actually on their website … there has to be some responsibility for that.”
Practices by social media platforms that target and impact children and teens have been in the spotlight in recent weeks.
In October, a U.S. Senate panel took testimony from a former Facebook data scientist who laid out internal company research showing that Instagram appeared to seriously harm some teens. The subcommittee then widened its focus to examine other tech platforms that also compete for young people’s attention and loyalty, including TikTok.
TikTok, YouTube and Snapchat vowed at the hearings to ensure young users’ safety, but came under criticism from the panel for offering only “tweaks and minor changes” and not going far enough to mitigate potential harm to kids.
According to Celestini, “the onus is really on parents as well.”
Parents should have an “open conversation” with their children about social media, she said, whether about the frightening videos their teens might watch or their responsibility when it comes to the spread of disinformation and hoaxes.
“We just share, share, share and we don’t think about it. And that’s really how things get spread,” Celestini said.
As for teens who might find themselves feeling frightened by hoaxes on the app, Celestini said they should “get off TikTok” and head over “to Google” to help dispel what might be confusing or worrying them.
“You can find a lot of information there,” Celestini said.
TikTok’s algorithm tends to serve up more of whatever content users engage with, Celestini added. She recommended teens break the cycle of hoax videos by actively searching for something less frightening, and by “taking that time to feel what you feel.”
And, she added, “if you’re afraid, walk away from TikTok for a couple of days.”
—With files from the Associated Press