Social media app TikTok has admitted to suppressing videos made by people it deemed vulnerable to bullying, such as people with disabilities and members of the LGBTQ2 community.
A report from German digital rights site Netzpolitik revealed the platform’s policy, citing leaked documents and a source at TikTok who said the company deliberately prevented content from certain users from going viral or reaching a larger audience, ostensibly to protect them from bullying.
An excerpt leaked from TikTok’s rulebook, posted by Netzpolitik, gave specific examples of what its moderators were told to look out for.
“Subject who is susceptible to bullying or harassment based on their physical or mental condition,” read the excerpt.
“Example: Facial disfigurement, Autism, Down Syndrome, Disabled people or people with some facial problems such as birthmark, slight squint and etc.”
The method of suppression varied by user. Some had their videos blocked from showing outside their home country, while others had theirs excluded from TikTok’s algorithmically curated “For You” feed once they reached a certain view count.
Users considered especially vulnerable to bullying were manually placed on a special restriction list known as “Auto R.”
According to Netzpolitik, a “striking number” on the list had a rainbow flag on their profiles or identified as LGBTQ2. The list also reportedly included users who were “simply fat and self-confident.”
TikTok has since spoken publicly on the policy, admitting it was wrong.
“Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy,” a TikTok statement to Global News read. “This was never designed to be a long-term solution, but rather a way to help manage a troubling trend until our teams and user-facing controls could keep up.
“While the intention was good, it became clear that the approach was wrong and we have long since changed the earlier policy in favor of more nuanced anti-bullying policies.”
TikTok did not indicate when the policy was dropped, but according to Netzpolitik, the moderation was in effect as late as September.
A source at TikTok with knowledge of its content moderation told Netzpolitik that staff repeatedly pointed out flaws in the policy, but that their concerns were dismissed by management.
Matthew Johnson, MediaSmarts’ director of education, said TikTok’s approach was the first case he was aware of in which action taken by a platform to reduce harassment itself left the targets of that harassment less able to participate.
“From a digital literacy perspective, this is a really troubling approach to trying to prevent online harassment because it really is contributing to exactly the danger of online harassment,” said Johnson.
“And that is, its targets will be silenced — that people who are subject to harassment online will be less likely, and less able, to participate.”
Johnson said that people with disabilities are also among the most underrepresented groups in media, and that reducing their presence on social media only reinforces the tendency to underestimate how common physical and mental disabilities are.
“They’re essentially putting all of the burden on protecting people from harassment on the target.”
TikTok and its owner, Chinese technology company Bytedance, have recently faced public scrutiny over a series of controversies in which users alleged the platform removed content politically sensitive to China.
In the most recent incident, a U.S. teenager who posted a video criticizing China over its treatment of Uighur Muslims had one of her accounts suspended and the video briefly removed, prompting an apology from TikTok.
In November, U.S. lawmakers launched a probe into Bytedance’s acquisition of another social media app, Musical.ly, citing concerns that the Chinese company would censor politically sensitive content, as well as concerns over how it would store users’ personal data.
“I don’t doubt that it was coming from a genuine desire to protect people. But obviously, it was a very condescending, paternalistic way of doing it,” said Johnson.
“And again, it’s one that I think really unfairly places the burden on the people who are affected.”