TikTok reportedly restricts the reach of users with disabilities

Videos are tagged in special categories to prevent bullying, a report says.

Corinne Reichert Senior Editor

TikTok is reportedly limiting reach.

James Martin/CNET

TikTok hid videos posted by people with disabilities, as well as those made by LGBTQ and overweight users, German site Netzpolitik reported Monday, citing leaked documents and an internal source. No matter the content of the videos, they were marked as being uploaded by special users who could be at risk of being bullied, the report said.

The moderation rule is reportedly called "Imagery depicting a subject highly vulnerable to cyberbullying." 

Some users are even sorted into a "not recommend" category by moderators, including people with facial disfigurements, autism and Down syndrome, according to documents published by Netzpolitik. Videos categorized this way are reportedly visible to only a few thousand people, rather than to a global audience.

TikTok said the policy was temporary and has long since been removed. It was originally put in place while the company built out its moderation team and protective measures to counter cyberbullying of certain users.

"While the intention was good, the approach was wrong and we have long since changed the earlier policy in favor of more nuanced anti-bullying policies and in-app protections," a TikTok spokesperson told CNET.

The report follows TikTok's apology last week for the "interest and confusion" surrounding a viral video critical of China. A young TikTok user had posted a makeup video while raising awareness about the Uighur Muslim community being detained in China. TikTok said Wednesday it wanted to "clarify" and apologize for the human error in removing the video.

TikTok, a social media platform where users post short videos, has been downloaded more than 1.5 billion times. The Chinese app is reportedly under investigation by the US over national security concerns.

Originally published Dec. 2, 4:34 p.m. PT.
Update, 5:35 p.m.: Adds comment from TikTok.
