Moderators were told not to promote videos from users with "ugly facial looks." TikTok said these policies are no longer being used.
TikTok told content moderators to suppress videos from users who were too ugly or poor and asked them to censor certain political speech in livestreams.
Internal documents, published by The Intercept over the weekend, show that the social network instructed moderators not to promote videos to the app's "For You" page from users with "ugly facial looks," such as scars, too many wrinkles or fangs. Getting a video highlighted on that page could help a user attract more views and followers. Users with an "obvious beer belly," or those who shot their videos in a "shabby and dilapidated" environment such as a slum, might also have been excluded from the page.
The revelations come as TikTok, owned by Chinese tech company ByteDance, faces national security concerns and scrutiny from US lawmakers. It's yet another glimpse into what the social network allows and bars on the popular short-form video app. TikTok, known for its quirky 15-second videos, is the latest social network to come under fire for what content it features prominently.
Another document showed that moderators were asked to censor certain political speech in live videos, according to The Intercept. That included "controversial" content such as broadcasts about "state organs," including the police and military.
This isn't the first time that leaked content rules have raised concerns about TikTok. Last year, The Guardian reported that the company told moderators to censor content, such as mentions of Tiananmen Square and Tibetan independence, that would hinder the aims of Chinese foreign policy. German site Netzpolitik reported in December that leaked documents showed TikTok was hiding content from people with disabilities as part of an effort to combat bullying.
A TikTok spokesperson said in a statement that the rules published by The Intercept are the same as or similar to guidelines previously published by The Guardian and Netzpolitik. The Intercept, though, reported that the guidelines were in use until at least late 2019.
"As we told The Guardian and Netzpolitik last year when they originally reported this, the guidelines The Intercept published are no longer in use and were already out of use when The Intercept accessed them," a TikTok spokesperson said in a statement.
On Sunday, The Wall Street Journal reported that TikTok would no longer use moderators in China to monitor overseas content.
A TikTok spokesperson said in a statement that the company expects to transfer this work from its Trust and Safety team to "local teams in the markets they cover within a few weeks."
"We are working to find job options within the company for the China-based employees," the spokesperson said. "These teams had been primarily helping with overnight coverage for some non-US regions."