TikTok Privacy Features Reportedly Exploited for Child Sexual Abuse Materials
A report in the Financial Times says the US Justice Department and Department of Homeland Security are investigating TikTok.
TikTok, the short-form video app and social media platform, may be under scrutiny by two US government agencies over its handling of child sexual abuse materials, according to a report Friday by the Financial Times. Both the Department of Homeland Security and the Department of Justice are investigating TikTok, the report said.
An anonymous source told the FT that the US Department of Justice is reviewing how a specific TikTok privacy feature is being exploited by predators. "It is a perfect place for predators to meet, groom and engage children," Erin Burke, unit chief of the child exploitation investigations unit at Homeland Security's cybercrime division, told the publication.
When asked about the report, TikTok told CNET over email, "We're not aware of any of the government investigations as alleged by the Financial Times, but TikTok has a zero tolerance policy on CSAM. Upon reading this story, we reached out to HSI to begin a dialogue and discuss opportunities to work together on our shared mission of ending child sexual exploitation online -- just as we regularly engage with law enforcement agencies across the country on this crucial topic."
The Department of Homeland Security did not immediately respond to CNET's requests for comment. The Department of Justice declined to comment.
This week, TikTok was named the world's most downloaded app of 2022 so far. The FT notes that TikTok has struggled to keep up with the volume of content generated by its users. ByteDance, the Chinese company that owns the app, has 10,000 moderators worldwide and has been hiring additional staff to handle content, according to the report.
In 2019, the UK investigated TikTok over how it handled children's data and whether it ensured kids' safety. That same year, TikTok paid $5.7 million to settle US Federal Trade Commission charges that it collected children's personal information.
Other social media platforms, including Facebook, Snapchat and Twitter, are still searching for effective ways to combat child sexual abuse material, including photos and videos posted to their platforms. In 2021, for instance, Facebook began testing new tools to help prevent this content from showing up.
The Kids Online Safety Act, introduced in the US Senate in February, aims to require social media platforms to provide more safeguards for children. Those safeguards include limiting minors' use of the platforms and restricting the platforms' use of minors' personal data.