Twitch said Tuesday that it's implementing a series of new safeguards to address child grooming on its streaming platform. In a blog post, the company announced the rollout of "mandatory phone verification requirements" designed to stop users under 13 from opening accounts. The measures aim to keep minors from sharing risky content and to clamp down on predators seeking to harm and exploit children.
In addition to ramping up efforts to block younger users from joining the platform or livestreaming, Twitch is upgrading its staff's tools to prioritize reported incidents involving children under 13. To curb predatory behavior, the service has disabled certain search terms and updated the privacy settings on its Whispers messaging feature. Its acquisition of Spirit AI, a language-processing tool, will further help it find and track harmful written messages. Twitch has also expanded its collaborations with law enforcement partners and organizations that work to identify and prevent online child endangerment.
Twitch noted that it cannot share its full strategy, lest bad actors learn to dodge its safeguards. But the company pointed to its September post addressing child grooming threats and its ongoing work to stamp out the practice on the platform. The statement followed a Bloomberg report that identified nearly 2,000 alleged predator accounts following children; in some cases, users urged kids to perform vulgar acts in livestreams.