Tinder and Grindr are facing stark questions about their efforts to keep children safe, following a report detailing multiple incidents of child rape linked to the apps.
Jeremy Wright, the UK's secretary of state for Digital, Culture, Media and Sport, will ask the dating app companies how they verify users' ages.
"This is truly shocking and yet more evidence that online tech firms must do more to protect children. I will be writing to these companies asking what measures they have in place to keep children safe from harm, including verifying their age," he said Monday in an emailed statement. "If I'm not satisfied with their response, I reserve the right to take further action."
Wright's scrutiny follows a Sunday Times report that revealed UK authorities have investigated more than 30 incidents of child rape since 2015 after victims evaded age checks on dating apps.
There were another 60 cases of child sexual offenses via online dating services, according to the UK-based Sunday Times, which cited data released under freedom of information laws. These included grooming, kidnapping and violent sexual assaults of victims as young as 8. "Grooming" is the term for building an emotional tie with children in order to create trust and then sexually exploit them.
Tinder said its app is only for people 18 and over, and that it uses "industry-leading automated and manual moderation and review tools, systems and processes" to keep minors off its platform.
"The bottom line is this: We are consistently evaluating and refining our processes to prevent underage access, and will always work with law enforcement, where possible, to protect our users as well," a spokesperson said in an emailed statement. "We don't want minors on Tinder. Period."
Grindr said it was "saddened" by the revelations in the report.
"Grindr is committed to creating a safe and secure environment to help our community connect and thrive, and any account of sexual abuse or other illegal behavior is troubling to us as well as a clear violation of our terms of service," a company spokesperson said Monday in an emailed statement.
"We encourage users to report improper or illegal behavior either within the app or directly via email to email@example.com," the spokesperson continued. "In addition, our team is constantly working to improve our digital and human screening tools to prevent and remove improper underage use of our app."
Last week, Instagram took its own child safety measures by pledging to ban images of suicide and self-harm after the family of UK teenager Molly Russell blamed the social network for her death.
First published Feb. 11 at 6:52 a.m. PT.
Updated Feb. 12 at 2:12 a.m. PT: Adds Tinder statement.