Facebook apps used in more than 5,000 child grooming crimes, says UK charity

Since 2017, police in the UK recorded 24 online grooming crimes every week where offenders used Facebook-owned sites.

Facebook-owned platforms were used in 53% of child grooming crimes.

Keiko Iwabuchi/Getty

A UK charity is urging Facebook to disclose its internal research on child abuse incidents, after it discovered that police in the UK recorded 5,120 child grooming crimes on Facebook-owned apps since 2017. According to the research, which was published by children's charity the NSPCC on Monday, 53% of all online grooming crimes took place on Facebook platforms, including Instagram and WhatsApp, amounting to 24 incidents per week.

The NSPCC obtained the figures through freedom of information requests to police forces across England and Wales regarding Sexual Communication with a Child offenses, which have been defined by law in the UK since 2017. Due to the number of grooming crimes that go unreported, the charity said in a press release it believes the figures are "just the tip of the iceberg."

In a statement, a spokeswoman for Facebook described child grooming as "abhorrent." "We work quickly to find it, remove it and report it to the relevant authorities," she said. "We also block adults from messaging under 18s they're not connected with and have introduced technology that makes it harder for potentially suspicious accounts to find young people."

Facebook is under arguably more scrutiny now than at any point in its history. On the same day the NSPCC published its research, Facebook whistleblower Frances Haugen testified before a UK parliamentary committee analyzing the country's incoming Online Safety Bill.

The Online Safety Bill, previously known as the Online Harms Bill, is a key piece of legislation that would place UK media watchdog Ofcom in charge of regulating social media platforms in the name of keeping users safe. Ofcom would have the power to fine tech companies £18 million ($25.3 million) or 10% of their annual revenue, whichever is higher, if they fail to remove harmful or illegal content, as well as to block sites and services. Senior managers at tech companies could even face criminal charges if those companies consistently fall short of their obligations.

The publication of the research also coincided with multiple reports on a cache of leaked internal documents from the company known as the Facebook Papers.

Facebook's ubiquity and the popularity of its various platforms mean it can't avoid being drawn into discussions of the misuse of digital platforms. Last week, the NSPCC and almost 60 other global child protection organizations wrote to Facebook CEO Mark Zuckerberg asking him to publish internal research about how abusers may be using the company's platforms to harm children.

"Instead of scribbling defensive blogs and setting their PR machine on journalists, [Facebook executive] Nick Clegg and Mark Zuckerberg must now publish all their research into how their platforms contribute to harm and sexual abuse and step up their efforts to fix their sites so they are safe for children," said NSPCC Chief Executive Peter Wanless in a statement.

The Facebook spokeswoman said the company is committed to keeping people safe and has spent $13 billion in recent years on building safety tools. "We've shared more information with researchers and academics than any other platform and we will find ways to allow external researchers more access to our data in a way that respects people's privacy," she said.