Facebook: Moderation tool flawed during BBC investigation

The social network says its abuse reporting tool wasn't working properly when the BBC called Facebook out for not removing sexualized photos of children.


Facebook says there was a flaw with an abuse review tool during a BBC investigation.

Jaap Arriens, NurPhoto via Getty Images

Facebook has been under fire after a report from the BBC last week called out the social network for not taking down sexualized images of children on its site.

On Tuesday, Facebook said the moderation tool that should have flagged the images during the investigation wasn't working, according to a follow-up report by the BBC.

"We welcome when a journalist or a safety organisation contacts us and says we think there is something going wrong on your platform," Facebook UK Director Simon Milner told members of Parliament, according to the report. "We welcome that because we know that we do not always get it right."

Facebook didn't immediately respond to a request for comment.

The social network, with its 1.86 billion members, has been grappling with what to allow on its site. In addition to the controversy over sexualized images of children, it has also wrestled with censorship issues, the spread of fake news stories, an expanded focus on live video and questions about how to handle violence on live broadcasts.

Specifically, the BBC story pointed to 100 posts reported to the social network featuring sexualized images of, or comments about, children. Only 18 were removed at the time, though all have since been taken down.

Among the allegations in the BBC report: Facebook hosted groups created by pedophiles, an image that appeared to be a still frame from a child abuse video, and five accounts belonging to convicted pedophiles, whom Facebook explicitly bans.