Facebook drags feet removing pro-Islamic State images

Content praising terrorist incidents and encouraging attacks lingered on the social network even after it was reported, a Times investigation revealed.

Facebook's moderation system is under fire once again.

The Times conducted an investigation in which a reporter using a fake account flagged extremist images to the social network, but Facebook failed to remove them, Reuters reported Thursday.

The Times reported the presence of images and videos that glorified the Islamic State, praised the recent terrorist attacks in London and Egypt, and depicted graphic child abuse. Facebook removed some of the images but left up pro-jihadist posts calling for further attacks.

Facebook's process for removing posts that contain illegal or inappropriate content has been under scrutiny for years, and historically the social network has responded positively to criticism, apologizing for errors and attempting to put new safety features in place. But holes remain, and frequently it is up to journalists to uncover them.

Last month a BBC report showed that Facebook had failed to take down sexualized child images when they were reported. On top of that, when the BBC sent the content over to Facebook at the social network's request, the company reported the BBC to the police.

On this occasion, Facebook apologized for its failures and said it had now removed the content, which would be considered illegal under UK law.

"We are grateful to The Times for bringing this content to our attention," said Justin Osofsky, Facebook's vice president for global operations. "We have removed all of these images, which violate our policies and have no place on Facebook."

The company is on a continual mission to improve its reporting process, but this is not just a goodwill gesture. If Facebook and other hosting platforms fail to remove illegal images, they could be subject to criminal prosecution in the UK for breaching terrorism laws. As well as reporting images directly to Facebook, internet users in the UK can report them to the Metropolitan Police's Counter Terrorism Internet Referral Unit.

Facebook has made great strides in particular in removing child abuse imagery from its platform, according to Susie Hargreaves, CEO of the Internet Watch Foundation, who described the social network as "one of our most actively engaged members." Earlier this month it started to apply the photo-matching tech it uses for identifying these images to fighting the problem of revenge porn. But some content is still slipping through the net.

"It is clear that we can do better, and we'll continue to work hard to live up to the high standards people rightly expect of Facebook," said Osofsky.
