
Facebook plays defense over concerns about content moderators' mental health

The social network stands by its hiring of contractors to review content, though it says there's room for improvement.


Facebook CEO and co-founder Mark Zuckerberg.

James Martin

Facebook faces mounting concerns about the mental health of contractors who sift through the site for hate speech, violence and porn, even as the social network beefs up its effort to police offensive content.

On Monday, Facebook defended outsourcing the work to Accenture, Cognizant and other companies. But the world's biggest social network acknowledged it had room to improve conditions for contractors, some of whom reportedly suffer symptoms resembling post-traumatic stress disorder.

"A lot of the recent questions are focused on ensuring the people working in these roles are treated fairly and with respect," said Justin Osofsky, Facebook's VP of Global Operations, in a post made public after it was first shared with employees over the weekend. "We want to continue to hear from our content reviewers, our partners and even the media -- who hold us accountable and give us the opportunity to improve."

Facebook's remarks come after The Verge reported that Cognizant employees who work with the social network turned to sex, drugs and dark humor in the workplace to cope with reviewing content such as suicides and violence. Some of these employees started to believe conspiracy theories found in the videos they moderate. The Cognizant employees make about $28,800 per year, according to the report. 

One former employee told the publication that after moderating such content he came to believe conspiracy theories, including the idea that 9/11 wasn't a terrorist attack. He also said he brought a gun to work and still sleeps with it nearby because fired employees had threatened to harm their former co-workers. Employees are given breaks and "wellness time," but six employees told The Verge they found those resources inadequate. To cope with the stress of their jobs, some employees had sex in bathroom stalls, stairwells and other places in the workplace, according to the report.

A Cognizant spokeswoman said in a statement that the company offers its employees support through onsite counselors, a wellness program and other tools.

"We have investigated the specific workplace issues raised in a recent report, previously taken action where necessary and have steps in place to continue to address these concerns and any others raised by our employees," Cognizant said.

The report is the latest to highlight concerns about how the social media site moderates content. Last year, a content moderator filed a lawsuit seeking class action status against Facebook, alleging the company didn't do enough to protect its workers' mental health. News outlets, including Wired, Motherboard and The Wall Street Journal, have also reported on the struggles content moderators grapple with after reviewing disturbing content.

With the blog post, Facebook tried to assure employees that it's been taking steps to address these concerns. 

Facebook said it has clear contracts with its staffing firms, conducts regular site visits to monitor workplace conditions, and holds business reviews that cover how those firms support the wellness of their employees.

The social network said it also will conduct audits of the companies it works with, standardize its contracts and host an event that brings together its partners. 

Contractors can also voice concerns to their employers' human resources departments, or to Facebook anonymously through a whistleblower hotline.

"Put simply, after a couple of years of very rapid growth,"  Osofsky wrote in a post shared with employees, "we're now further upgrading our work in this area to continue to operate effectively and improve at this size."

Facebook, which has 2.3 billion users worldwide, has about 15,000 content reviewers.

Originally published at 1:25 p.m. PT

Update, 1:47 p.m. PT: Adds background from The Verge's report.