Two former Facebook content moderators have joined a lawsuit against the tech giant, alleging they suffered psychological trauma and symptoms of post-traumatic stress disorder caused by reviewing violent images on the social network.
The lawsuit, which seeks class-action status, alleges that Facebook violated California law by failing to provide thousands of content moderators with a safe workplace. The moderators were exposed to murders, suicides and beheadings that were livestreamed on Facebook, according to the lawsuit.
Former Facebook content moderators Erin Elder and Gabriel Ramos signed on to the amended lawsuit, which was filed Friday in a California superior court. The suit was originally filed in September by Selena Scola, a former Facebook content moderator who worked as a contractor at the tech company from June 2017 to March 2018.
"This case has uncovered a nightmare world that most of us did not know about. The trauma and harm that the plaintiffs, and others who do content moderation work, have suffered is inestimable," Steve Williams, a lawyer for the Joseph Saveri law firm, which is representing the content moderators, said in a statement. "The fact that Facebook does not seem to want to take responsibility, but rather treats these human beings as disposable, should scare all of us."
Facebook denied the allegations in a November court filing. It has argued that the case should be dismissed.
Scola was an employee of PRO Unlimited, a Florida staffing business that worked with Facebook to police content. The original suit named PRO Unlimited as a defendant, but the staffing company was dropped from the amended filing.
Elder worked as a Facebook content moderator from March 2017 to December 2017 through PRO Unlimited and Accenture, another staffing company. She "has experienced nightmares, hypervigilance around children, depression and [a] pervasive sense of helplessness about her work as [a] content moderator," the lawsuit states.
Ramos, who was employed by PRO Unlimited, Accenture, Accenture Flex and US Tech Solutions, worked as a Facebook content moderator from June 2017 to April 2018. He also suffered symptoms of PTSD after viewing images and videos of graphic violence, according to the amended lawsuit.
Concerns about the working conditions of Facebook content moderators have escalated recently amid reports of the toll those jobs are taking on workers. The social network, which has 15,000 content reviewers, outsources content moderation work to staffing firms such as Cognizant, Accenture and Genpact.
At the same time, Facebook has been under pressure to prevent hate speech, violence and other offensive content from spreading throughout the social network. It's defended its use of contract workers for the job and has pledged to improve support for content moderators.
Facebook didn't immediately respond to a request for comment.