Facebook Parent Meta Sued in Kenya by Former Content Moderator

The lawsuit alleges that Meta and an outsourcing company used by the social network don't provide enough support and mental health care to content moderators.

Queenie Wong, Former Senior Writer
The mental health effects of content moderation are a long-standing issue at Facebook. (James Martin/CNET)

Daniel Motaung remembers watching a video of a beheading while he worked as an outsourced Facebook content moderator in Kenya. Viewing violent and graphic content, he said, ended up taking him to a place he never imagined.

"Now, I have a heightened fear of death because of the content that I've moderated on a daily basis. And because of that, my quality of life has changed drastically," he said during a virtual discussion Tuesday. "I don't look forward to going outside. I don't look forward to going in public spaces."

The discussion, titled "Facebook Content Moderation, Human Rights: Democracy and Dignity at Risk," came on the same day that attorneys for the former content moderator filed a lawsuit against Facebook parent company Meta and Sama, the outsourcing firm that partners with the social media giant for content moderation in Africa. The 52-page petition alleges that the companies violated the Kenyan constitution, accusing them of forced labor, human trafficking, treating workers in a "degrading manner" and union-busting. Motaung was fired from his job in 2019 after he tried to form a trade union, the lawsuit said.

The lawsuit, filed in Nairobi's employment and labor relations court, is the latest in ongoing criticism Meta has faced over the working conditions of content moderators. In 2020, the company reached a $52 million settlement after content moderators in the US sued Facebook for allegedly failing to provide them with a safe workplace. The social network, which has more than 15,000 moderators, has struggled to police offensive content in multiple languages worldwide.

Meta spokesperson Grant Klinzman declined to comment on the lawsuit. The company has previously said it takes its responsibilities to content reviewers seriously, requires partner companies to provide competitive pay, benefits and support, and routinely audits those companies. Suzin Wold, a spokesperson for Sama, said in a statement that the allegations against the company "are both inaccurate and disappointing." She said the company has helped lift more than 59,000 people out of poverty, provides workers a competitive wage and is a "longstanding, trusted employer in East Africa."

The lawsuit alleges that Sama targets poor and vulnerable youth for content moderation jobs, coercing them into signing employment contracts before they fully understand what the role entails. Motaung, who came from a poor family, was looking for a job after college to support his relatives and didn't know that content moderation could harm his mental health, the lawsuit said. He went on to suffer post-traumatic stress disorder, severe depression, anxiety, a relapse of his epilepsy, and vivid flashbacks and nightmares from moderating graphic content.

Content moderators aren't given enough mental health support, must deal with irregular pay and can't discuss their struggles with family and friends because they're required to sign a non-disclosure agreement, the lawsuit said.

"A Facebook moderator must make high-stakes decisions about extremely difficult political situations and even potential crimes -- and they do so in a workplace setting that treats their work as volume, disposable work, as opposed to essential and dangerous front-line work protecting social media users. In short, Facebook moderators sacrifice their own health to protect the public," the lawsuit said.

Motaung, who shared his story in February with Time, said Meta has passed the responsibility of protecting workers to outsourcing companies and is exploiting people for profit. 

A group of Facebook critics called the Real Facebook Oversight Board, along with Foxglove and The Signals Network, hosted Tuesday's panel discussion. In a blog post, the groups urged Meta to offer outsourced content moderators the same level of pay, job security and benefits as its own employees. They're also asking Meta to make other changes, such as publicizing a list of the outsourcing companies it works with for content moderation.

Motaung said he believes that content moderation can be improved and has his own ideas as someone who has done the job.

"I've actually accepted the destruction of my own mental health and life in general, so what I'm hoping to achieve is to change that because I believe that content moderators can be dealt with in a better way," he said.