Facebook engineer quits, accuses social network of 'profiting off hate'

The social media giant is facing a backlash from its own employees over content moderation decisions.

Queenie Wong, Former Senior Writer, CNET News

Another Facebook engineer has left the company over concerns about how the social network handles hate speech. 


A Facebook engineer left the social media giant on Tuesday and publicly criticized the company for not doing enough to combat hate speech. 

"I'm quitting because I can no longer stomach contributing to an organization that is profiting off hate in the US and globally," Ashok Chandwaney said in a post shared publicly on Facebook and internally with co-workers. The resignation highlights the backlash Facebook is facing from its own employees while under pressure from civil rights advocates and advertisers to tackle hate speech more aggressively. 

In the post, Chandwaney outlines several content moderation decisions that have raised concerns about whether the company is taking this problem seriously. 

In May, Facebook left up a post from President Donald Trump that included the remarks "when the looting starts, the shooting starts" because the company determined that it didn't violate its rules against inciting violence. Twitter, on the other hand, labeled Trump's tweet with the same remark for breaking its rules against glorifying violence. Facebook's decision to leave up Trump's post resulted in some employees staging a rare virtual protest while others left the company.

Facebook also didn't remove a Kenosha Guard militia event that called for violence before a fatal shooting at a Wisconsin racial justice protest. The company pulled down a page for the militia group after the shooting for violating its rules and said it didn't act sooner because of an "operational mistake."

"The actions that have been taken are easy and could be interpreted as impactful because they make us look good, rather than impactful because they will make substantive change," Chandwaney wrote. 

Facebook spokeswoman Liz Bourgeois said the company doesn't "benefit from hate." "We invest billions of dollars each year to keep our community safe and are in deep partnership with outside experts to review and update our policies. This summer we launched an industry leading policy to go after QAnon, grew our fact-checking program and removed millions of posts tied to hate organizations -- over 96% of which we found before anyone reported them to us," she said. QAnon is a right-wing conspiracy theory alleging a so-called "deep state" plot against Trump and his supporters. 

Chandwaney, who is gender non-binary and uses "they" and "them" as pronouns, said in the post that the company's approach to hate has eroded their faith that Facebook will scrub this offensive content from its platform. Chandwaney, 28, criticized some of Facebook's other policies in an interview with The Washington Post. The social network doesn't send posts from politicians to third-party fact checkers even if they contain misinformation. "Allowing lies in election ads is pretty damaging, especially in the current political moment we're in," Chandwaney said in the interview. 

In an email to CNET, Chandwaney called on Facebook to implement the recommendations in an independent civil rights audit of the company's practices and policies. The 89-page report makes several recommendations, including that Facebook remove humor as an exception to its rules against hate speech. 

"In response to many of their controversial decisions, Facebook's PR messaging defers responsibility to 'experts' -- yet the company has repeatedly refused to fully take action on their recommendations from the audit," Chandwaney said. 

Since leaving the company, Chandwaney said, they have been "surprised" by how many others they've heard from with "similar concerns or reservations about Facebook and its decisions."

In July, more than 1,000 companies, including big brands such as The North Face and Ben & Jerry's, vowed to stop purchasing advertising from Facebook until the company does more to combat hate speech on its platform. The Stop Hate for Profit campaign outlines 10 steps it wants Facebook to take to better address hate speech, including hiring a C-suite-level executive with a civil rights background and notifying businesses if their ads are shown next to hate speech. 

Rashad Robinson, president of the civil rights advocacy group Color of Change, praised Chandwaney's decision to leave the company.

"In the absence of true leadership from Facebook to address hate and misinformation on the platform, Facebook employees are stepping up to push for progress and joining the movement to hold the world's largest social media company responsible for its harmful choices, hollow excuses, and its continual decision to profit from hate in order to keep the platform in political favor with those in power," he said in a statement.