Queenie Wong was a senior writer for CNET News, focusing on social media companies including Facebook's parent company Meta, Twitter and TikTok. Before joining CNET, she worked for The Mercury News in San Jose and the Statesman Journal in Salem, Oregon. A native of Southern California, she took her first journalism class in middle school.
Facebook is making an internal civil rights task force permanent, COO Sheryl Sandberg said in a blog post Sunday, a decision that grew out of an ongoing review of the civil rights impact of the social network's policies and practices. The task force, which includes key leadership and will be chaired by Sandberg, will focus on Facebook's content policies, the fairness of its artificial intelligence, and issues regarding privacy and elections, all areas where Facebook has struggled.
In her post, Sandberg said the social network is committed to recruiting people with civil rights expertise to serve on the task force. For example, it'll work with voting rights experts to ensure the social network isn't used to suppress or intimidate some voters.
The formalization of the task force, as well as recommendations on policing hate speech, new policies on advertisements and efforts to protect the integrity of elections and the 2020 census, were included in the company's second progress report on its civil rights audit, which was also published Sunday. The first installment was published in December, and a third and final report is expected in the first half of next year.
"We will continue listening to feedback from the civil rights community and address the important issues they've raised so Facebook can better protect and promote the civil rights of everyone who uses our services," Sandberg wrote in a draft of the post. The audit says that the task force will meet monthly.
The civil rights audit comes as Facebook wrestles with complaints that it's been used to target minority groups, stir white nationalism and discourage voting. Two years ago, 19 civil rights groups, including Color of Change, Muslim Advocates and the NAACP, expressed concern that Facebook had become a tool of Russian trolls seeking to divide the US. The groups also asked that Facebook bring in a third party to audit the civil rights impact of the company's policies. The company agreed in May 2018, and Laura Murphy, a civil liberties leader who worked with the ACLU for two decades, has spearheaded the review.
Tensions between Facebook and civil rights groups escalated last year after The New York Times reported that public relations firm Definers Public Affairs tried to discredit Facebook's critics by linking them to George Soros, a Jewish billionaire who's been the target of anti-Semitic and far-right conspiracy theories for championing progressive causes.
Civil rights groups praised Facebook's focus on the issue but criticized the company for being slow to act.
The Change the Terms coalition, which is made up of civil rights organizations, nonprofits and other groups, said it had expected the company to combat hate speech more quickly in the wake of the mass shootings at two mosques in Christchurch, New Zealand, which left 51 people dead and were livestreamed on Facebook.
"Facebook remains turtle-slow to change," said Henry Fernandez, senior fellow at the Center for American Progress and member of Change the Terms, in a statement. "Relying primarily on monthly meetings of executives and a couple of outside consultants with civil rights expertise is a step forward but insufficient."
The ongoing audit has already resulted in policy changes at Facebook, which has more than 2 billion users around the world. The December report showed the social network had beefed up efforts to combat voter suppression, as well as fake accounts designed to influence political views. In March, Facebook banned white nationalist and white separatist content, saying such content couldn't be "meaningfully separated from white supremacy and organized hate groups." The change was highlighted in Sunday's report.
Meanwhile, Facebook's technology, including AI, is getting better at recognizing hate speech on its own, the report said. As of March, Facebook removed more than 65% of hate speech that it identified before a user reported it, more than double the 24% figure from December 2017, according to the report. The company may have some of its content moderators specialize in hate speech so that posts warning of hate speech aren't inappropriately removed because they repeat problematic content.
The report also addressed changes to Facebook's ad targeting system, including adjustments that make it more difficult for advertisers to exclude some groups from receiving housing, employment and credit ads. It also listed efforts to protect elections and encourage participation in the census.
In addition to the task force, Sandberg said the company would provide civil rights training to key employees working on relevant products and policies. The training is meant to increase awareness of civil rights issues and build them into decisions.
"We know these are the first steps to developing long-term accountability," Sandberg wrote. "We plan on making further changes to build a culture that explicitly protects and promotes civil rights on Facebook."
Originally published June 30. Update, July 1: Adds comments from civil rights groups.