
Facebook says it's making it tougher for users who break rules to create new groups

The social network will also stop recommending health groups.

Queenie Wong, Former Senior Writer

Facebook announced new steps it's taking to make groups safer.

Angela Lang/CNET

Facebook has been making a stronger push to get people to join groups, which are public and private online spaces on the social network where users gather to discuss shared interests such as cooking, sports or parenting. But people have also used Facebook groups to share conspiracy theories, vaccine misinformation and hate speech, raising concerns about whether the company is doing enough to moderate content in these online spaces.

On Thursday, Facebook outlined several steps it's taking to make groups safer, including making it tougher for users who violate the site's rules to create new groups. The move comes as civil rights groups, celebrities, advertisers and even its own employees criticize the company for how it enforces its rules against hate speech and how quickly it takes action against offensive content.

Facebook already bars users who manage groups from creating new groups similar to ones the company pulled for violating its rules. Now the company says that, for 30 days, it will prevent administrators and moderators of removed groups from creating any new group, not just ones on similar topics.

Facebook users who have violated any of the company's rules will also be required to have new posts approved by an administrator or moderator for 30 days before those posts appear in a group. If administrators or moderators repeatedly approve posts that violate Facebook's rules, the company said it will remove the group.

Groups that don't have an active administrator to oversee the online space will also be archived on the social network, which means users will still be able to see the content but won't be able to post anything new in the group. Facebook will also stop recommending health groups to users, though you'll still be able to search for them on the social network. Health misinformation, especially about vaccines, has become a bigger concern since the outbreak of the novel coronavirus.

Facebook has been promoting groups as more users shift to more private spaces online to chat with new people or talk to their family and friends. While Facebook's rules apply to groups, this shift has sometimes made it tougher for the company to moderate content. Some anti-vaccination Facebook groups, for example, have a higher level of privacy where members have to be approved in advance to join, The Guardian reported last year. That could make it tougher for others to flag posts that they think violate Facebook's rules. Pulling down content can also be like a game of whack-a-mole for social networks.

In late June, Facebook said that it pulled down 220 Facebook accounts, 95 accounts on Facebook-owned Instagram, 28 pages and 106 groups tied to the boogaloo movement, a far-right extremist movement. Two boogaloo members conspired in a Facebook group to murder federal security guards in Oakland, California, according to the Federal Bureau of Investigation. CNET also reported on a private "Justice for George Floyd" group that was filled with racist content. Facebook didn't remove the group after it was brought to the company's attention, but it's no longer visible to the public as of Wednesday.

In August, Facebook took down 790 groups, 100 pages and 1,500 ads tied to a far-right conspiracy theory called QAnon that falsely claims there's a "deep state" plot against President Donald Trump and his supporters.

For the first time, Facebook also revealed how much hate speech content it removes from groups. The company relies on a mix of technology and user reports to find prohibited content. Over the last year, Facebook removed 12 million pieces of content in groups for hate speech, 87% of which was flagged before a user reported it. The company said it took down about 1.5 million pieces of content in groups for organized hate, 91% of which was found proactively. Facebook said it pulled down 1 million groups for violating these policies.

Facebook defines hate speech as "a direct attack on people based on what we call protected characteristics — race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability." You can't compare Black people to apes, for example, refer to women as objects or use the word "it" when describing transgender or non-binary people, according to the site's community standards.  

The amount of content that Facebook removed represents a fraction of the posts in groups. More than 1.4 billion people use Facebook groups every month and there are more than 10 million groups on Facebook.