Facebook wants to simplify privacy settings for groups

"Closed" or "secret" Facebook Groups will soon be a thing of the past.


Facebook said Wednesday it's simplifying privacy settings for Groups.

Angela Lang/CNET

Facebook Groups, where people with common interests gather to chat, has three privacy settings, and that can be confusing for some users.

Now the social media giant, which has been under fire for privacy mishaps, is trying to make it easier for users to figure out how these settings work.

Currently, Facebook Groups are called public, closed or secret. If a group is public, Facebook users can find it, view what users are posting and request to join. In closed groups, only current members can see who is in the group and what users are saying. Secret groups are even more exclusive. Only current members can find the group, and you need an invite to join. 


Facebook Groups will no longer be called "secret" or "closed."

Facebook

Now Facebook Groups will have only two privacy settings: public or private. A group that was called secret will be labeled private and hidden, which means only members can find the group. A closed group will be called private but visible, meaning anyone can find the group. In private groups, only members can see who is in the group and what's posted. 

The move highlights how Facebook continues to double down on Groups as more users share in the social network's private spaces.

"We're making this change because we've heard from people that they want more clarity about the privacy settings for their groups," Jordan Davis, a product manager for Facebook Groups, said in a blog post on Wednesday. "Having two privacy settings - public and private - will help make it clearer about who can find the group and see the members and posts that are part of it."

Facebook's focus on private spaces, including messaging and a feature called Stories in which users post photos and videos that vanish after 24 hours, has raised concerns that it will be tougher for the social network to moderate posts that violate its rules against hate speech, misinformation and other offensive content. People who oppose vaccines, for example, have used the social network to spread misinformation in closed groups where members have to be approved, according to The Guardian.

Davis said in the blog post that Facebook has more than 30,000 people on its safety and security teams, and that some employees are specifically tasked with helping to protect Facebook Group users from harm. Facebook also uses a mix of content moderators and artificial intelligence to detect offensive content. The company's rules apply to both public and private groups. 

Facebook looks at various factors to decide whether a group should stay up or get pulled down, including the focus of the group and whether its name or description includes hate speech. If an administrator of a group approves a post that violates Facebook's rules, the group as a whole will receive a strike.

Facebook said last year that more than 1.4 billion people on Facebook use Groups every month. 

Originally published August 14, 9 a.m. PT
Update, 10:57 a.m. PT: Includes more information about how Facebook moderates groups.