Facebook doubles down on groups with new content moderation tools
One feature will allow group administrators to temporarily slow down comments on a post.
Facebook is releasing new tools for people who manage groups on the social network, including features that are designed to make it easier to moderate content.
The move shows that Facebook is continuing to invest in groups, online spaces where people gather to chat about their hobbies, politics and other topics. More than 1.8 billion people use Facebook groups every month, and the company has been trying to get more users to join them.
The rising popularity of groups, though, has also prompted more concerns that the social network isn't doing enough to stop disinformation about politics, the coronavirus and other issues from spreading in these online spaces.
The social network said it will allow group administrators -- Facebook users who manage a group -- to slow down comments on a post. For example, an administrator could limit comments to one every five minutes. Administrators can also temporarily limit the number of comments or posts a Facebook user can make in a group. They will also be given the ability to restrict newer members from posting or commenting, and to decline posts with certain promotional links. Facebook said there are more than 70 million active administrators and moderators running Facebook groups.
The company is also testing a new tool called "conflict alerts" that uses artificial intelligence to let administrators know if there's a contentious or unhealthy conversation happening in the group, such as bullying or harassment. Administrators can appeal when their own content, or posts they approved for members, is removed for violating Facebook's rules. When someone reports a post in a group, they will be able to tag which rules were violated, which could make the moderation process easier.
The social network is also making groups easier to manage by putting the tools, settings and features in one place, allowing administrators to quickly see what content needs to be moderated. Administrators will be able to view a summary for each member showing how many times they've posted or commented and whether they've had content removed for violating rules.