Facebook said Wednesday it will demote all content posted in Groups from users who have broken the site's rules, making potentially problematic content harder for others to find. The social network will also let people who manage groups know when content from members has been flagged by Facebook and will offer administrators the ability to appeal before a post gets removed.
The new safety features build on the changes Facebook announced earlier this year for Groups, online spaces where people can publicly or privately post about specific topics. The social network has faced growing scrutiny to crack down on Groups because they've been abused to spread COVID-19 vaccine misinformation, hate speech and other harmful content. It's unclear, though, how effective these new efforts have been.
In March, the company announced several changes aimed at curbing the spread of problematic content, including warning users if they're about to join a Group that has violated Facebook's rules. Users who violate the platform's community standards are also restricted from posting, commenting, adding new members to a Group or creating new Groups. Facebook has been leaning more on the administrators and moderators of Groups, requiring them to approve posts if a substantial number of their members violated the platform's rules or were part of other Groups that were removed.
Now the company says it's involving administrators earlier in the moderation of content before it's shared in a Group. When Facebook flags content, administrators will be able to appeal before it's removed, or take the content down themselves if they agree it violates the social network's rules. The social network has faced accusations that it censors conservative content, allegations the company denies.