Facebook illustrates the rabbit hole of user reports

Tons of users have clicked the "Report/Mark as Spam" button when something went awry on their account or another user offended them. Facebook explains what happens on its end of that click.

Dara Kerr
Facebook's Reporting Guide infographic. (Credit: Facebook)

Ever wondered what happens when unwanted activity is reported on Facebook, such as explicit photos, hate speech, or hacked accounts?

Facebook aims to make it easier for users to understand what it does when reports are filed, and today it published a "Reporting Guide" infographic (see above) laying out the process.

"With a community of over 901 million people, Facebook maintains a robust reporting infrastructure made up of dedicated teams all over the world and innovative technology systems," the social network writes on the infographic.

To help users who may be contemplating suicide, who feel they're being harassed, or who believe their account has been taken over by an impostor, Facebook has hired hundreds of support staff. Housed in offices from Menlo Park, Calif., to Dublin, Ireland, to Hyderabad, India, team members respond to reports in 24 different languages.

Here's Facebook's explanation of its teams:

In order to effectively review reports, User Operations (UO) is separated into four specific teams that review certain report types: the Safety team, the Hate and Harassment team, the Access team, and the Abusive Content team. When a person reports a piece of content, depending on the reason for their report, it will go to one of these teams. For example, if you are reporting content that you believe contains graphic violence, the Safety team will review and assess the report. And don't forget, we recently launched our Support Dashboard, which will allow you to keep track of some of these reports.

If one of these teams determines that a reported piece of content violates our policies or our Statement of Rights and Responsibilities, we will remove it and warn the person who posted it. In addition, we may also revoke a user's ability to share particular types of content or use certain features, disable a user's account, or if need be, refer issues to law enforcement. We also have special teams just to handle user appeals for the instances when we might have made a mistake.

Besides its own staff and engineers, the social network has also partnered with outside groups and experts in cybersecurity, suicide prevention, and LGBT rights to help Facebook users who may be in need of additional services.