Facebook bars more than 4,000 people and groups the company considers dangerous, including white supremacists, militarized social movements and alleged terrorists. The Intercept on Tuesday published a leaked list of dangerous individuals and organizations that Facebook doesn't allow on its platform, providing a glimpse into how the social network moderates content that could lead to violence offline.
More than half of the list consists of alleged foreign terrorists who are predominantly Middle Eastern, South Asian and Muslim. Experts told The Intercept that the list, as well as Facebook's policy, suggests the company places harsher restrictions on marginalized groups.
Facebook has a three-tiered system that determines how strictly the company enforces its rules against content. Terrorist groups, hate groups and criminal organizations fall into the most restrictive level, Tier 1. The least restrictive level, Tier 3, includes militarized social movements, a category The Intercept said "is mostly right-wing American anti-government militias, which are virtually entirely white."
Brian Fishman, Facebook's policy director for counterterrorism and dangerous organizations, said in a series of tweets that the version of the list published by The Intercept isn't comprehensive. The list, he said, is constantly updated.
"Defining & identifying Dangerous Orgs globally is extremely difficult. There are no hard & fast definitions agreed upon by everyone," he said. Fishman also pointed out that terrorist organizations like the Islamic State group and al-Qaeda have hundreds of individual entities, many of which are listed as separate entries in order to "facilitate enforcement," skewing the numbers of entities from a particular region. The Tier 1 list, he said, includes more than 250 white supremacist organizations.
Facebook has faced pressure to be more transparent about its policy against dangerous individuals and organizations. In January, the oversight board tasked with reviewing the social network's content moderation overturned a decision to remove a post the company had said violated this policy, noting the "rules were not made sufficiently clear to users." The board also recommended that Facebook publicize its list of dangerous organizations and individuals, or at least provide examples.
Fishman said Facebook hasn't shared the list "to limit legal risk, limit security risks, & minimize opportunities for groups to circumvent rules" but is trying to improve the policy.