The social network says it removed 1.9 million pieces of ISIS- and al-Qaida-related content.
"Any nongovernmental organization that engages in premeditated acts of violence against persons or property to intimidate a civilian population, government or international organization in order to achieve a political, religious or ideological aim."
That's the internal definition of terrorism at Facebook, according to a new blog post from the company.
It's all about the violence, not a group's political goals, write Monika Bickert, Facebook's VP of global policy management, and Brian Fishman, the company's head of counterterrorism policy. And either way, governments are generally exempt.
Why does that matter? Because it determines some of the posts you don't see in your Facebook feed, since the company's 200-person counterterrorism team removed them. (In the wake of the Cambridge Analytica privacy scandal, Facebook is under pressure to show that it can police itself.)
Facebook said Monday that it used the definition to delete 1.9 million pieces of ISIS- and al-Qaida-related content in the first quarter of 2018, twice as much as in the previous quarter. The company says it found 99 percent of that content itself, rather than relying on user reports.
"We're under no illusion that the job is done or that the progress we have made is enough," writes Facebook. "Terrorist groups are always trying to circumvent our systems, so we must constantly improve."
(Disclosure: Sean's wife works for Facebook as an internal video producer.)