
Facebook prompt asks if you worry friends are becoming extremists

It'll also alert you if you may have been exposed to extremist content.

By Steven Musil, Night Editor, CNET News
Facebook is looking to root out extremist content. (Sarah Tew/CNET)

Social media has become a hotbed of extremism as political discourse has grown more heated in recent years, leading Facebook to wonder whether you're worried that friends or acquaintances on the network are becoming extremists.

The social media giant has begun serving prompts to some users in the US asking that very question, a company spokesperson said Thursday. It's also begun notifying people who may have been exposed to extremist content, according to screenshots shared on Twitter.

One of the alerts, shared on Twitter, asks: "Are you concerned that someone you know is becoming an extremist? We care about preventing extremism on Facebook. Others in your situation have received confidential support."

Another alert reads: "Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others."

Both alerts link to support resources.

Facebook, Google and Twitter have for years been under pressure to remove extremist content from their platforms before violence spills into the real world. That focus intensified this year amid increased scrutiny of the role the platforms played in the buildup to the deadly riot at the US Capitol in January.

The pilot program is part of Facebook's Redirect Initiative, which aims to combat violent extremism on the site by redirecting people who search for hate or violence-related terms toward educational resources and outreach groups.

"This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk," said a Facebook spokesperson in a statement. "We are partnering with NGOs and academic experts in this space and hope to have more to share in the future."

Facebook said the program is part of its commitment to the Christchurch Call to Action, an international partnership among governments and tech companies that seeks to curb violent extremist content online. The effort was formed after the livestreamed 2019 massacre of 51 people at mosques in Christchurch, New Zealand.

Facebook said in February that it removed more content in the fourth quarter for violating rules against hate speech, harassment, nudity and other types of offensive content. It said it took action against 26.9 million pieces of hate speech content, up from 22.1 million in the third quarter.

But it also said the rate at which users see hate speech, nudity, and violent and graphic content on its platform is dropping. There were seven to eight views of hate speech for every 10,000 views of content, Facebook said, a prevalence of about 0.07% to 0.08%.