
Instagram expands ban on images that depict self-harm or suicide

The company's head says the social app is trying to strike a "difficult balance."

Carrie Mihalcik and Queenie Wong

Instagram head Adam Mosseri says the app is taking more steps to keep people safe.

Angela Lang/CNET

Instagram is prohibiting more types of images related to self-harm or suicide following a public outcry that the Facebook-owned social network hasn't done enough to protect vulnerable teens from seeing these graphic images.

The photo-sharing app will no longer allow fictional depictions of such actions, including drawings, memes and graphic images from films or comics. It's also pulling down images that may not directly depict self-harm or suicide but include "associated materials or methods." Accounts sharing this type of content won't be recommended in search or other parts of the app.

Instagram's efforts to crack down on self-harm and suicide content come after the death of British teenager Molly Russell, who took her own life in 2017. Russell had used Instagram to engage with and post content about depression and suicide, leading her family to blame the social network for her death.

Instagram said it's trying to strike a balance between safeguarding teens from disturbing content and allowing others to share their mental health experiences. 

"The tragic reality is that some young people are influenced in a negative way by what they see online, and as a result they might hurt themselves. This is a real risk," Instagram head Adam Mosseri said Sunday in a blog post.

At the same time, people use the platform to talk about their mental health struggles, offering online support to others who are grappling with the same problems, Mosseri said. Barring users from sharing this type of content could also stigmatize mental health issues.

In February, Instagram banned all graphic images of self-harm, such as cutting, and said it would prevent nongraphic content, such as images of healed scars, from showing up in search, hashtags and the Explore tab. In the first three months after that change, Instagram "removed, reduced the visibility of, or added sensitivity screens" to more than 834,000 pieces of content, Mosseri said in Sunday's post.

In an op-ed in the Daily Telegraph in February, Mosseri vowed that the company would do more to protect vulnerable users from seeing content promoting self-harm or suicide. 

Despite Instagram's strengthened efforts, some users have still found suicide-related content on the platform, according to BBC News. Russell's father, Ian, has described Facebook's efforts to combat self-harm and suicide content as "sincere" but urged quicker action, telling the BBC that he hopes Mosseri "delivers" on his commitment.

Some mental health advocates, such as UK charity the Mental Health Foundation, praised Instagram's latest move as a positive step but called on the company to do more.

"We would also like to see them try to support distressed users who are posting such content in the first instance," the Mental Health Foundation tweeted on Monday.

Mosseri signaled that barring more types of self-harm or suicide content isn't the only step Instagram will take in the future.

"There is still very clearly more work to do, this work never ends," Mosseri told BBC News. 


If you're struggling with negative thoughts or suicidal feelings, here are 13 suicide and crisis intervention hotlines you can use to get help.

You can also call these numbers:
US: The National Suicide Prevention Lifeline can be reached at 1-800-273-8255. 
UK: The Samaritans can be reached at 116 123. 
AU: Lifeline can be reached at 13 11 14. 

Originally published Oct. 28 at 7:40 a.m. PT
Update, 11:24 a.m. PT: Includes remarks from BBC interview, a mental health group and more background.