Instagram expands ban on images that depict self-harm or suicide
The company's head says the social app is trying to strike a "difficult balance."
Instagram is prohibiting more types of images related to self-harm or suicide following a public outcry that the Facebook-owned social network hasn't done enough to protect vulnerable teens from seeing these graphic images.
The photo-sharing app will no longer allow fictional depictions of such actions, including drawings, memes and graphic images from films or comics. It's also pulling down other images that may not depict self-harm or suicide but include "associated materials or methods." Accounts sharing this type of content won't be recommended in search or other parts of the app.
Instagram's efforts to crack down on self-harm and suicide content come after the death of British teenager Molly Russell, who took her life in 2017. Russell had used Instagram to engage with and post content about depression and suicide, leading her family to blame the social network for her death.
Instagram said it's trying to strike a balance between safeguarding teens from disturbing content and allowing others to share their mental health experiences.
"The tragic reality is that some young people are influenced in a negative way by what they see online, and as a result they might hurt themselves. This is a real risk," Instagram head Adam Mosseri said Sunday in a blog post.
On the other hand, people also use the platform to talk about their mental health struggles, offering online support for others who are grappling with the same problems, Mosseri said. Barring users from sharing this type of content could also stigmatize mental health.
In February, Instagram banned all graphic images of self-harm, such as cutting, and also said it would prevent nongraphic content, such as images of healed scars, from showing up in search, hashtags and the Explore tab. In the first three months after that change, Instagram "removed, reduced the visibility of, or added sensitivity screens" to more than 834,000 pieces of content, Mosseri said in a blog post.
In an op-ed in the Daily Telegraph in February, Mosseri vowed that the company would do more to protect vulnerable users from seeing content promoting self-harm or suicide.
Despite Instagram's strengthened efforts, some users have still found suicide-related content on the platform, according to BBC News. Russell's father, Ian, has described Facebook's efforts to combat self-harm and suicide content as "sincere" but also called for quicker action, telling the BBC that he hopes Mosseri "delivers" on his commitment.
Some mental health advocates, such as UK charity the Mental Health Foundation, praised Instagram's latest move as a positive step but called on the company to do more.
"We would also like to see them try to support distressed users who are posting such content in the first instance," the Mental Health Foundation tweeted on Monday.
Mosseri signaled that barring more types of self-harm or suicide content isn't the only step Instagram will be taking in the future.
"There is still very clearly more work to do, this work never ends," Mosseri told BBC News.
If you're struggling with negative thoughts or suicidal feelings, here are 13 suicide and crisis intervention hotlines you can use to get help.
You can also call these numbers:
US: The National Suicide Prevention Lifeline can be reached at 1-800-273-8255.
UK: The Samaritans can be reached at 116 123.
AU: Lifeline can be reached at 13 11 14.
Originally published Oct. 28 at 7:40 a.m. PT
Update, 11:24 a.m. PT: Includes remarks from BBC interview, a mental health group and more background.