Instagram boss: We'll protect users from suicide and self-harm posts

The social network is adjusting its policies to prevent the vulnerable, in particular, from being exposed to content glorifying self-harm and suicide.

Katie Collins Senior European Correspondent

Instagram is rethinking its approach to handling pictures posted on the platform that could be triggering to people with mental health problems.

Adam Mosseri, who heads the social network, pledged in an op-ed in the Daily Telegraph on Monday that he would do more to protect vulnerable users from being exposed to content promoting suicide and self-harm.

Mosseri wrote the piece in part as a response to the death of British teenager Molly Russell, who took her own life in 2017. Russell's family discovered through her Instagram account that she'd been engaging with and posting content about depression and suicide, leading them to publicly state that the social network was responsible for her death. Mosseri is due to meet with UK Health Secretary Matt Hancock on Thursday to discuss the issue in more detail, after Hancock said he would consider banning platforms that don't act to remove problematic content.

Instagram's current policy allows people to share images and captions relating to depression and self-harm so that users can seek support and talk openly about mental health. But it does not allow content that promotes or encourages self-harm or suicide. The social network relies on users to report such content, but Mosseri acknowledged Instagram doesn't act quickly enough to find and remove these images.

The Facebook-owned company is now examining and rethinking its policies. "We've taken a hard look at our work and though we have been focused on the individual who is vulnerable to self-harm, we need to do more to consider the effect of self-harm images on those who may be inclined to follow suit," said Mosseri.

Changes are already underway at the social network. Mosseri described how engineers and content reviewers are making it harder to find self-harm images by preventing the recommendation of related images, hashtags, accounts and typeahead suggestions (the suggested search terms that pop up as you type into the search bar).

Starting this week, the social network is applying sensitivity screens to obscure all images that include cutting, so people will not be able to see them unless they actively choose to. The social network is also aiming to be more supportive of those who post images indicating they might be struggling with these topics. It's looking for new ways to help people, including by connecting them with suicide prevention resources and organizations.

"Suicide and self-harm are deeply complex and challenging issues that raise difficult questions for experts, governments and platforms like ours," said Mosseri. "We deeply want to get this right and we will do everything we can to make that happen."

If you're struggling with negative thoughts, self harm or suicidal feelings, here are 13 suicide and crisis intervention hotlines you can use to get help.

You can also call these numbers:

US: The National Suicide Prevention Lifeline can be reached at 1-800-273-8255. 
UK: The Samaritans can be reached at 116 123. 
AU: Lifeline can be reached at 13 11 14.