
Instagram head admits platform has a problem policing self-harm posts

It’ll introduce sensitivity screens this week


Photo by Amelia Holowaty Krales / The Verge

Instagram hasn’t effectively protected users from self-harm and suicidal content, Adam Mosseri, the head of Instagram, admits in an op-ed today, and he says the company is working to remedy that.

Mosseri writes in The Telegraph that the death of 14-year-old Molly Russell in 2017 moved him and pushed the company to take a deeper look at its self-harm content screening. Russell died by suicide, and her family says she followed multiple Instagram accounts devoted to self-harm and suicide, which they believe contributed to her death. After hearing Russell’s story, and after UK health secretary Matt Hancock issued a warning to tech giants about their handling of these issues, Mosseri and his team began a “comprehensive review” of how the platform handles self-harm content.

Hancock threatened to use the law to force tech companies to protect children against this content

He says the platform bans posts that promote self-harm or suicide, but that it struggles to independently detect and police them all. “The bottom line is we do not yet find enough of these images before they’re seen by other people,” he writes. Right now, the platform mostly relies on the community to report offending posts, but the company is investing in technology to better identify these images before they reach followers. It’s also working to make them less discoverable.

According to Mosseri, the company has trained engineers and content reviewers on how to find these posts and has put measures in place to stop related image, hashtag, account, and type-ahead suggestions. Mosseri makes clear that Instagram will still allow posts that talk about mental health struggles and suicide as long as they don’t promote it, but the platform won’t recommend those posts through search, hashtags, or the Explore tab.

Additionally, Instagram is applying sensitivity screens to all content the company reviews that involves cutting. Those screens hide the images and require users to tap through to see them.

“We deeply want to get this right and we will do everything we can to make that happen,” Mosseri writes.

If you or someone you know is considering suicide or is anxious, depressed, upset, or needs to talk, there are people who want to help.

In the US:

Crisis Text Line: Text HOME to 741-741 from anywhere in the US, at any time, about any type of crisis.

988 Suicide & Crisis Lifeline: Call or text 988 (formerly known as the National Suicide Prevention Lifeline). The original phone number, 1-800-273-TALK (8255), is available as well.

The Trevor Project: Text START to 678-678 or call 1-866-488-7386 at any time to speak to a trained counselor.

Outside the US:

The International Association for Suicide Prevention maintains a directory of suicide hotlines by country.

Befrienders Worldwide runs a network of crisis helplines active in 48 countries.
