Instagram boss says social media site is ‘not yet where it needs to be’ on self-harm and suicide

The boss of Instagram has admitted the social media platform is ‘not yet where it needs to be’ on its handling of content around self-harm and suicide.

Adam Mosseri said a comprehensive review had been launched amid accusations the firm was ‘normalising’ self-harm and risked the creation of a ‘suicide generation’.

Mr Mosseri said the recent case of 14-year-old Molly Russell, whose father said she took her own life after looking at self-harm posts, had left him ‘deeply moved’.

He added that Instagram was investing in technology to better identify inappropriate images and would also begin using sensitivity screens to hide images from view until users actively choose to look at them. 


Instagram boss Adam Mosseri (left) said the case of 14-year-old Molly Russell (right), whose father said she took her own life after looking at self-harm posts, had left him ‘deeply moved’

Writing in the Daily Telegraph, he said: ‘We need to do everything we can to keep the most vulnerable people who use our platform safe.

‘To be very clear, we do not allow posts that promote or encourage suicide or self-harm.

‘We rely heavily on our community to report this content, and remove it as soon as it’s found.

‘The bottom line is we do not yet find enough of these images before they’re seen by other people.’

His comments come as social media and technology firms face increasing scrutiny over their practices.

Health Secretary Matt Hancock said last week legislation may be needed to police disturbing content on social media.

Mr Mosseri said a comprehensive review had been launched amid accusations the firm was ‘normalising’ self-harm and risked the creation of a ‘suicide generation’

Meanwhile, separate reports by the Children’s Commissioner for England and the House of Commons called on firms to take more responsibility for their content. 

Algorithms on Instagram mean that youngsters who view one account glorifying self-harm and suicide can see recommendations to follow similar sites.

Experts say some images on the website, which has a minimum joining age of 13, may act as an ‘incitement’ to self-harm.

Instagram’s guidelines say posts should not ‘glorify self-injury’ while searches using suspect words, such as ‘self-harm’, are met with a warning. But users are easily able to view the pictures by ignoring the offers of help. 

‘Instagram helped kill my daughter’: Father of tragic Molly Russell, 14 

Molly Russell, 14, was found dead in her bedroom in November 2017 after showing ‘no obvious signs’ of severe mental health issues.

Her family, of Harrow, north west London, later found she had been viewing material on social media linked to anxiety, depression, self-harm and suicide.

The teenager’s father Ian criticised Instagram and Pinterest for hosting ‘harmful’ images he said may have played a part in her death.

He said: ‘I have no doubt that Instagram helped kill my daughter. She had so much to offer and that’s gone.’ 

Molly, who went to Hatch End High School in Harrow, Middlesex, had started viewing disturbing posts on social networks without the family’s knowledge.

It was only after her death in 2017 that the teenager’s parents delved into her social media accounts and realised she was viewing distressing images.

Mr Mosseri added: ‘Starting this week we will be applying sensitivity screens to all content we review that contains cutting, as we still allow people to share that they are struggling even if that content no longer shows up in search, hashtags or account recommendations.

‘These images will not be immediately visible, which will make it more difficult for people to see them.

‘We want to better support people who post images indicating they might be struggling with self-harm or suicide.

‘We already offer resources to people who search for hashtags, but we are working on more ways to help, such as connecting them with organisations we work with like Papyrus and Samaritans.

‘We have worked with external experts for years to develop and refine our policies. One important piece of advice is that creating safe spaces for young people to talk about their mental health online is essential.

‘Young people have also told us that this is important, and that when the space is safe, the therapeutic benefits are positive.’

He said the site did not want to ‘stigmatise mental health’ by deleting images which reflect the issues people were struggling with, but would stop recommending such images in searches, via hashtags or through the app’s Explore tab.

‘Suicide and self-harm are deeply complex and challenging issues that raise difficult questions for experts, governments and platforms like ours,’ Mr Mosseri wrote.

‘How do we balance supporting people seeking help and protecting the wider community? Do we allow people to post content they say helps them, or remove it in case others find it?

‘This week we are meeting experts and academics, including Samaritans, Papyrus and Save.org, to talk through how we answer these questions.

‘We are committed to publicly sharing what we learn. We deeply want to get this right and we will do everything we can to make that happen.’

Molly Russell (pictured) was found dead in her bedroom in November 2017 after showing ‘no obvious signs’ of severe mental health issues
