Social media company Instagram has said it will remove images, drawings and cartoons showing methods of self-harm or suicide from its platform.
It extends measures announced in February, which banned ‘graphic images of self-harm’ and restricted those with suicidal themes. This included both stills and videos.
The move follows the death of British 14-year-old Molly Russell who killed herself in 2017 after viewing graphic content on the platform.
Her father Ian said he believed Instagram was partly responsible for her death.
Speaking at the NSPCC’s How Safe Are Our Children? conference in June, Mr Russell said: ‘It is important to acknowledge that they (technology firms) do a lot of good, but sadly their platforms are being used by people to do harm and they have not done enough to prevent that.
‘Unless change happens, their platforms will become toxic.’
Mr Russell has described Instagram’s latest commitment as ‘sincere’ but said the company needed to act more swiftly.
The Facebook-owned app’s latest promise covers explicit drawings, cartoons and memes about suicide, along with any other content ‘promoting’ self-harm.
Instagram chief Adam Mosseri told BBC News: ‘It will take time to fully implement… but it’s not going to be the last step we take.’
When Molly died, her father found graphic content about self-harm and suicide on her Instagram account, and similar material on her Pinterest account.
Father Ian Russell welcomed the move but called on Instagram to go further and remove more content
Instagram says it has doubled the amount of material removed related to self-harm and suicide since the start of this year.
Between April and June 2019, it said it had removed 834,000 pieces of content, 77 per cent of which had not been reported by users.
‘There is still very clearly more work to do, this work never ends,’ Mr Mosseri said.
Mr Russell responded: ‘I just hope he delivers.’
Instagram’s latest announcement coincided with a visit by Mr Russell to Silicon Valley. There, he told BBC News his daughter had entered a ‘dark rabbit hole of depressive suicidal content’ before taking her own life.
Mr Russell said: ‘It’s very easy to understand why many parents in the UK today really don’t know what they should or shouldn’t do to try and keep their children safe online because it is so easy for them to come across harmful content online and it is quite a new problem.
‘Molly had some older sisters; our middle daughter is only three years older than Molly and the difference that three years makes is huge.
‘The amount of harmful content that became available online and the earlier access that smartphones provided to youngsters to that content has made it so much harder to protect our children.
‘I think Molly probably found herself becoming depressed.
‘She was always very self-sufficient and liked to find her own answers. I think she looked towards the internet to give her support and help.
‘She may well have received support and help, but what she also found was a dark, bleak world of content that accelerated her towards more such content.’
Mr Russell told BBC News: ‘The algorithms the platforms use push similar content towards you, so if you have spent time looking at a particular subject it encourages you to stay on their platform by providing you with further such content.
‘I think Molly entered that dark rabbit hole of depressive suicidal content.
‘Some were as simple as little cartoons – a black and white pencil drawing of a girl that said ‘Who would love a suicidal girl?’.
‘Some were much more graphic and shocking.
‘All in all they added up to something that led Molly to a very dark and dangerous place indeed.’
Mr Russell said he was ‘really pleased’ that Instagram was taking a positive step forward in removing harmful posts.
Ian Russell has said the algorithms used by Instagram enabled Molly to view more harmful content, possibly contributing to her death.
But he added: ‘It would be great if they could find a way to take down 10 times the number of posts and really reduce the potentially harmful content that is on their platform.
‘This is serious. I honestly believe that by doing so they will save young lives and stop other youngsters ending up like Molly did.’
Mr Russell told BBC News: ‘I think it is really important for families to discuss their internet use quite openly.
‘It is something that shouldn’t just be talked about when there are problems. It is ‘What are you doing online? What are you finding? Which apps are you using and how are you using them?’ And trying to find a way to encourage an open dialogue between parents and children.’
‘It is hard to do, but if it isn’t there it might allow someone to find content and disappear into that digital rabbit hole.
‘And then it is very difficult for a parent to know how that youngster might be being affected by what they are viewing online.
‘Molly was a lovely daughter. We all miss her very much, but the thing perhaps she best liked doing was helping others.
‘I think we all get comfort in knowing that, even though she is not with us any longer, she is somehow finding a way to help other youngsters remain safe and get the support and help they need if they are having mental ill-health problems.’
For confidential support, log on to samaritans.org or call the Samaritans on 116 123.