The lawyer representing Molly Russell’s family at the inquest into the 14-year-old’s death by ‘an act of self-harm’ said she had to seek professional help after trawling through thousands of ‘hopeless’ posts while working on the case.
Merry Varney, who represented Molly’s father Ian Russell, told Woman’s Hour that she had to seek guidance on how to protect her own mental health because the content she was exposed to during the case was so harmful, and said a consultant psychiatrist who gave evidence to the court ‘couldn’t sleep for weeks afterwards’ because of it.
The inquest into Molly’s death in 2017, held last week, heard how the teenager was exposed to content that ‘glamourised’ self-harm and would binge on suicide-related content online.
In the last six months of her life, she saved and ‘liked’ 16,300 images on her Instagram account, 2,100 of which related to depression, self-harm and suicide.
The coroner ruled that the 14-year-old died ‘from an act of self-harm’ while suffering from depression and ‘the negative effects of online content’.
In an interview that listeners described as ‘powerful and painful’, Varney told the BBC Radio 4 programme that she was a ‘resilient adult and experienced inquest lawyer’ but said she had been haunted by the music which accompanied many of the ‘hopeless’ posts.
Becoming emotional, she said the Russell family had ‘shone a very bright light in a very dark space and we all owe them a gratitude for that’.
Varney told the programme’s host Emma Barnett: ‘It’s the first time that I’ve taken professional assistance. I’m an experienced inquest lawyer. I work a lot with bereaved families with some very difficult material but this was something else.’
She said clicking on links from ‘a vast spreadsheet’ provided by Meta, which owns Instagram, Facebook and Whatsapp, saw her exposed to social media posts and memes ‘telling the reader that they’re worthless, that you’re ugly, that you have to put on a smile to tell the world you’re fine when you’re not’.
Molly’s father, Ian Russell, described the content his daughter was exposed to as a ‘daily drip feed of hopelessness’.
Varney said she had guidance on how to get through the case, saying: ‘I took advice about how to protect myself. I’m a very resilient person, but the music from the videos on Instagram, they invade your thoughts.
‘A consultant psychiatrist who gave evidence to the court in his role as an expert, described how he couldn’t sleep for weeks afterwards.
‘It keeps sucking you deeper. I could feel it happening to myself and I’m a resilient adult. The idea of a 14-year-old at that, and children still having access to this material, is at times overwhelmingly sad.’
After the inquest, Molly’s father Ian Russell slammed Facebook owner Meta for the ‘toxic corporate culture at the heart’ of the company.
The inquest’s landmark ruling could have major implications for Big Tech. Meta and Pinterest were slammed for allowing ‘unsafe’ content ‘romanticising’ self-harm that fuelled the 14-year-old’s depression.
It painted a grim portrait of the lonely and toxic ‘ghetto of the online world’, including the heartbreaking revelation that Molly turned to celebrities including JK Rowling, US actress Lili Reinhart and YouTuber Salice Rose for help, not realising there was little chance they would ever see her messages.
Speaking at the press conference held after the ruling, Molly’s brave father, Ian, urged those responsible to make the online world ‘a place that prioritises safety and wellbeing of young people over the money that can be made from them’.
He added: ‘This landmark conclusion has only been possible thanks to the extraordinary work of a team of people.
‘Firstly, I’d like to thank Senior Coroner Andrew Walker for his determination to learn what we can from Molly’s tragedy.
‘The Russell family’s legal team have spent too many long hours examining thousands of pages of disturbing evidence. Without Merry Varney, our solicitor from Leigh Day, and paralegal Caleb Bawdon, Jessica Elliott, our barrister, and Oliver Sanders, our KC, and a host of others working hard to build this case, we would not know how harmful our unregulated online world is.
‘I hope the data gathered may prove useful beyond this courtroom and continue to help create a safer web.
‘The support we have received from family, friends and supporters has been vital to us. Thank you to the press who have so sensitively covered this story so that the lessons we’ve learned have been reported responsibly and hopefully they will travel far.
‘For Molly’s sake… let’s make the online world a place that prioritises the safety and wellbeing of young people over the money that can be made from them. And for anyone struggling I will say again, please reach out to real people who can help. Please don’t forget, there’s always hope.’
Molly’s father added: ‘I’ve heard stories that some tech company platforms don’t even allow their children on their own platforms. I don’t know how true they are. But that may be the case. But whether it’s their children or anyone else’s child, if you allow a young person on the platform, that platform has to be safe for children to use.’
With tears welling up and his voice breaking, Molly’s father Ian Russell said as he concluded a press conference after the inquest: ‘Thank you Molly, for being my daughter. Thank you.’
He previously thanked the coroner, the press, and Molly’s friends and family for enabling her story to be told.
‘We shouldn’t be sitting here. This is a story about one person, but that person has affected one family and their friends and maybe the wider world in some way.
‘We should not be sitting here. This should not happen because it does not need to happen.
‘We told this story in the hope that change would come about.’
Mr Russell said a ‘monster’ had been created whereby social media products are not safe for users.
He added his message to Mark Zuckerberg would be a ‘simple’ one, saying: ‘A simple message to Mark would be just to listen. Listen to the people that use his platform, listen to the conclusions the coroner gave at this inquest, and then do something about it.’
How is social media regulated and how could laws now change?
Currently, most social media and search engine platforms that operate in the UK are not subject to any large-scale regulation specifically concerning user safety, beyond a handful of laws that refer to the sending of threatening or indecent electronic communications.
Instead, these platforms are relied upon to self-regulate, using a mixture of human moderators and artificial intelligence to find and take down illegal or harmful material, either proactively or when users report it to them. Platforms lay out what types of content are and are not allowed on their sites in their terms of service and community guidelines, which are regularly updated to reflect the evolving themes and trends that appear in the rapidly moving digital world.
However, critics say this system is flawed for a number of reasons, including that what is and is not regarded as safe or acceptable online can vary widely from site to site, and many moderation systems struggle to keep up with the vast amounts of content being posted.
Concerns have also been raised about the workings of algorithms used to serve users with content a platform thinks might interest them – often this is based on a user’s habits on the site and can mean that someone who searches for material linked to depression or self-harm could be shown more of it in the future. In addition, some platforms argue that certain types of content which are not illegal – but could be considered offensive or potentially harmful by some – should be allowed to remain online to protect free speech and expression.
As a result, large amounts of harmful content can be found on social media today as platforms struggle with moderating the sheer scale of content being posted and the balancing act of allowing users to express themselves while trying to keep their online spaces safe.
During the inquest, evidence given by executives from both Meta and Pinterest highlighted these issues. Pinterest executive Judson Hoffman admitted the platform was ‘not safe’ when Molly accessed it in 2017 because it did not have in place the technology it has now.
And Meta executive Elizabeth Lagone’s evidence highlighted the issue of understanding the context of certain posts when she said some of the content seen by Molly was ‘safe’ or ‘nuanced and complicated’, arguing that in some instances it was ‘important’ to give people a voice if they were expressing suicidal thoughts.
During the inquest, coroner Andrew Walker said the opportunity to make social media safe must not ‘slip away’, as he voiced concerns about the platforms. He outlined a range of concerns including a lack of separation of children and adults on social media; age verification and the type of content available and recommended by algorithms to children; and insufficient parental oversight for under-18s.
The UK’s plan to change this landscape is the Online Safety Bill, which would for the first time compel platforms to protect users from online harm, particularly children, by requiring them to take down illegal and other harmful content, and is due to be reintroduced to Parliament soon.
Companies in scope will be required to spell out clearly in their terms of service what content they consider to be acceptable and how they plan to prevent harmful material from being seen by their users. It is also expected to require firms to be more transparent about how their algorithms work and to set out clearly how younger users will be protected from harm.
The new regulations will be overseen by Ofcom and those found to breach the rules could face large fines or be blocked in the UK. The conclusion of the inquest into Molly’s death is expected to see renewed calls for the new rules to be swiftly introduced.
Following the inquest conclusion, NSPCC chief executive Sir Peter Wanless condemned Meta and Pinterest’s ‘abject failure’ to protect Molly and said the ruling should ‘send shockwaves through Silicon Valley’.
Meanwhile, Children’s Commissioner Dame Rachel de Souza called for social media giants to face tougher regulation, and demanded they ‘get a moral compass and step up’.
Concluding it would not be ‘safe’ to rule Molly’s cause of death was suicide, senior coroner Andrew Walker said the teenager ‘died from an act of self-harm while suffering depression and the negative effects of online content’.
In a conclusion at North London Coroner’s Court, Mr Walker said: ‘At the time that these sites were viewed by Molly, some of these sites were not safe as they allowed access to adult content that should not have been available for a 14-year-old child to see.
‘The way that the platforms operated meant that Molly had access to images, video clips and text concerning or concerned with self-harm, suicide or that were otherwise negative or depressing in nature.
For confidential support call the Samaritans on 116 123, or visit a local Samaritans branch; see www.samaritans.org for details.