Child safety campaigner Ian Russell has warned that children should not have to pay the price for technology companies’ failure to make the necessary changes to protect them from harmful content.
Mr Russell’s 14-year-old daughter Molly took her own life in November 2017 after viewing suicide and self-harm content on Instagram, leading him to say the app had “helped kill my daughter”.
He branded the tech companies “a disgrace” for what he sees as a repeated failure to take action to protect children from damaging content.
“It’s not the kids’ fault that this content is there and it’s not kids that should have to pay the price for it,” Mr Russell told i.
“As the years tick past, that belief sadly seems to be confirmed: there is a reluctance for big platforms to change simply because they’re used to doing things a certain way.
“Maybe that’s not a surprise, but when it so profoundly and detrimentally affects young people’s lives as digital technology can – I don’t think it’s really ever intended to, but it does – and they’re powerful and rich and they don’t take sufficient steps quickly enough to deal with it, then I think it’s just a disgrace.”
Mr Russell, who founded the suicide prevention charity the Molly Rose Foundation, is working with child online safety group 5Rights Foundation on a new campaign, Twisted Toys, to highlight the stark differences between what is not accepted offline and what is allowed to happen online.
The campaign includes parody videos featuring ‘Share Bear’, a surveillance camera-equipped teddy bear that collects children’s data, and the ‘Stalkie Talkie’, a walkie-talkie that connects children to random adults, to demonstrate how unacceptable online dangers would be in a physical toy.
Mr Russell recalled the “horrible dawning” he experienced in the weeks following Molly’s death about the dangers of the online world, compared with the protections in place in their offline lives.
“When we discovered what Molly had been seeing and liking and viewing online, despite being the youngest of three daughters, growing up in a house where we talked about e-safety and all those things that people do to protect their children, we were shocked and horrified,” he said.
“I don’t think we were naive enough to suspect the internet didn’t contain such horrors, but we didn’t realise they were so widely and easily available. We didn’t realise that the platforms were pushing them algorithmically to children, and even sending emails to connect Molly to other harmful content.
“We didn’t think that those companies could behave like that because it’s just so illegal and immoral, and yet there’s nothing to stop them. And they did, and in some cases, still do behave like that.”
Research conducted by 5Rights found that 90 per cent of 982 parents surveyed said they thought the internet could be harmful to children, while 80 per cent said they did not trust tech companies to protect young people online.
An additional 71 per cent said they thought the Government could be doing more to ensure child safety on the internet.
Technology and social media companies should adopt a mandatory safety-by-design approach when building or running anything that could affect a child, said Baroness Beeban Kidron, a crossbench peer in the House of Lords and chair of the 5Rights Foundation.
“I think we are at a last resort,” she said. “This should have been the last resort a decade ago. This shouldn’t be happening to children and we must not allow or accept it.”