Child safety campaigner Ian Russell has warned that children should not have to pay the price for technology companies' failure to make the necessary changes to protect them from harmful content.

Mr Russell's 14-year-old daughter Molly took her own life in November 2017 after viewing suicide and self-harm content on Instagram, leading him to say the app had "helped kill my daughter".

He branded tech companies "a disgrace" for what he sees as a repeated failure to take action to protect children from damaging content.
"It's not the kids' fault that this content is there and it's not kids that should have to pay the price for it," Mr Russell told i.

"As the years tick past, that belief seems more and more to be confirmed, sadly: there is a reluctance for big platforms to change simply because they are used to doing things a certain way.

"Maybe that's not a surprise, but when it so profoundly and detrimentally affects young people's lives as digital technology can (I don't think it's really ever intended to, but it does), and they're powerful and rich and they don't take sufficient steps quickly enough to deal with it, then I think it's just a disgrace."
Mr Russell, who founded the suicide prevention charity the Molly Rose Foundation, is working with the child online safety group 5Rights Foundation on a new campaign, Twisted Toys, to highlight the stark differences between what is not accepted offline and what is allowed to happen online.

The campaign includes parody videos featuring Share Bear, a surveillance camera-equipped teddy bear that collects children's data, and the Stalkie Talkie, a walkie-talkie that connects children to random adults, to demonstrate how unacceptable online dangers would be if built into a physical toy.
Mr Russell recalled the "horrible dawning" he experienced in the weeks following Molly's death about the dangers of the online world, compared with the protections in place in their offline lives.

"When we discovered what Molly had been seeing and liking and viewing online, despite being the youngest of three daughters, growing up in a house where we talked about e-safety and all those things that people do to protect their children, we were shocked and horrified," he said.

"I don't think we were naive enough to suspect the internet didn't contain such horrors, but we didn't realise they were so widely and easily available. We didn't realise that the platforms were pushing them algorithmically to children, and even sending emails to connect Molly to other harmful content.

"We didn't think that those companies could behave like that because it's just so illegal and immoral, and yet there's nothing to stop them. And they did, and in some cases still do, behave like that."
Research conducted by 5Rights found that 90 per cent of 982 parents surveyed said they thought the internet could be harmful to children, while 80 per cent said they did not trust tech companies to protect young people online.
An additional 71 per cent said they thought the Government could be doing more to ensure child safety on the internet.
"Technology and social media companies should adopt a mandatory safety-by-design approach when building or running anything that could affect a child," said Baroness Beeban Kidron, a crossbench peer in the House of Lords and chair of the 5Rights Foundation.

"I think we are at a last resort," she said. "This should have been the last resort a decade ago. This shouldn't be happening to children and we must not allow or accept it."
"The work of Baroness Kidron and her peers, including the development of the Age Appropriate Design Code, informs the expansive work we do every day to protect the safety and privacy of young people using our apps," a Facebook spokesperson said.

"Thousands of parents work at Facebook, and we feel a collective responsibility to make sure that young people can enjoy all the benefits of our apps while protecting them from harm."
While Instagram, which is owned by Facebook, made changes to its community guidelines after details of Molly's death were made public in 2019, including banning graphic self-harm images and video and adding sensitivity screens to blur images, Mr Russell has previously said there is still too much harmful content available on the platform.

"Some of the platforms have made attempts to remove harmful content, but it's still there and still too easy to find. I accept it's a very difficult task, but I'm sure there must be something more that could be done," he told i in March.