Slap web giants with tobacco-style health warnings if they put kids in danger, NSPCC blasts
The children's charity has demanded a tough legal crackdown on arrogant social media firms.
NSPCC bosses blasted that the tragic suicide of 14-year-old Molly Russell shows internet self-regulation has failed.
They warned it's "already too late" to save vulnerable children as they unveiled a new package of proposed laws drawn up with the help of a top City legal firm.
One of the demands would see social networks named and shamed when they let their users down.
If firms are found to have endangered children by giving them access to sexual or self-harm content, they would be obliged to put a prominent notice on their homepage admitting their failure.
The NSPCC's Andy Burrows told The Sun: "If platforms significantly breach the duty of care they have to children, then that is something that needs to be made public.
"That's something parents would want to be aware of so they can have a very immediate sense that a significant failing has taken place.
"That is a very powerful step that your users will see, that parents will see, and then that will raise questions about whether or not that platform is fundamentally safe.
"What the naming and shaming aspect would do is to make it clear when platforms are putting children at risk."
He added: "The entire country has been horrified by what happened to Molly Russell – but we know there are hundreds if not thousands of children who come to harm every year because of the content that they see on these sites."
Molly, 14, took her own life after being exposed to a flood of graphic suicide and self-harm pictures on Instagram.
Last week, the firm – owned by Facebook – announced it would ban the horrific images.
But the NSPCC today called for an official regulator with the power to impose multi-million-pound fines and ban executives from serving in boardrooms.
Social networks would have a legal responsibility to make their sites and apps safe for children – and keep them constantly updated whenever they introduce new features.
And the regulator would have the power to demand information from tech firms when they're suspected of endangering users.
NSPCC chief Peter Wanless hit out at ministers who've repeatedly promised to take action but failed to pass new laws.
He said: "We are fed up with warm words and good intentions.
"It's already too late – someone has lost a child."
The recommendations were welcomed by Ruth Moss, whose daughter Sophie took her own life aged just 13 after looking up graphic images online.
Mrs Moss said today: "Sophie’s death devastated me. No mother, or family, should have to go through that.
"The protection of our children is too important to leave to the goodwill of large, profit-orientated organisations. Statutory regulation is needed and as a matter of urgency."
She banned Sophie from having a smartphone, imposed tough curbs on her web use and even called the police after discovering the teen had been looking up horrific content.
But the young girl carried on surfing the internet regardless, despite her parents' efforts to keep her safe, Mrs Moss said.
She warned: "It's really hard for parents to do this alone."
Relatives of Molly Russell welcomed the NSPCC's call for a legal regulator.
A spokesman for the Molly Rose Foundation, set up in her memory, said: "Molly’s story has prompted a huge change in public opinion.
"Many are now aware of how social media companies are failing in their duty of care when it comes to protecting our children."
A poll commissioned by the NSPCC found 92 per cent of parents believe there should be official regulations imposed on social networks.
- If you need someone to talk to, the Samaritans are free to call on 116 123, or call CALM on 0800 58 58 58.