A Scottish mother who lost her 13-year-old daughter to suicide has backed a call to give children a voice in online safety regulation.
Nurse Ruth Moss, whose daughter Sophie Parkinson took her own life in March 2014, joined campaigners and other bereaved parents in a demand for the Online Safety Bill to include an independent advocacy body to protect children’s interests.
The bill aims to crack down on harmful online content and to impose new legal requirements on tech giants.
It is due to be debated by the House of Lords next week, and the children's charity NSPCC believes the proposed amendment is crucial to ensuring a voice for the youngest internet users.
The charity said the advocates would be able to provide a counterbalance to the powerful lobbying of large tech companies.
Ms Moss was among 1,000 NSPCC supporters who wrote personally to the UK Government ministers in charge of the legislation.
Written by survivors of online abuse and harm, parents, grandparents, frontline practitioners and concerned members of the public, the responses show there is deep support for the call.
“Someone needs to be legally representing children, to ensure that in future, they have a voice, and that harm is prevented," Ms Moss said.
Her daughter had accessed websites about self-harm and suicide before her death.
Ms Moss added: “The internet is a fast-moving, ever-changing environment. Children and parents cannot be expected to keep up with the latest internet risks, as effectively as an expert children’s advocacy organisation could.
“A children’s advocacy organisation would be able to concentrate on the processes and safety design of tech platforms, identifying risky design features and problems before they happen.”
YouGov polling carried out at the end of April found that nine out of ten of the 153 Scots surveyed wanted the amendment to create an independent advocacy body.
A large majority also said it was necessary for Ofcom to listen to the opinions and experiences of children in its role as social media regulator.
NSPCC chief executive Sir Peter Wanless said a child online safety advocate would also act as an early warning system.
He said: “The Government’s Online Safety Bill will bring in much-needed regulation, but it has been contested by an industry for which children’s safety is too often an afterthought.
“Ofcom will become regulator with child sexual abuse taking place at record levels online and children still being bombarded with suicide content and misogynistic hate driven by aggressive algorithms.
“Despite this, some companies will be resistant to change their business models and Ofcom would benefit from expert support to help clean up decades-worth of harm that is the result of failed self-regulation in the tech sector.
“A statutory child online safety advocate will be crucial for successful regulation. It will give a powerful voice to the experiences of children and act as an early warning system that embeds a focus on prevention into decision making.”
The bill will apply across the whole of the UK and will provide a legislative framework for regulating providers of user-to-user internet platforms such as Facebook and Twitter.
Barnardo’s, Young Minds and 5Rights, along with the Molly Rose Foundation and the Breck Foundation, founded by bereaved parents Ian Russell and Lorin LaFave respectively, have also strongly urged the UK Government to adopt the amendment.
Molly Rose Russell was 14 when she died from an act of self-harm while suffering the negative effects of online content.
The Molly Rose Foundation was set up after her death in 2017 with the aim of preventing suicide among under-25s.
A spokesperson for the foundation, Andy Burrows, said: "The Molly Rose Foundation strongly supports this amendment as a crucial piece of the jigsaw to protect children from preventable online harm.
"If online safety regulation is to succeed, children need a strong, resourced and expert watchdog body that can protect their interests and that can hold tech companies to account and the regulator's feet to the fire."
Speaking about the YouGov polling of 1,723 UK adults, Mr Burrows added: "Clearly the public also feels this is a necessary step to give our children a seat at the table.”
When life is difficult, Samaritans are here – day or night, 365 days a year. You can call them for free on 116 123, email them at jo@samaritans.org, or visit www.samaritans.org to find your nearest branch.