The biggest social media platforms will be required to protect children online by keeping them off suggested friend lists to stop them being contacted by groomers, Ofcom has said.
The new online safety regulator has published its first draft codes of practice under the Online Safety Act, which was signed into law last week.
The first codes focus on illegal material online – such as child sexual abuse material, grooming content and fraud.
Under the code, the largest platforms will be required to ensure that, by default, children on their sites are not presented with lists of suggested friends, do not appear in other users’ suggested friend lists, do not have their location information visible to other users, and cannot be direct messaged by people outside their agreed connections.
Ofcom is set to publish further codes in the coming months covering other areas of online safety, such as guidance for adult sites on keeping children out and on protecting children from harmful content promoting suicide or self-harm.
Each of the draft codes will have a consultation period before requiring final approval from Parliament.
The regulator’s own timetable says it hopes to begin enforcing its first codes of practice by the end of 2024.
The illegal content code also encourages larger sites to use hash-matching technology to identify illegal images of child sexual abuse, and automated tools to detect websites that have been identified as hosting abuse material.
On fighting fraud and terrorism, Ofcom says services should use automatic detection systems to find and remove posts linked to the sale of stolen financial information and block all accounts run by proscribed terrorist organisations.
The codes of practice also propose that tech firms nominate an accountable person to report to senior management on compliance with illegal content, reporting and complaints duties; ensure their content moderation teams are well resourced and trained; offer reporting and blocking tools that are easy to use; and carry out safety tests on recommendation algorithms.
Dame Melanie Dawes, Ofcom’s chief executive, said: “Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression.
“Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.”
Writing in the Daily Telegraph, she said Ofcom “cannot waste a moment” in putting its powers to use, adding: “Children are our first priority, and the risk they face is real.”
Technology Secretary Michelle Donelan said the publication of the first codes marked a “crucial” step in making the Online Safety Act a reality by “cleaning up the wild west of social media and making the UK the safest place in the world to be online”.
“Before the Bill became law, we worked with Ofcom to make sure they could act swiftly to tackle the most harmful illegal content first,” she said.
“By working with companies to set out how they can comply with these duties, the first of their kind anywhere in the world, the process of implementation starts today.”
Ofcom said it had been working, and would continue to work, with social media and other in-scope platforms over the coming months to help ensure they comply with the proposed codes when they come into force.
Campaign groups have backed the first proposals from the regulator.
Susie Hargreaves, chief executive of the Internet Watch Foundation, said: “We stand ready to work with Ofcom, and with companies looking to do the right thing to comply with the new laws.
“It’s right that protecting children and ensuring the spread of child sexual abuse imagery is stopped is top of the agenda.
“It’s vital companies are proactive in assessing and understanding the potential risks on their platforms, and taking steps to make sure safety is designed in.
“Making the internet safer does not end with this Bill becoming an Act. The scale of child sexual abuse, and the harms children are exposed to online, have escalated in the years this legislation has been going through Parliament.
“Companies in scope of the regulations now have a huge opportunity to be part of a real step forward in terms of child safety.”