An advocate for major social media platforms told an Australian Senate committee that laws to ban children younger than 16 from the sites should be delayed until at least next year instead of being rushed through Parliament this week.
Sunita Bose, managing director of Digital Industry Group Inc, an advocate for the digital industry in Australia including X, Instagram, Facebook and TikTok, was answering questions at a single-day Senate committee hearing into world-first legislation that was introduced into the Parliament last week.
Ms Bose said the Parliament should wait until the government-commissioned evaluation of age assurance technologies is completed in June.
“Parliament is asked to pass a bill this week without knowing how it will work,” she said.
The legislation would impose fines of up to 50 million Australian dollars (£25.8 million) on platforms for systemic failures to prevent young children from holding accounts.
It seems likely to be passed by the Australian Parliament by Thursday with the support of the major parties.
It would take effect a year after the bill becomes law, allowing the platforms time to work out technological solutions that would also protect users’ privacy.
Ms Bose received heated questions from several senators and challenges to the accuracy of her answers.
Opposition senator Ross Cadell asked how his 10-year-old stepson was able to hold Instagram, Snapchat and YouTube accounts from the age of eight, despite the platforms setting a nominal age limit of 13.
Ms Bose replied: “This is an area where the industry needs to improve.”
She said the proposed social media ban risked isolating some children and driving children to “darker, less safe online spaces” than mainstream platforms.
Ms Bose said her concern with the proposed law was that “this could compromise the safety of young people”, prompting a hostile response from opposition senator Sarah Henderson.
“That’s an outrageous statement. You’re trying to protect the big tech giants,” Ms Henderson said.
Unaligned senator Jacqui Lambie asked why the platforms did not use their algorithms to prevent harmful material from being directed to children.
The algorithms have been accused of keeping technology-addicted children connected to platforms and of flooding users with harmful material that promotes suicide and eating disorders.
“Your platforms have the ability to do that. The only thing that’s stopping them is themselves and their greed,” Ms Lambie said.
Ms Bose said algorithms were already in place to protect young people online through functions including filtering out nudity.
“We need to see continued investment in algorithms and ensuring that they do a better job at addressing harmful content,” she said.