Internet sites which host harmful material such as terrorist propaganda, child abuse, revenge porn and fake news could be fined or blocked under a new crackdown.
Under the government plans an independent watchdog would be set up to write a code of practice for internet companies such as Facebook, Twitter and Google.
The rules would also apply to messaging services such as Snapchat and cloud storage services.
The proposals could see senior managers held liable for breaches, with the scheme paid for by a levy on the industry.
However, critics have argued that the plans to restrict online content threaten freedom of speech.
The Online Harms White Paper is a joint proposal from the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office. A public consultation on the plans will run for 12 weeks.
The paper suggests a raft of proposals, including the ability to fine companies that break the rules.
The scheme would also give the regulator further enforcement powers, such as the ability to fine company executives and to force internet service providers to block sites that break the rules.
Outlining the proposals, Culture Secretary Jeremy Wright said: "The era of self-regulation for online companies is over.
"Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough."
The plans cover a range of issues, including the spread of terrorist content, child sexual abuse, so-called revenge pornography, hate crimes, harassment and the sale of illegal goods.
They also cover harmful behaviour with a less clear legal definition, such as cyber-bullying, trolling and the spread of fake news and disinformation.
It says social networks must tackle material that advocates self-harm and suicide, which became a prominent issue after 14-year-old Molly Russell took her own life in 2017.
After she died, her family found distressing material about depression and suicide on her Instagram account, and her father holds the social media giant partly responsible for her death.
Home Secretary Sajid Javid said tech giants and social media companies had a moral duty "to protect the young people they profit from".
"Despite our repeated calls to action, harmful and illegal content - including child abuse and terrorism - is still too readily available online," he said.
Under the proposals, the regulator would have the power to fine companies and to publish notices naming and shaming those that break the rules.
The government says it is also considering fines for individual company executives and making search engines remove links to offending websites.
The children's charity NSPCC has been urging new regulation since 2017 and has repeatedly called for a legal duty of care to be placed on social networks.
A spokeswoman said: "Time's up for the social networks. They've failed to police themselves and our children have paid the price."
Rebecca Stimson, Facebook's head of UK policy, said: "New regulations are needed so that we have a standardised approach across platforms and private companies aren't making so many important decisions alone.
"New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech."
Twitter's head of UK public policy Katy Minshall said: "We look forward to engaging in the next steps of the process, and working to strike an appropriate balance between keeping users safe and preserving the open, free nature of the internet."
TechUK, an umbrella group representing the UK's technology industry, said the government must be "clear about how trade-offs are balanced between harm prevention and fundamental rights".
However, Matthew Lesh, head of research at free market think tank the Adam Smith Institute, warned of censorship.
He said: "The government should be ashamed of themselves for leading the western world in internet censorship.
"The proposals are a historic attack on freedom of speech and the free press.
"At a time when Britain is criticising violations of freedom of expression in states like Iran, China and Russia, we should not be undermining our freedom at home."
Free speech campaign group Article 19 warned the government "must not create an environment that encourages the censorship of legitimate expression".
A spokesman said it opposed any duty of care being imposed on internet platforms.
They said that would "inevitably require them to proactively monitor their networks and take a restrictive approach to content removal".
"Such actions could violate individuals' rights to freedom of expression and privacy," they added.