In an era of rapid digitalisation, organisations increasingly rely on technology to deliver services and interact with the public. While robust data protection measures are crucial, they alone are insufficient to address the complex challenges posed by our evolving digital landscape. This is especially true for the public sector, which delivers services that deeply impact our overall well-being.
As a digital sociologist with a decade of experience in creating tech ethics frameworks, I believe public organisations must prioritise ethics within their digital strategies to build trust, mitigate risks, and advance innovation. In a recent survey of 600 CEOs, 62% said they were willing to sidestep security and ethical concerns to keep up with tech innovation. This is both concerning and disappointing.
Many organisations, including government bodies, have yet to fully grasp the importance of ethics in their digital initiatives. They often focus solely on usability, efficiency and growth, overlooking potential negative impacts on individuals, communities, and the environment. This narrow focus can lead to consequences that erode public trust and undermine efforts to improve overall outcomes. The recent cyber attack on NHS Dumfries and Galloway, which resulted in the publication of sensitive patient data online, is a stark example.
To address these challenges, public organisations should adopt a comprehensive ethical framework that goes beyond mere principles. This includes developing more holistic risk registers that consider not just cybersecurity and data protection, but also issues like job displacement, surveillance, and discrimination. Ethics can also be used to identify new opportunities and improve problem-solving, generating fresh insights and empowering teams to act with greater agility.
Implementing such a framework requires actionable steps and cultural change, including creating tech ethics training, "how-to" guides for staff, and new design methods.
Public organisations must also acknowledge that technology never arrives in a regulatory vacuum. Even in the absence of AI-specific legislation, existing data protection, consumer rights, and competition laws apply. Proactively considering ethical implications can help organisations future-proof themselves against evolving regulations and stay ahead of potential legal challenges.
Here in the UK, there's a pressing need to incorporate ethics more explicitly into existing service standards. We need a principle that asks service teams to consider the ethical impacts of their work beyond just meeting user needs. In addition, the new Labour Government should reinstate bodies like the Centre for Data Ethics and Innovation and the Data Ethics Council to provide crucial guidance and oversight.
Lastly, public organisations must improve data sharing between departments to address complex social issues more effectively. This requires reimagining processes and adopting more agile governance structures so that policy making can better keep up with service delivery and technological change.
Leaders may be focused on what they can gain by moving quickly, but a more educated public are increasingly focused on what they stand to lose if technology isn't deployed safely and responsibly. We need to adopt a more balanced approach to tech innovation. All the tech in the world cannot restore trust once it's broken.
Lisa Talia Moretti is a Digital Sociologist at AND Digital
Agenda is a column for outside contributors. Contact: agenda@theherald.co.uk