Paul Stanfield doesn’t need to see the worst of humanity for it to have an impact on him every day of his working life.
“You just need to see the file names, see how explicit they are,” said the CEO of Childlight, the world’s first child safety data institute, from the organisation’s HQ on Edinburgh’s Royal Mile. “That’s enough to have an impact on you.”
While tourists from around the globe throng the most famous street in Scotland’s capital, Stanfield and his team sit in an office three floors up, plotting their formidable mission to protect children worldwide from the growing plague of online sex abuse.
And it is an increasingly formidable mission.
Last week, Childlight published research which found that 300 million children were abused online last year alone. The numbers are almost incomprehensible. By way of comparison, there have been seven million deaths from Covid since the beginning of the pandemic four years ago.
And the comparison is deliberate. For Stanfield and his team, child sexual abuse should be thought of in the same terms as a virus sweeping the world.
“I spent 30 years in law enforcement across the world, working for Interpol and the UK National Crime Agency in places like Africa,” said the former Regional Director for the UK’s National Crime Agency in Africa and Director of Global Organised Crime at Interpol.
“So I know that this is an issue that can’t be dealt with by law enforcement alone. It now needs to be treated as a global health issue.
“It’s a hidden global pandemic, and as such we need to shine a light on the hidden reality of what is happening throughout the world. Childlight is for safeguarding children from sexual exploitation and abuse globally. That’s what we’re about.”
According to Childlight’s research, one case is reported every second around the world, with one in eight children at risk.
The study’s authors say there are victims in every classroom, in every school in every country.
In the UK alone, the number of offenders would fill Wembley Stadium - capacity 90,000 - 20 times over.
Measurable online offences include online solicitation, such as unwanted sexual talk, questions and requests for sexual acts. They also include so-called ‘sextortion’, where predators demand money from victims to keep images private, and the use of AI deepfake technology, recently used to generate false sexual images of Taylor Swift.
“During Covid it grew exponentially,” said Stanfield. “People were sat at home on technology. And this sort of abuse is enabled by technology.”
Stanfield was compelled to set up Childlight after working around the world with Interpol and the UK NCA, and through conversations with Childlight co-founder Dr John Climax of the Human Dignity Foundation.
“He recognised that when you treat something like AIDS as a global health issue, then you see what can be done when people come together to address a problem of that size.
“He was involved in getting the first antivirals to market in the 1980s. Then during Covid he was involved in getting the vaccine out at rapid pace. He said to me: why can’t we do the same with child sexual exploitation and abuse? And that started the conversation.”
Despite, or more precisely because of, his background in policing at the highest level, Stanfield knew the obstacles which lay ahead, and not just in terms of the size of the task in hand.
“Having worked in serious and organised crime, I knew that before you can start tackling crime, you have to understand it, and to understand it you need access to good data,” he said.
“From a law enforcement perspective, I wasn’t going to be able to articulate things in a way that governments would react to. They don’t listen to law enforcement, which doesn’t have a powerful voice, and is restricted in what it can say.
“So we wanted to work with an institution that had academic freedom to undertake research and peer-review it, so that we had something reliable and robust. Governments listen to research that comes out of academia, and especially an institution like Edinburgh University which has a history of working globally and is very focussed on data-driven technology. This seemed the natural partnership.
“When we got together with the university to have those discussions we felt we shared the same values. It’s a globally-recognised university, and the city has a pulling power. When we launched in March last year, people wanted to come to Edinburgh.”
Childlight is now a team of 30-odd internationally-sourced employees, each working on a global mission in one of Scotland’s most exalted locations, from where the message can be delivered loud and clear. The question is whether Big Tech, governments and businesses such as banks are willing to listen, and act.
The rollout of end-to-end encryption on social media platforms has drawn criticism from the UK government, with home secretary James Cleverly challenging Meta last year over its plans. Child safety campaigners fear moves like this give offenders greater scope to hide their crimes.
Last month, First Minister John Swinney also pledged to do more to keep children safe after the suicide of Scottish teenager Murray Dowey, who fell prey to an online ‘sextortion’ scam. The 16-year-old, from Dunblane, was tricked into sending intimate photos by criminals posing as a girl. Meta handed over data relating to Dowey’s accounts to Police Scotland following a US Department of Justice court order.
The force this year pointed to a sixfold increase in the number of such cases compared to a decade ago, with more than 2,000 cases in the 12 months reported last April. It also says there was a 5% annual increase in sexual abuse cases across the board north of the border as of December.
Despite the Online Safety Act, passed in 2023, which gives Ofcom increased powers over websites and compels the owners of online platforms to operate with a duty of care to users, Stanfield maintains governments must do more, including imposing stricter rules around age verification and file sharing.
He said: “No government in any country in the world is doing enough, in my view. Children are the most vulnerable people in society and we need to focus more on how we keep them safe. We need regulation. And there is no regulation online, it’s like the wild west. Protection of privacy is put before protection of the child. These things don’t need to be mutually exclusive, but privacy and profit are put over child safety.
“Social media companies have a real financial interest in the status quo. They have made our lives better in a lot of ways, but they are being used for bad things. Privacy and safety are not mutually exclusive.
“I believe that anyone providing a platform like that has a duty of care. A single piece of Lego has more safety regulation surrounding it than the internet does where children are concerned.”
Liverpudlian Stanfield started with the police in 1992, a period when English football was at its nadir due to fan violence and the destructive international reputation garnered by soccer casuals.
“After the football hooliganism and violence of the 80s and 90s, there was a condition that if you wanted to go to a game you had to consent to being searched. If you refused, you didn’t get in,” he said.
“We say there should be a condition that when you go online you must consent not to upload or access child abuse material and if you do you get kicked off and reported to authorities. But neither of these things happen.
“If I watch Sky Sports without a licence or if I download music illegally I am likely to get a fine. So they can do it when there’s a commercial interest at stake. Yet if I upload or download child abuse material what’s happening? Very little.
“And changes to end-to-end encryption are privacy by design, not safety by design. It’s effectively saying, ‘If we don’t see it, we don’t have to report it, and if we don’t report it we don’t have people checking it.’
“So the number of referrals will go down, which doesn’t mean abuse is going down - it means someone has turned the lights down on it. Childlight is trying to turn the lights back on.”
Stanfield is also encouraging voters to shine a light on the leading parties in the upcoming general election, challenging them to do more.
Most of the major UK parties have made broad declarations of intent to make the online environment safer, with policies ranging from tougher age verification to tighter restrictions on mobile phone access and a mental health tax on social media companies.
Stanfield said: “People are voting for new governments all around the world and we are asking people to demand action, hold their government to account and ask what they are doing to protect the most vulnerable. This should be at the top of the priority list.
“This is about a person who is sitting in the US, paying $30 to direct the rape of a child in the Philippines, which is uploaded in Argentina and then spread around the world.
“In India, they are receiving 16,000 referrals from one data source every day. I speak to a lot of clever people who work in tech, and they tell me technology is available to stop it. It’s about the choice these companies are making.
“We are dealing with data where children are being sexually abused, raped, tortured every single second of every day. And therefore children can’t wait. We need to act now.”
For Stanfield, the prospect of effecting significant change on a global level drives him through his days dealing with the most sinister of crimes.
“The work we are doing has already resulted in children being safeguarded in some countries around the world,” he said. “I feel empowered by that. I feel I’m making a difference.”
To find out more about Childlight, visit: childlight.org