While the world has been grappling with Covid-19, another pandemic has been spreading largely unseen: the sexual abuse of children online.
At any one time, an estimated 750,000 offenders around the world are looking to connect with children online for sexual purposes. During lockdown, reports of child abuse material surged. So it was good to see the Government and the NSPCC shining a spotlight this week on this other pandemic, and pointing the finger at the big tech companies that can and must do more to stop it.
I have seen all too clearly the torment this abuse can cause. In 2012, I was working at Facebook when five-year-old April Jones was murdered by a man who, evidence later revealed, had become addicted to child sexual abuse material online. I felt sick that we hadn’t done more to root out this content.
As a mother, I found it an agonising wake-up call. As a leader in the tech industry, I knew something had to change. Later, as the UK’s first Minister for Internet Safety, I worked closely with the authorities and the charity sector and founded the WeProtect Global Alliance, an initiative to protect children from abuse and create a safer online world. It now counts 98 countries, 52 civil society organisations and 51 private sector companies among its members.
Since then, I have seen massive progress in the understanding of this crime and in our efforts to combat it. Yet the companies and platforms criminals use continue to fail spectacularly in their responsibility to protect children online and, worse, put barriers in the way of the governments and law enforcement agencies trying to stop it. The latest is Facebook’s plan to roll out end-to-end encryption across Messenger and Instagram. This represents a dangerous step backwards, sacrificing the safety of our children on the altar of data privacy: it will “blind” both tech firms and the authorities to online child abuse. The Home Secretary and the NSPCC are right to call out these companies for once again failing to meet their moral responsibilities.
The Home Office has reported that, in the last year alone, US technology companies made 21.4 million referrals of child sexual abuse material. In the UK, the National Crime Agency investigates these referrals and, as a result, made more than 4,500 arrests and safeguarded around 6,000 children. However, if Facebook goes ahead with end-to-end encryption, the US National Center for Missing and Exploited Children estimates that almost 12 million child abuse reports could be lost every year. That is millions of children whose cries will go unheard.
Encryption will hide the reality we face while handing a cloak of protection to sexual predators. It will enable them to attack without fear or consequence. I recognise that this technology is a vital tool for protecting privacy and security, but time and again those concerns have been prioritised above child safety.
As the Home Secretary said, this conversation has been derailed into an “either/or” argument between adult privacy and child safety. Yet an NSPCC poll published this week found that public support for end-to-end encryption would almost double if platforms could demonstrate that children’s safety would not be compromised. It is imperative that the conversation centres on how we implement global regulation with real teeth and demand that safety is built directly into the design of these platforms. This will help shift the focus from flagging and removing illegal content to preventing abusers from contacting potential victims in the first place.
We don’t need an “Instagram for kids”, which is good for Facebook’s bottom line but bad for children’s wellbeing. We need the tough regulation this Government is championing, a broader conversation about the harms embedded in this technology and for tech companies to finally put child protection before profit.
Baroness Shields OBE is CEO of BenevolentAI, a former UK Minister for Internet Safety and Security and founder of WeProtect