End-to-end encryption is a familiar concept in everyday life. Anyone who uses WhatsApp uses end-to-end encryption every time they send or receive a message. Messages are encrypted on the sender’s device and can only be decrypted on the recipient’s device, so no one monitoring the network can see their content: not the government, and not even the company that facilitates the communication (such as WhatsApp).
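As a rough illustration only (real messengers such as WhatsApp use the far more sophisticated Signal protocol, with per-message keys agreed via key exchange), the sketch below shows the core idea: the message is encrypted with a key that only the two endpoints hold, so a server or network tap relaying it sees nothing but opaque ciphertext.

```python
# Minimal sketch of the end-to-end idea, assuming a shared key has already
# been agreed between sender and recipient. Purely illustrative: real
# messengers derive fresh keys via a key-exchange protocol.
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()   # in practice derived via key exchange
sender = Fernet(shared_key)
recipient = Fernet(shared_key)

ciphertext = sender.encrypt(b"meet at 7?")

# The server (or anyone on the network) only ever handles the ciphertext,
# which reveals nothing about the content:
print(ciphertext)

# Only the recipient, holding the key, can recover the message:
print(recipient.decrypt(ciphertext))  # b'meet at 7?'
```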

Whilst this is an important privacy feature, a challenge presented by end-to-end encryption is that it can undermine the safety tools that companies and the police use to detect online child sexual abuse, whether that is grooming or the sharing of indecent images of children.

Last year, the Internet Watch Foundation found that there had been a ‘three-fold’ increase in online child sexual abuse. The government have been working to tackle this serious issue through the now-delayed Online Safety Bill.

In response to this pressing issue, Dr Ian Levy and Crispin Robinson of GCHQ have written a paper offering guidance to tech companies on how to protect children from online sexual abuse. They recommend ‘client-side scanning’ technology, which would sit on mobile phones and other electronic devices and check for indecent images on the device itself, before any message is encrypted.
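In essence, client-side scanning compares images on the device against a database of ‘fingerprints’ (hashes) of known abuse images. A minimal sketch of that matching step is below; it uses an ordinary cryptographic hash for illustration, whereas real deployments use perceptual hashes (such as Microsoft’s PhotoDNA or Apple’s NeuralHash) that still match resized or re-encoded copies. The hash database here is made up.

```python
# Illustrative hash-matching core of client-side scanning.
# SHA-256 only matches byte-identical files; production systems use
# perceptual hashes so that altered copies still match.
import hashlib
from pathlib import Path

KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}  # hypothetical database of fingerprints of known illegal images

def fingerprint(path: Path) -> str:
    """Compute the fingerprint of an image file on the device."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_before_send(path: Path) -> bool:
    """Return True if the image matches a known fingerprint."""
    return fingerprint(path) in KNOWN_HASHES
```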

Additionally, Levy and Robinson proposed running ‘language models’ on phones and other devices to detect language associated with grooming. The software would warn potential victims and nudge them to report risky conversations to a human moderator.
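Levy and Robinson do not publish an implementation, but the flow they describe might look something like the sketch below: a hypothetical on-device classifier scores each message and, above a threshold, prompts the user rather than reporting anyone automatically. The scoring function, phrase list and threshold are all invented placeholders standing in for a real language model.

```python
# Hypothetical sketch of an on-device "warn and nudge" flow.
# risk_score() stands in for a real on-device language model; the phrase
# list and the 0.7 threshold are invented for illustration.
RISKY_PHRASES = ("don't tell your parents", "our secret", "send a photo")

def risk_score(message: str) -> float:
    """Crude stand-in for a grooming-detection model: 0.0 (safe) to 1.0."""
    hits = sum(phrase in message.lower() for phrase in RISKY_PHRASES)
    return min(1.0, hits / 2)

def handle_incoming(message: str) -> None:
    if risk_score(message) >= 0.7:
        # Nudge only: nothing leaves the device unless the user chooses
        # to report the conversation to a human moderator.
        print("This conversation may be unsafe.")
        print("Report it to a moderator? [report/dismiss]")
```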

Within the last two years, Apple attempted to introduce client-side scanning technology (known as NeuralHash) to detect known child sexual abuse images on iPhones. If matching images were detected, the iCloud account would be disabled and the user reported to the National Center for Missing & Exploited Children and to law enforcement.
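One notable detail of Apple’s published design was a match threshold: a single match would not trigger a report, and only once an account accumulated a set number of matches (reportedly 30) would the flagged images be escalated for human review. A simplified sketch of that thresholding is below; the real system used encrypted ‘safety vouchers’ and private set intersection, which this deliberately omits.

```python
# Simplified sketch of threshold-based flagging, loosely modelled on
# Apple's published design. MATCH_THRESHOLD = 30 is the reported figure;
# the cryptographic machinery around it is omitted entirely.
from collections import defaultdict

MATCH_THRESHOLD = 30
match_counts: dict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Count one hash match; return True once the account crosses the
    threshold and should be escalated for human review."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```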

However, there was an outcry that this technology would ‘create serious security and privacy risks for all society whilst the assistance it can provide law enforcement is at best problematic’, as stated by leading computer scientists at Columbia University. So, in September 2021, Apple announced that it would be postponing the roll-out of NeuralHash.

There has been a similar response to the proposals in Levy and Robinson’s recent paper. The Open Rights Group, an internet campaign group, described them as ‘a step towards a surveillance state’.

However, Levy and Robinson argue that developments in technology mean there does not have to be a choice between the privacy and security offered by end-to-end encryption and the risk of child sexual abusers going unidentified, even though end-to-end encryption ‘fundamentally breaks’ most of the safety systems that law enforcement relies on to protect children and prosecute offenders.

The paper and the reactions to it highlight the ongoing tension between privacy and public protection. The Independent Inquiry into Child Sexual Abuse (IICSA) also addressed online abuse in its final report, recommending more robust age-verification requirements for the use of online platforms and services. We will consider IICSA’s comments further in a separate blog.