
The Lawful Access Debate Becomes the Child Safety Debate

Last week Apple announced a number of new security improvements, including the ability for users to opt in to End-to-End (E2E) encryption for iCloud backups. Law enforcement is, predictably, upset about this. These new features certainly have the potential to reignite the debate around lawful access to devices and user data. However, we think legislation and regulations that home in on child safety are a more likely response to increasingly law enforcement-resistant tech ecosystems.

The most significant improvement—what Apple has dubbed “Advanced Data Protection for iCloud”—rolls out E2E encryption for most data that can be backed up to iCloud. This will be an opt-in feature and will roll out initially to US users before the end of the year and globally in early 2023. iCloud Contacts, Mail, and Calendar won’t be E2E encrypted because of “the need to interoperate with the global email, contacts, and calendar systems”.

Apple also announced other security improvements that will be available in early 2023, including alerts when new devices are added to iMessage conversations and support for hardware security keys to secure Apple IDs. These improvements aren’t at all groundbreaking, but they are significant given Apple’s prominence as a consumer device vendor and its positioning as a privacy-first technology giant.

The company has also confirmed that its since-mothballed plans for on-device scanning for Child Sexual Abuse Material (CSAM), announced last year, are officially dead:

We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

However, Apple still plans to introduce other safety features, such as warnings for children who send or receive photos that contain nudity.

[Image: Communication safety in the Messages app]

The on-device CSAM-scanning technology was intended to tip off authorities if previously identified child exploitation material was found in a user’s iCloud Photos library. At the time, on-device scanning didn’t seem necessary because iCloud wasn’t E2E encrypted: Apple could have scanned for CSAM server-side, the approach typically used by companies that handle user-generated content. With E2E encrypted iCloud backups, however, on-device scanning absolutely makes sense as a kind of compensating control.
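
As Apple described it last year, the system computed a perceptual “NeuralHash” of each photo on the device and compared it against a blinded database of known-CSAM hashes, with matches only becoming visible to Apple once an account crossed a threshold of roughly 30 hits. The Python sketch below is a drastically simplified, hypothetical illustration of that hash-matching idea: it substitutes a plain cryptographic hash for the perceptual one and omits the blinding and threshold cryptography entirely.

    import hashlib

    # Hypothetical database of hashes of previously identified images.
    # Apple's real system used a perceptual "NeuralHash" (robust to
    # resizing and re-encoding) against a blinded, encrypted database;
    # a plain SHA-256 only matches byte-identical files and is used
    # here purely to illustrate the matching step.
    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    # Apple proposed surfacing an account only after roughly 30 matches.
    MATCH_THRESHOLD = 30

    def image_hash(data: bytes) -> str:
        """Stand-in for a perceptual hash: here, just SHA-256 of the bytes."""
        return hashlib.sha256(data).hexdigest()

    def count_matches(images: list[bytes]) -> int:
        """Count images in an upload batch that match the known-hash set."""
        return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)

    def should_flag(images: list[bytes]) -> bool:
        """Flag for human review only once matches exceed the threshold."""
        return count_matches(images) >= MATCH_THRESHOLD

Because the matching happens on the device before upload, it keeps working even when the uploaded copy is E2E encrypted, which is what makes it a viable compensating control.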

The situation now, however, is that E2E backup encryption is being rolled out without increased protections against the spread of CSAM.

Current US legislation doesn’t require companies to actively search for CSAM, but it does require them to report it to the National Center for Missing and Exploited Children if they find it. There is an international trend, however, towards legislation that forces companies to be more proactive when it comes to CSAM and online harms.

Australia’s Online Safety Bill 2021, for example, requires that companies meet Basic Online Safety Expectations which are essentially set by Ministerial decree and so can be amended easily. These expectations don’t currently require scanning for CSAM but instead require that “reasonable steps” be taken to minimise a variety of online harms that include child sexual exploitation material and behaviours such as grooming.

Draft UK, US, and EU legislation has similar intent. The language varies, but providers are required to do more to combat CSAM, and provisions requiring the scanning of E2E encrypted services when other mitigations prove inadequate are explicitly spelled out. In one sense, scanning of E2E encrypted communications is being used as a stick: lawmakers are effectively saying “find something that works against CSAM or else”.

Although this is all draft legislation, we think stricter regulation is inevitable. It’s justified, too, when we consider that companies could and should be doing more already.

Countering CSAM without putting E2E encrypted messaging at risk is achievable. Many approaches to this are captured in this report examining Meta’s expansion of E2E messaging from a human rights perspective. Many of the suggestions have some sort of privacy impact — using metadata and behavioural analysis to identify problematic behaviour, for example — but we think that more action is justified here if it can mitigate risks to children.
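
To make the metadata idea concrete, here is a purely hypothetical sketch (the record shape, field names, and threshold are our own inventions, not any platform’s actual detection logic) of flagging an account whose contact pattern resembles grooming, for example an adult account cold-contacting many unconnected minors, without touching E2E encrypted message content:

    from dataclasses import dataclass

    @dataclass
    class ContactEvent:
        """Metadata about one first-message event; no content included."""
        sender_is_adult: bool
        recipient_is_minor: bool
        prior_connection: bool  # e.g. an existing friend/contact link

    # Illustrative threshold; not drawn from any real platform's rules.
    NEW_MINOR_CONTACT_LIMIT = 10

    def flag_for_review(events: list[ContactEvent]) -> bool:
        """Flag an account that cold-contacts many minors it has no prior
        connection to -- a pattern associated with grooming -- using
        metadata alone, leaving encrypted message content untouched."""
        cold_contacts = sum(
            1
            for e in events
            if e.sender_is_adult and e.recipient_is_minor and not e.prior_connection
        )
        return cold_contacts >= NEW_MINOR_CONTACT_LIMIT

Heuristics like this trade some metadata privacy for a child-safety signal, which is exactly the trade-off the report weighs.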

Apple is well placed to weather all this. It has rolled out E2E encryption for iCloud backups but has its on-device scanning technology waiting in the wings if legislation requires it to implement some sort of CSAM scanning.

Aside from the implications around CSAM, Apple’s announcement went down like a lead balloon at the FBI, which told The Washington Post it was “deeply concerned with the threat end-to-end and user-only-access encryption pose[s]”. The bureau continued:

This hinders our ability to protect the American people from criminal acts ranging from cyber-attacks and violence against children to drug trafficking, organized crime and terrorism. In this age of cybersecurity and demands for ‘security by design’, the FBI and law enforcement partners need ‘lawful access by design’.

We don’t see the FBI getting much traction here. The policy debate has shifted to child protection as a priority and the public is now well aware that law enforcement agencies have other options to access phones covertly.

But beyond upsetting domestic law enforcement agencies, there is an international angle here too.

Apple keeps its China-based users’ iCloud data locally in the PRC, including encryption keys, in a data centre run by a state-owned mobile operator. Apparently, this was the price of doing business in China.

Surprisingly, Apple’s software chief Craig Federighi told The Wall Street Journal he expected that E2E encrypted iCloud backups would be rolled out to Chinese users, saying “We believe so, we want to roll out across the world”. When asked about the Chinese government’s view, however, Federighi happily declared “They’ve not told me!”

We are not entirely convinced that it will happen, but there is an argument to be made that iCloud data localisation was just as much about preventing US spying as about enabling domestic surveillance. In the wake of recent protests against strict Covid policies, for example, The New York Times reports that facial recognition and phone tracking technology were used to identify protestors. And given the country’s internet-based censorship and tracking, the state simply has a lot of alternative ways to get information about individuals of interest. In other words, access to iCloud data may be a nice-to-have rather than a must-have for the PRC.

In the meantime, all eyes will be on western lawmakers who must decide whether to reignite the lawful access debate or settle for regulations designed to suppress the worst types of crimes against children.
