Aug 15, 2021 5 min read

Apple's child protection measures spark a privacy debate

Source: Apple.com

Wiser! Essays: Apple's child protection measures spark a privacy debate. Meanwhile, Apple moves quickly to allay the concerns of privacy groups and the WhatsApp CEO!


Apple has been using cryptographic hashing systems for years to scan for images of child sexual abuse material ("CSAM"), but only in email, that is, when one person sends an abusive image to another or posts it online.

Last week, Apple confirmed plans to roll out a new system to tackle CSAM, along with several other child protection measures. The system will be rolled out in the US this autumn and will automatically scan images on Apple devices as they are uploaded to iCloud Photos.

Using a cryptographic hashing system, every image will be matched against a database of known CSAM maintained by the National Center for Missing and Exploited Children (NCMEC), the private nonprofit organization established by Congress in 1984 that runs the United States' central register of child sexual abuse images.
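To make the mechanics a little more concrete, here is a minimal sketch of hash-based matching. It is an illustration only: it assumes a plain SHA-256 comparison against a tiny made-up hash list, whereas Apple's actual system uses a perceptual "NeuralHash" and blinded on-device matching, so neither the device nor Apple learns anything from a non-match.

```python
# Illustration only: plain SHA-256 lookup against a made-up hash list.
# Apple's real system uses a perceptual NeuralHash and blinded (private set
# intersection) matching, not raw cryptographic hashes like this.
import hashlib

# Hypothetical database of known-image hashes (hex strings).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    """Return the SHA-256 hex digest of the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_list(image_bytes: bytes) -> bool:
    """True if the image's hash appears in the known-hash database."""
    return image_hash(image_bytes) in KNOWN_HASHES

# An ordinary photo's hash simply won't be in the list.
print(matches_known_list(b"holiday photo bytes"))  # False
```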


The privacy debate

The new feature has triggered some hefty debate about privacy because it will scan images regardless of whether they are ever shared with anyone else.

Will Cathcart, the CEO of WhatsApp, took to Twitter with a long thread that started...

Cathcart's point is that this is an invasion of privacy, and that once the system is in place it could be used as a back door for other forms of surveillance, especially in jurisdictions outside the United States.

Cathcart seems to have conveniently forgotten that Edward Snowden blew the whistle on illegal US Government surveillance of its citizens in 2013.

[Screenshot of Cathcart's Twitter thread. Source: MacRumors]

Expanded protections for children

Apple also confirmed two other features to improve child safety.

One is a new filter system for certain keywords on Siri and in device search functionality.

The other is a nudity warning in iMessage. Called the "communication safety" feature, it applies machine learning on the user's iPhone or iPad to identify and blur sexually explicit images received in the messaging app. The feature will also send an alert to a parent if a child under the age of 12 decides to view or send such an image.
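As a rough sketch of the decision flow this implies, consider the following. Everything here is hypothetical: the classifier stub, the confidence threshold and the age cut-off are stand-ins, since Apple has not published its model or exact policy, but the shape (classify on-device, blur and warn, and optionally notify a parent for younger children) matches what Apple has described.

```python
# Hypothetical sketch of the on-device "communication safety" decision flow.
# The classifier stub, threshold and age cut-off are stand-ins, not Apple's
# actual model or policy.
from dataclasses import dataclass

EXPLICIT_THRESHOLD = 0.9  # hypothetical confidence cut-off


@dataclass
class ChildAccount:
    age: int
    parental_alerts_enabled: bool


def nudity_score(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML model returning a 0..1 nudity score."""
    return 0.0  # a real implementation would run a local image classifier


def handle_incoming_image(image_bytes: bytes, child: ChildAccount) -> dict:
    """Decide whether to blur the image, warn the child and notify a parent."""
    flagged = nudity_score(image_bytes) >= EXPLICIT_THRESHOLD
    return {
        "blur_image": flagged,   # blur before the image is shown
        "warn_child": flagged,   # interstitial warning before viewing
        "notify_parent": (
            flagged and child.age < 12 and child.parental_alerts_enabled
        ),
    }


print(handle_incoming_image(b"...", ChildAccount(age=10, parental_alerts_enabled=True)))
```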

However, it is the image scanning feature that has triggered the reaction from privacy groups and the likes of WhatsApp CEO Cathcart.

In response, Apple has explained it has safeguards in place to prevent these systems from being used for other purposes.

Apple has also stated clearly that its list of banned images is provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations, and that "the system only works with CSAM image hashes provided by NCMEC and other child safety organizations".


Apple explains itself

Within days, Apple's Head of Privacy, Erik Neuenschwander, gave an interview to TechCrunch and explained Apple's approach to these child safety features.

When asked if Apple was trying to demonstrate to governments and agencies around the world that it is possible to scan for illicit content while preserving user privacy, Neuenschwander explained:

"We're motivated by the need to do more for child safety across the digital ecosystem, and all three of our features, I think, take very positive steps in that direction. At the same time we're going to leave privacy undisturbed for everyone not engaged in the illegal activity."

He was asked if Apple had created a framework that could be used for law enforcement to scan for other kinds of content in users' libraries and if it undermines Apple's commitment to end-to-end encryption.

"It doesn't change that one iota. The device is still encrypted, we still don't hold the key, and the system is designed to function on on-device data."

Neuenschwander was then asked if Apple could be forced to comply with laws outside the United States that may force it to add things that are not CSAM to the database to check for them on-device, to which he explained that there are a "number of protections built-in" to the service.

"The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled. And secondly, the system requires the threshold of images to be exceeded so trying to seek out even a single image from a person's device or set of people's devices won't work because the system simply does not provide any knowledge to Apple for single photos stored in our service. And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity. And the last point that I would just add is that it does still preserve user choice, if a user does not like this kind of functionality, they can choose not to use iCloud Photos and if iCloud Photos is not enabled, no part of the system is functional.

Neuenschwander continued that for users who are "not into this illegal behaviour, Apple gains no additional knowledge about any user's cloud library," and "it leaves privacy completely undisturbed."
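Taken together, Neuenschwander is describing a gated flow: no scanning at all without iCloud Photos, no knowledge for Apple below a threshold of matches, and a human review before any referral. The sketch below captures that flow with a plain counter and a made-up threshold; Apple's actual design uses cryptographic threshold secret sharing, so the safety vouchers cannot even be decrypted until the threshold is crossed.

```python
# Simplified sketch of the safeguards described above, using a plain counter.
# The threshold value is made up, and Apple's real system uses cryptographic
# threshold secret sharing rather than an explicit match count.
MATCH_THRESHOLD = 30  # hypothetical number of matches before Apple learns anything


def review_account(match_count: int,
                   icloud_photos_enabled: bool,
                   human_review_confirms_csam: bool) -> str:
    """Walk the gated flow: iCloud Photos gate, threshold gate, manual review."""
    if not icloud_photos_enabled:
        return "system inactive"    # opting out of iCloud Photos disables it all
    if match_count <= MATCH_THRESHOLD:
        return "no action"          # single or few matches reveal nothing to Apple
    if not human_review_confirms_csam:
        return "no action"          # manual review filters out false positives
    return "referred to NCMEC"      # only confirmed collections are referred


print(review_account(1, True, True))    # no action
print(review_account(40, True, True))   # referred to NCMEC
print(review_account(40, False, True))  # system inactive
```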


In other words, Apple isn't going to report a mum for taking a harmless and endearing pic of their child in the bath. But they will if they find any images on a device that match the central register of CSAM.

Which, frankly, seems fair, reasonable and welcome to me!


Never Miss an Issue!

Sign up for the free membership of the Wiser! Newsletter and receive your copy of the weekly newsletter every Friday.

Premium Members get additional content with deep-dive INSIGHTS on the 1st and 3rd Tuesday of every month.

Sign me up!



Sources:

Expanded Protections for Children, Apple

Apple's newest update is a five-alarm fire for your digital privacy, MSNBC

Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns, MacRumors

Apple privacy head explains privacy protections of CSAM detection system, AppleInsider
