TechCrunch has confirmed that Apple will soon roll out new technology to scan photos uploaded to iCloud for child sexual abuse material (CSAM). The rollout, planned for later this year, is part of a collection of technologies designed to make Apple’s products and services safer for children.
Most cloud services already scan images for material that violates their terms of service or the law, including CSAM. They can do this because, even when images are stored encrypted, the companies hold the encryption keys. Apple’s long-standing privacy policy, by contrast, gives users the option to encrypt photos on their device and store them with Apple without Apple being able to decipher them.
The company is therefore relying on a new technology called NeuralHash, which checks whether an image uploaded to iCloud matches known child abuse images without decrypting the photo. It works entirely on your iPhone, iPad, or Mac by converting each photo into a single string of letters and numbers (a “hash”). With an ordinary hash, the smallest change to a photo would produce a completely different hash; NeuralHash, however, is reportedly designed so that small changes (like a crop) still produce the same hash.
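To see the difference between an ordinary cryptographic hash and this kind of perceptual matching, consider the toy sketch below. It is not Apple’s NeuralHash (which relies on a neural network and is not public); the simple “average hash” and the sample pixel values are illustrative assumptions only.

```python
# Toy contrast between a perceptual-style hash and a cryptographic one.
# NOT Apple's NeuralHash; just a minimal "average hash" to show how a
# small image edit can leave the hash unchanged, while a cryptographic
# hash of the same bytes changes completely.

import hashlib

def average_hash(pixels):
    """Hash a grayscale image (rows of 0-255 values) into a bit string.

    Each bit records whether a pixel is brighter than the image's mean,
    so small brightness tweaks rarely flip any bits.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

# A tiny 4x4 "image" and a slightly brightened copy of it.
image = [[10, 200, 30, 220],
         [15, 210, 25, 215],
         [12, 205, 28, 225],
         [11, 208, 26, 218]]
tweaked = [[p + 2 for p in row] for row in image]  # small edit

# The perceptual-style hash is identical for both versions...
print(average_hash(image) == average_hash(tweaked))   # True

# ...while a cryptographic hash of the raw bytes changes entirely.
sha = lambda img: hashlib.sha256(bytes(p for row in img for p in row)).hexdigest()
print(sha(image) == sha(tweaked))                     # False
```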
These hashes are matched on the device against a database of hashes of known child sexual abuse images. The matching happens invisibly, without revealing what the underlying image is or alerting the user in any way. Results are only uploaded to Apple once a certain threshold of matches is reached; only then can Apple decrypt the corresponding images, manually verify the content, and deactivate the user’s account. Apple will then report the images to the National Center for Missing and Exploited Children, which can forward them to law enforcement.
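The reporting logic itself amounts to a simple threshold check, sketched below. In Apple’s actual system this logic is wrapped in cryptography so that nothing is learned below the threshold; the database contents, hash values, and threshold here are all hypothetical placeholders.

```python
# Minimal sketch of threshold-gated matching. Apple's real system hides
# this logic behind cryptographic protocols so no individual match is
# visible below the threshold; everything below is a made-up placeholder.

KNOWN_CSAM_HASHES = {"a1f3...", "9bc0...", "77de..."}  # hypothetical hashes
THRESHOLD = 30  # illustrative value chosen for this sketch

def review_needed(photo_hashes):
    """Return True only once enough on-device hashes match the database.

    Below the threshold, no individual match is revealed or acted on.
    """
    matches = sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)
    return matches >= THRESHOLD

# Example: a library with two matching hashes stays below the threshold,
# so nothing is flagged and nothing is decrypted.
library = ["a1f3...", "9bc0...", "0000...", "ffff..."]
print(review_needed(library))  # False
```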
What about privacy?
Because decryption only becomes possible once the match threshold is crossed, Apple is extremely unlikely to ever see an ordinary user’s photos. According to TechCrunch, Apple says there is a one-in-a-trillion chance of a false positive, and an appeal process will be in place for anyone who believes their account was reported in error. The technology is also optional, or almost: you don’t have to use iCloud Photos, but if you do, you won’t be able to turn the feature off.
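As a back-of-the-envelope illustration of how a threshold can push the account-level error rate down to such tiny numbers, here is a rough binomial calculation. The per-image false-match rate, library size, and threshold are assumptions made up for the sketch; Apple has not published the figures behind its one-in-a-trillion claim.

```python
# Back-of-the-envelope sketch: how a per-match threshold can drive the
# account-level false-positive rate far below the per-image rate.
# Every number below is an assumption for illustration only.

from math import comb

def account_false_positive(n_photos, per_image_rate, threshold, terms=20):
    """Approximate P(at least `threshold` false matches among `n_photos`),
    modelling each photo as an independent Binomial trial.

    The tail sum is dominated by its first terms when n * rate is small,
    so summing `terms` of them is plenty for an estimate.
    """
    p = per_image_rate
    return sum(comb(n_photos, k) * (p ** k) * ((1 - p) ** (n_photos - k))
               for k in range(threshold, threshold + terms))

# Assumed: a 10,000-photo library, a 1-in-a-million per-image false-match
# rate, and a threshold of 10 matches before anything is flagged.
print(account_false_positive(10_000, 1e-6, 10))  # ~3e-27, far below 1e-12
```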
Apple has published a technical paper detailing NeuralHash. The technology will roll out as part of iOS 15, iPadOS 15, and macOS Monterey this fall.