TechCrunch confirmed that Apple will soon roll out new technology to scan photos uploaded to iCloud for child sexual abuse material (CSAM). The rollout will take place later this year as part of a collection of technologies designed to make Apple's products and services safer for children. The company's stated goal: child safety.
To do this, the company relies on a new technology called NeuralHash. It checks whether an image uploaded to iCloud matches known child abuse images, without decrypting the photo. The matching runs entirely on your iPhone, iPad or Mac, which converts each photo into a single string of letters and numbers (a "hash"). Normally, even the smallest change to a photo results in a completely different hash. Apple's technology, however, is reportedly robust enough that small changes (like a crop) still produce the same hash.
These hashes are matched on the device against a database of hashes of known child sexual abuse images. Matching happens invisibly, without revealing the underlying image or alerting the user in any way. Match results are uploaded to Apple only once a certain threshold is reached. Only then can Apple decrypt the corresponding images, manually verify the content, and deactivate the user's account. Apple then reports the images to the National Center for Missing and Exploited Children, which forwards them to the police.
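Apple has not published NeuralHash's internals, so the flow above can only be illustrated with a stand-in. The sketch below uses a simple "average hash" (a basic perceptual hash, not Apple's algorithm) to show the two ideas in play: a hash that tolerates small changes to an image, and a per-account counter that only escalates once a threshold is crossed. All names and the threshold value are illustrative.

```python
# Hedged sketch of perceptual hashing plus threshold matching.
# "Average hash" stands in for NeuralHash, whose details are not public.

def average_hash(pixels):
    """Perceptual hash of an 8x8 grayscale image (8 rows of 8 ints, 0-255).
    Each bit records whether a pixel is brighter than the image's mean, so
    small edits tend to leave most bits, and often the whole hash, unchanged,
    unlike a cryptographic hash, where any change flips the output."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def scan_upload(pixels, known_hashes, match_count, threshold=30):
    """On-device check of one uploaded photo against known bad hashes.
    Returns the updated per-account match count and whether the account
    crossed the (made-up) threshold that would trigger human review."""
    if average_hash(pixels) in known_hashes:
        match_count += 1
    return match_count, match_count >= threshold
```

The threshold is the key privacy mechanism described above: a single match reveals nothing and triggers nothing; only an accumulation of matches makes the flagged images reviewable.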
What about privacy?
In other words, it is extremely unlikely that Apple will ever be able to decrypt an innocent user's photos. According to TechCrunch, Apple says there is a one-in-a-trillion chance of a false positive, and an appeal process will be in place for anyone who believes their account has been flagged in error. The technology is also optional. Well, almost: you don't have to use iCloud Photos, but if you do, you can't turn the feature off.