Apple has explained how it will take action against child abuse material. A significant part of the screening happens directly on the device.
TechCrunch reports that Apple will soon introduce new technology to scan photos uploaded to iCloud for Child Sexual Abuse Material (CSAM). Apple will roll this out later this year as part of a series of technologies designed to make Apple’s products and services safer for children to use. Apple has since published a child safety page describing the new feature and how it works.
Hash values are created on the iPhone itself
To protect user privacy, the company relies on a new technology called NeuralHash, which checks images for matches against a known database of child abuse imagery as they are uploaded to iCloud Photos. The technology works entirely on the iPhone, iPad, or Mac by converting each photo into a unique string of letters and numbers (a “hash”). Normally, even a slight change to a photo would produce a completely different hash value, but Apple’s technology is said to be robust enough that small changes (such as cropping the image) still yield the same hash value.
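NeuralHash itself is a neural-network-based perceptual hash whose implementation is not public, so the following is only a minimal sketch of the general perceptual-hashing idea it builds on. This hypothetical example uses a simple “average hash”: each pixel contributes one bit depending on whether it is brighter than the image’s mean, so uniform changes like brightening leave the hash unchanged.

```python
# Illustrative average hash ("aHash") — NOT Apple's NeuralHash, which
# uses a neural network; this only shows why visually similar images
# can map to the same hash value.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255), e.g. a small thumbnail.
    Returns a hex string with one bit per pixel: 1 if the pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    # Pad to a fixed-width hex string (4 bits per hex digit).
    return format(bits, f"0{(len(flat) + 3) // 4}x")

def hamming_distance(h1, h2):
    """Number of differing bits between two equal-length hex hashes."""
    return bin(int(h1, 16) ^ int(h2, 16)).count("1")

# A tiny 4x4 "image" and a uniformly brightened copy hash identically,
# because every pixel keeps its relation to the mean.
img = [[10, 200, 10, 200],
       [200, 10, 200, 10],
       [10, 200, 10, 200],
       [200, 10, 200, 10]]
brighter = [[p + 20 for p in row] for row in img]
assert average_hash(img) == average_hash(brighter)
```

Real perceptual hashes are compared by Hamming distance rather than exact equality, which is why moderate edits such as cropping or recompression can still land on (or near) the same value.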
These hashes are matched on the device against a database of hashes of known child sexual abuse images. The matching happens invisibly, without revealing the underlying images or alerting the user in any way. The results are uploaded to Apple only once a certain threshold of matches is reached. Only then can Apple decrypt the matched images, manually review their content, and block the user’s account. Apple then reports the images to the National Center for Missing & Exploited Children, which forwards them to law enforcement.
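The threshold mechanism described above can be sketched as follows. This is a loose illustration only: the database contents, the threshold value, and the function names are hypothetical, and Apple’s actual protocol wraps matches in cryptographic “safety vouchers” rather than performing a plain set lookup.

```python
# Hypothetical sketch of threshold-based matching on the device.
# KNOWN_HASHES and MATCH_THRESHOLD are illustrative placeholders, not
# Apple's real data or parameters.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the CSAM hash database
MATCH_THRESHOLD = 2                       # stand-in for Apple's threshold

def scan_upload_queue(photo_hashes):
    """Count matches against the known-hash database; only when the
    count reaches the threshold would the matched items be surfaced
    for human review."""
    matches = [h for h in photo_hashes if h in KNOWN_HASHES]
    if len(matches) >= MATCH_THRESHOLD:
        return {"flag_for_review": True, "matched": matches}
    return {"flag_for_review": False, "matched": []}

# One match stays below the threshold; two matches cross it.
assert scan_upload_queue(["a1b2", "ffff"])["flag_for_review"] is False
assert scan_upload_queue(["a1b2", "c3d4", "ffff"])["flag_for_review"] is True
```

The design point this illustrates is that no single match triggers anything; Apple only learns about an account once enough matches accumulate, which is what keeps individual photos opaque below the threshold.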
In other words, it is highly unlikely that Apple could arbitrarily look at images on your iPhone. According to TechCrunch, Apple says there is a one-in-a-trillion chance of an account being falsely flagged, and that there will be an appeals process for anyone who believes their account has been locked by mistake. The technology is only partially optional: you don’t have to use iCloud Photos, but if you do, you can’t disable the feature.
Apple has published a technical document describing the NeuralHash technology in detail. The new technology will be introduced as part of iOS 15, iPadOS 15, and macOS Monterey this autumn.