Shortly after reports today that Apple will start scanning iPhones for child-abuse images, the company confirmed its plan and provided details in a news release and technical summary.
“Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind,” Apple’s announcement said. “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
Apple provided more detail on the CSAM detection system in a technical summary and said its system uses a threshold “set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”
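The arithmetic behind a claim like that is straightforward to sanity-check. The sketch below computes the chance that an innocent account accumulates enough coincidental matches to cross a threshold; the per-image error rate, library size, and threshold value are hypothetical stand-ins, since Apple has not published its exact parameters.

```python
# Back-of-the-envelope: why a match threshold collapses the account-level
# false-positive rate. The per-image error rate, photo count, and threshold
# below are illustrative guesses; Apple has not published these figures.
from math import exp, lgamma, log

def account_false_flag_prob(p_image: float, n_photos: int, threshold: int) -> float:
    """P(at least `threshold` false matches among n_photos independent images):
    a binomial tail, summed in log space so the large terms don't overflow."""
    total = 0.0
    for k in range(threshold, min(threshold + 200, n_photos) + 1):
        log_term = (lgamma(n_photos + 1) - lgamma(k + 1) - lgamma(n_photos - k + 1)
                    + k * log(p_image) + (n_photos - k) * log(1 - p_image))
        total += exp(log_term)
    return total

# E.g., a one-in-a-million per-image error, 10,000 uploads a year, threshold 10:
print(account_false_flag_prob(1e-6, 10_000, 10))  # ~2.7e-27, far below 1e-12
```

Even with generous assumptions about the per-image error rate, requiring multiple independent matches drives the per-account probability down by many orders of magnitude, which is the effect Apple's "one in one trillion" figure relies on.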
The changes will roll out “later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey,” Apple said. Apple will also deploy software that analyzes images in the Messages application, part of a new system that will “warn children and their parents when receiving or sending sexually explicit photos.”
Apple accused of building “infrastructure for surveillance”
Despite Apple’s assurances, security experts and privacy advocates criticized the plan.
“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world,” said Greg Nojeim, co-director of the Center for Democracy & Technology’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”
For years, Apple has resisted pressure from the US government to install a “backdoor” in its encryption systems, saying that doing so would undermine security for all users. Apple has been lauded by security experts for this stance. But with its plan to deploy software that performs on-device scanning and shares selected results with authorities, Apple is coming dangerously close to acting as a tool for government surveillance, Johns Hopkins University cryptography professor Matthew Green suggested on Twitter.
The client-side scanning Apple announced today could eventually “be a key ingredient in adding surveillance to encrypted messaging systems,” he wrote. “The ability to add scanning systems like this to E2E [end-to-end encrypted] messaging systems has been a major ‘ask’ by law enforcement the world over.”
Organizations around the world have cautioned against client-side scanning because it could be used as a way for governments and companies to police the content of private communications.
Apple’s technology for analyzing images
Apple’s technical document on CSAM detection includes a few privacy promises in the introduction. “Apple does not learn anything about images that do not match the known CSAM database,” it says. “Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.”
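Apple’s technical summary attributes that unreadable-until-threshold property to a technique called threshold secret sharing. The textbook construction with this behavior is Shamir’s scheme, sketched below; this is an illustration of the concept, not Apple’s implementation.

```python
# Textbook Shamir secret sharing (not Apple's code): a key is split so that
# any t shares recover it, while t-1 shares reveal nothing about it.
import secrets

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo secret

def make_shares(secret: int, t: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n points on a random degree-(t-1) polynomial."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x=0 over the prime field."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

key = secrets.randbelow(PRIME)
shares = make_shares(key, t=5, n=12)  # imagine one share per matching voucher
assert recover(shares[:5]) == key     # five shares suffice...
assert recover(shares[:4]) != key     # ...four almost surely do not
```

In a system built this way, each matching photo contributes one share of a decryption key, so the server cannot read any voucher contents until the threshold number of matches has accumulated.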
Apple’s hashing technology is called NeuralHash, and it “analyzes an image and converts it to a unique number specific to that image. Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value,” Apple wrote.
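NeuralHash itself is proprietary (Apple’s summary describes a neural network followed by locality-sensitive hashing), but the general idea of a perceptual hash can be shown with a much cruder “average hash.” The toy below is not NeuralHash; it only demonstrates how visually near-identical inputs are designed to collide:

```python
# Toy "average hash," a stand-in for the perceptual-hashing idea: visually
# similar images map to identical or nearly identical hash values. This is
# NOT NeuralHash, which uses a neural network plus locality-sensitive hashing.
def average_hash(pixels: list[list[int]]) -> int:
    """Hash an 8x8 grayscale image: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits; a small distance suggests similar images."""
    return bin(a ^ b).count("1")

original = [[(x * y) % 256 for x in range(8)] for y in range(8)]
# Mild re-encoding noise (think recompression) barely moves the hash:
noisy = [[p + 2 for p in row] for row in original]
print(hamming(average_hash(original), average_hash(noisy)))  # 0 here
```

This is the property that distinguishes perceptual hashes from cryptographic hashes such as SHA-256, where changing a single pixel produces a completely different value.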
Before an iPhone or other Apple device uploads an image to iCloud, the “device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.”
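Apple has not published a byte-level voucher format, so the following is only a rough illustration of the structure that sentence describes, with an ordinary symmetric key (the Python cryptography package’s Fernet) standing in for the private set intersection and threshold-sharing machinery that actually gates decryption in Apple’s design. The SafetyVoucher fields here are hypothetical:

```python
# Illustrative shape of a "safety voucher," not Apple's actual format. A
# single symmetric key stands in for the PSI and threshold-secret-sharing
# layers that control decryption in the real system.
from dataclasses import dataclass
from cryptography.fernet import Fernet  # pip install cryptography

@dataclass
class SafetyVoucher:
    encrypted_payload: bytes  # ciphertext of the NeuralHash + visual derivative
    match_metadata: bytes     # encodes the (blinded) match result

def make_voucher(neural_hash: bytes, visual_derivative: bytes,
                 blinded_match: bytes, key: bytes) -> SafetyVoucher:
    payload = neural_hash + b"|" + visual_derivative
    return SafetyVoucher(Fernet(key).encrypt(payload), blinded_match)

key = Fernet.generate_key()
voucher = make_voucher(b"\x12\x34" * 16, b"<thumbnail bytes>", b"<match blob>", key)
# The voucher, not the key, is what travels to iCloud alongside the photo.
```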
Apple also published a longer and more detailed explanation of the “private set intersection” cryptographic technology that determines whether a photo matches the CSAM database without revealing the result to the user’s device.
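For intuition about how a match can be computed without exposing non-matches, here is a toy Diffie-Hellman-style private set intersection. It is a generic textbook construction, not Apple’s protocol; notably, Apple’s version is arranged so that the device itself never learns the result:

```python
# Toy Diffie-Hellman-style PSI: two parties compare blinded hashes so that
# non-matching items reveal nothing. Generic textbook sketch, not Apple's
# protocol, and the modulus below is far too small for real use.
import hashlib
import secrets

P = 2**255 - 19  # the Curve25519 prime, used here only as a demo modulus

def hash_to_group(item: bytes) -> int:
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

def blind(item: bytes, secret_exp: int) -> int:
    return pow(hash_to_group(item), secret_exp, P)

server_set = [b"known-hash-1", b"known-hash-2", b"known-hash-3"]
device_item = b"known-hash-2"

a = secrets.randbelow(P - 2) + 1  # server's secret exponent
b = secrets.randbelow(P - 2) + 1  # device's secret exponent

# Server sends its a-blinded set; the device re-blinds each value with b.
doubly_blinded_set = {pow(blind(x, a), b, P) for x in server_set}
# Device sends its b-blinded item; the server re-blinds it with a.
doubly_blinded_item = pow(blind(device_item, b), a, P)

# Exponentiation commutes, so the values collide exactly on matches.
print(doubly_blinded_item in doubly_blinded_set)  # True
```

Because neither party ever sees the other’s raw hashes, a mismatch leaks nothing about the non-matching items, which is the core guarantee PSI protocols aim for.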