Research Update

April 15, 2023

Balancing Privacy and Protection: CSAM Detection

The project aims to develop a privacy-preserving, easily auditable, on-device, neural-hash-based approach for the detection of known CSAM images.

Project Dates: February 2023 - May 2023

Built with: SwiftUI

A screen informing a user that their iCloud Photos account is restricted after uploading a series of pictures matched to known CSAM material.

The National Center for Missing and Exploited Children (NCMEC) maintains a database of known child sexual abuse material (CSAM). Tech companies like Google, Microsoft, and Meta use that database to detect suspicious material uploaded to their servers and report it to the authorities.

Previous efforts to detect CSAM online have focused on two main approaches: scanning uploaded photos against a database of known illegal content, and relying on user or authority reports to identify suspicious material. While the former approach is effective at identifying illegal content, it raises significant privacy concerns because it requires access to all user photos. The latter approach is less invasive but suffers from low reporting rates and the potential for bad actors to evade detection. Apple's proposal to scan photos on a user's device with an algorithm before they are uploaded to its servers was a unique attempt to address the privacy concern, but the system faced significant backlash and was ultimately not implemented.

The study aims to identify potential solutions that can address both concerns and contribute to a better understanding of the trade-offs involved in detecting CSAM material online.

A new framework of CSAM detection

After considering the philosophical underpinnings of privacy rights and conducting user focus groups, I developed a potential paradigm shift for online law enforcement: restricting the upload of CSAM for dissemination without disclosing any user data. The proposal builds on Apple's proposed and well-documented NeuralHash algorithm. It also adopts the same on-device detection approach, but takes it a step further: instead of reporting the results to Apple after a threshold of matches is met, it blocks the upload of suspicious material to iCloud and alerts the user.
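As a rough illustration of the building blocks involved, the sketch below models the on-device hash check in Swift. The names used here (PerceptualHash, computeNeuralHash, KnownCSAMDatabase) are my own assumptions for the prototype, not Apple's actual NeuralHash API, and the hash computation is a placeholder.

```swift
import Foundation

// Hypothetical on-device perceptual hash, modeled on Apple's documented
// NeuralHash design; every name here is an assumption, not Apple's API.
struct PerceptualHash: Hashable {
    let value: Data
}

// Stand-in for the model that computes a photo's perceptual hash on device;
// a real implementation would run the neural hash network over the pixels.
func computeNeuralHash(for imageData: Data) -> PerceptualHash {
    PerceptualHash(value: imageData) // placeholder, not a real hash
}

// Local, on-device copy of the known-CSAM hash database.
struct KnownCSAMDatabase {
    private let hashes: Set<PerceptualHash>

    init(hashes: Set<PerceptualHash>) {
        self.hashes = hashes
    }

    // True when a photo's hash matches a known entry.
    func contains(_ hash: PerceptualHash) -> Bool {
        hashes.contains(hash)
    }
}
```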

Detection

If, while a photo is being processed for upload to iCloud, its hash matches that of a known CSAM image, the photo is not uploaded and the user is promptly notified. They can instantly see which photo was flagged and appeal the decision. No data is shared with Apple, yet the sharing and distribution of CSAM is hindered. The system is also instantly auditable, since users can see, and report to others, which photos are being blocked.
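A minimal sketch of this upload gate, reusing the hypothetical types from the previous sketch, might look like the following; the decision flow and names are illustrative only.

```swift
// Decision produced for each photo queued for iCloud upload.
enum UploadDecision {
    case uploaded
    case blocked(flaggedPhotoID: String) // withheld from iCloud, user notified
}

struct UploadGate {
    let database: KnownCSAMDatabase

    func process(photoID: String, imageData: Data) -> UploadDecision {
        // The comparison happens entirely on the device.
        let hash = computeNeuralHash(for: imageData)
        if database.contains(hash) {
            // A match stays local: nothing is reported to Apple; the photo is
            // not uploaded and the user can see and appeal the flagged photo.
            return .blocked(flaggedPhotoID: photoID)
        }
        // No match: the photo proceeds to iCloud as usual.
        return .uploaded
    }
}
```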

A screen informing a user that one of their photos has been matched to known CSAM and not uploaded to iCloud.

Account restriction

If an account has tried to upload more pictures matched to known CSAM than a certain threshold, the account is restricted: it can no longer upload any pictures to iCloud. This is important to ensure that criminals cannot exploit the instant feedback to develop ways of evading detection. Once again, the restriction is device-based and Apple is not notified, but the sharing and distribution of CSAM is hindered. The user can continue to use the rest of iCloud's services and store pictures locally, and photos already uploaded to iCloud are not affected. Users should be able to turn off warnings and markings in Settings to protect their privacy, if desired (say, when sharing their phone with others).
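The restriction rule can be sketched as a small piece of device-local state. The threshold value below is an assumption chosen for illustration, not a number from the proposal, and none of this state ever leaves the device.

```swift
// Device-local sketch of the account-restriction rule.
struct AccountUploadState {
    private(set) var matchedUploadAttempts = 0
    let restrictionThreshold = 10 // assumed value for illustration only

    // Once the threshold is crossed, no further photos go to iCloud.
    var isUploadRestricted: Bool {
        matchedUploadAttempts >= restrictionThreshold
    }

    // Called each time the upload gate blocks a matched photo.
    mutating func recordMatchedAttempt() {
        matchedUploadAttempts += 1
    }
}
```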

Appeal and account restoration

To restore their account, users can consent to sending their matched pictures to Apple for human review, similar to the previous system's process. Note that this step is opt-in, and the sharing is clearly defined and explained, along with its potential consequences. This is the first time Apple might hear about a case (besides users appealing individual photos).
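The opt-in nature of the appeal can be captured in a short sketch; the request type, error, and submission function below are hypothetical names for illustration, not part of any real Apple API.

```swift
// Sketch of the opt-in appeal: matched photos are sent for human review
// only after explicit consent.
struct AppealRequest {
    let matchedPhotoIDs: [String]
    let userConsented: Bool
}

enum AppealError: Error {
    case consentRequired
}

// `send` stands in for whatever transport would deliver the photos for review.
func submitAppeal(_ request: AppealRequest,
                  send: ([String]) async throws -> Void) async throws {
    // Nothing leaves the device unless the user has explicitly opted in.
    guard request.userConsented else {
        throw AppealError.consentRequired
    }
    // Only the matched photos are transmitted; account restoration follows
    // a successful human review.
    try await send(request.matchedPhotoIDs)
}
```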

A screen of the appeal process that informs the user that the matched pictures will be sent to Apple for review and account restoration.

Next steps

I have built a Swift Playgrounds project as a prototype of my solution, to be used in a user evaluation study that will validate whether the conclusions drawn from the original user focus groups, and used to develop this new framework, hold up in practice. See the Resources section to download the Swift Playgrounds project.

Resources

Swift Playgrounds file


©2024 Evangelos Kassos