[Image captioned with its NeuralHash: 59a34eabe31910abfb06f308]


What is this about?

Apple announced three different "Child Safety" features [1]. The most significant feature is scanning photos on users' devices by calculating a hash for each image and comparing these hashes to hashes of known Child Sexual Abuse Material (CSAM). While we can all agree it is important to "get those bastards," and we should look out for our little ones, security experts immediately raised concerns [2,3,4]:

False positives

CSAM fingerprints are perceptual and deliberately not bit-perfect, so an innocent file can match a fingerprint of known CSAM, get flagged, and be uploaded to Apple's servers.
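
To get a feel for why this happens, here is a small sketch using a toy 64-bit "average hash" in Python. This is not Apple's NeuralHash, and the file name is just a placeholder, but it shows the same property: a perceptual hash collapses visually similar images into the same fingerprint, which is exactly what makes false positives possible.

    # Toy example only: a simple 64-bit "average hash", not Apple's NeuralHash.
    from PIL import Image, ImageFilter

    def average_hash(path):
        # Shrink to 8x8 grayscale and set one bit per pixel that is above the mean.
        img = Image.open(path).convert("L").resize((8, 8), Image.LANCZOS)
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p > mean)

    # "photo.jpg" is a placeholder. A slightly blurred, re-encoded copy usually
    # produces the exact same 64-bit fingerprint even though the files differ.
    Image.open("photo.jpg").filter(ImageFilter.GaussianBlur(1)).save("photo_blur.jpg")
    print(hex(average_hash("photo.jpg")), hex(average_hash("photo_blur.jpg")))

Apple's NeuralHash uses a neural network rather than pixel averaging, but the trade-off is the same: robustness to small changes means different images can end up with the same hash.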

Misuse by authoritarian governments

This system can detect any image as long as the hash is present in the database. There is no technical requirement for those hashes to be CSAM. For example, it's possible to detect political campaign posters or similar images on users' devices by extending the database.

Collision attacks

Given that it's possible to generate a false positive, it is also possible to deliberately create images that match a given hash. So, for example, someone who wants to get another person in trouble can send them innocent-looking images (like images of kittens) and manipulate those images to match a hash of known CSAM.

This site is a proof of concept for collision attacks. The kitten images have been manipulated to match the hash of the dog image (59a34eabe31910abfb06f308), so every image shown on this page produces the same hash when run through Apple's NeuralHash algorithm. Asuhariet Ygvar created a GitHub repo with instructions so you can verify this for yourself [5].
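
If you want to check the collision yourself, the sketch below condenses the verification flow from [5] into one Python snippet. It assumes you have already extracted the NeuralHash model, converted it to model.onnx, and copied the neuralhash_128x96_seed1.dat seed file as described in that repo's README; the exact file names, preprocessing, and the 128-byte seed header are details taken from that repo, not from Apple.

    import sys
    import numpy as np
    import onnxruntime
    from PIL import Image

    def neuralhash(path, session, seed):
        # Preprocess: 360x360 RGB, scaled to [-1, 1], NCHW layout.
        img = Image.open(path).convert("RGB").resize((360, 360))
        arr = np.asarray(img, dtype=np.float32) / 255.0 * 2.0 - 1.0
        arr = arr.transpose(2, 0, 1).reshape(1, 3, 360, 360)
        # Run the network and project its 128-dim output through the 96x128
        # seed matrix; the sign of each projected value is one hash bit.
        out = session.run(None, {session.get_inputs()[0].name: arr})[0].flatten()
        bits = "".join("1" if v >= 0 else "0" for v in seed.dot(out))
        return "{:0{}x}".format(int(bits, 2), len(bits) // 4)

    session = onnxruntime.InferenceSession("model.onnx")
    raw = open("neuralhash_128x96_seed1.dat", "rb").read()[128:]  # skip header
    seed = np.frombuffer(raw, dtype=np.float32).reshape(96, 128)

    for path in sys.argv[1:]:
        print(neuralhash(path, session, seed), path)

If everything is set up as in [5], running this on the dog image and on one of the kittens from this page should print the same hash for both: 59a34eabe31910abfb06f308.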

What can I do, and can I opt out?

Apple has stated that if a user doesn't use iCloud Photos, no part of the CSAM detection process runs. So you can opt out by disabling iCloud Photos. The downside is that your photos are no longer synced between your devices or backed up to the cloud.

You can sign the Open Letter Against Apple's Privacy-Invasive Content Scanning Technology [6] or the Electronic Frontier Foundation's petition [7].


Sources:

  1. https://www.apple.com/child-safety/
  2. https://twitter.com/matthew_d_green/status/1423071186616000513
  3. https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life
  4. https://cdt.org/insights/international-coalition-calls-on-apple-to-abandon-plan-to-build-surveillance-capabilities-into-iphones-ipads-and-other-products/
  5. https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
  6. https://appleprivacyletter.com/
  7. https://act.eff.org/action/tell-apple-don-t-scan-our-phones

Credits: