Apple's New Child-Protection Features
A few key points on image detection/reporting:

- Detection happens on your device, not in iCloud
- Hashes of each image are compared only against a database of hashes of known abuse images
- A safety voucher containing the match metadata is uploaded with each photo, and that's what Apple uses to report potential abuse
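The matching step above can be sketched as a simple set lookup. This is a toy version with hypothetical helper names: it uses SHA-256, which only matches exact byte-for-byte copies, whereas Apple's actual system uses NeuralHash (a perceptual hash that tolerates resizing and re-encoding) combined with a private set intersection protocol so the device never learns the contents of the database.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Hash raw image bytes. A stand-in for a perceptual hash like
    NeuralHash; SHA-256 only matches identical files."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes, known_hashes: set) -> bool:
    """On-device check: does this photo's hash appear in the database
    of hashes of known images?"""
    return image_hash(image_bytes) in known_hashes
```

In the real system, a match is not reported immediately; it is recorded in the encrypted safety voucher, and Apple can only decrypt vouchers once a threshold number of matches is reached.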
A few key points on the new parental controls in Messages:

- Apple uses on-device machine learning to detect nudity
- Before viewing or sending a flagged image, the user/child is warned and given the choice to proceed or not
- For accounts of children 12 and under, the parent can be notified if the child decides to proceed
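The decision flow above could be sketched roughly as follows. Every name here is hypothetical, and the real logic runs on-device inside Messages; this only illustrates how the warning and parental-notification rules compose.

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    parent_notifications_enabled: bool  # parental opt-in setting (assumption)

def handle_flagged_image(child: ChildAccount, flagged_as_nudity: bool,
                         child_chooses_to_proceed: bool) -> dict:
    """Sketch of the Messages flow: blur and warn on a flagged image,
    and notify the parent only for young children who proceed anyway."""
    result = {"blurred": False, "warned": False, "parent_notified": False}
    if not flagged_as_nudity:
        return result
    # The image is blurred and the child is warned before viewing/sending.
    result["blurred"] = True
    result["warned"] = True
    if (child_chooses_to_proceed
            and child.age <= 12
            and child.parent_notifications_enabled):
        result["parent_notified"] = True
    return result
```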
This is one of those incredibly murky issues that do not have clear answers. It’s a tradeoff between protecting children and protecting privacy. Ignoring either enables terrible outcomes—on the one hand by sexual predators, on the other by abusive governments.
Ben Thompson of Stratechery thinks that Apple made a mistake in their approach because they violated the assumption that the user owns their device; a better tradeoff would be to scan in iCloud, similar to how Facebook scans images uploaded to their servers.
As far as I know, Facebook cannot scan images in Messenger when end-to-end encryption is enabled since the scanning happens in the cloud—not on the device.
Ben Thompson and John Gruber had a good conversation about Apple’s future privacy plans on a recent episode of the Dithering podcast. They speculated that Apple’s decision to perform more work on-device (rather than in iCloud) could be laying the groundwork for enabling end-to-end encryption in iCloud in the future. Currently, iCloud backups are encrypted, but not end-to-end: Apple holds the keys, making the backups available to requests from law enforcement.
I’m glad that Apple is doing something, and that their solution isn’t easy to bypass by default, unlike Facebook’s approach, where enabling end-to-end encryption sidesteps the scanning entirely. Still, Ben Thompson makes a good point: maybe it would have been better not to compromise the device, and to do the scanning in iCloud instead (with the Messages protections as something parents could choose to enable).
There were no rules at the start of the internet, but there were also few users, and the technology itself made it difficult for criminals to hide. It’s not the same place it once was. As in many societal debates, we need to find the right balance between safety and freedom.
From the kids’ perspective, I’m sure they’ll find creative ways to circumvent any censorship/controls their parents put in place. We don’t have to make it easy, though. Give them a challenge.