Apple’s surveillance is troubling, despite good intentions
While this system has a noble goal, it’s ripe for abuse.
Ben Kaplan
Aug. 23, 2021 6:00 am
Apple has spent years promising consumers “what happens on your iPhone, stays on your iPhone.” Controversial features built into iOS 15 walk back that promise, and will turn your phone into a surveillance device.
On Aug. 5 Apple announced three new features to combat childhood sexual abuse material (CSAM). The first is uncontroversial: if you ask Siri about CSAM, she’ll direct you to get treatment. The second is a suite of new iMessage features that will alert parents of children under 12 if their child receives a sexually explicit photo and opens it, and will blur sexually explicit images for accounts of users under 17 unless they choose to view them. Teenagers’ accounts will not alert parents’ accounts if they view the sexually explicit photos. These features are opt-in for parents and do not share message content between accounts.
The third feature is much more controversial, invasive and technically complicated. Beginning with iOS 15, Apple will include technology on your iPhone that scans your iCloud Photo Library for CSAM and, if it finds enough matches, reports that back to Apple. This only happens if you use iCloud to store photos, but the scanning happens on your personal device.
The tech behind this is complicated, but I’ll try to explain it as simply as I can. Apple has partnered with the National Center for Missing and Exploited Children (NCMEC), which maintains a database of known CSAM. Apple has created a technology called NeuralHash, which creates what are essentially serial numbers for images. When you upload photos to iCloud, NeuralHash will scan your images and generate those serial numbers. If it finds a certain number of matches between the NCMEC database and photos you’ve uploaded to iCloud, the system will disable your account, send those images back to Apple for “human review,” and then send them to NCMEC.
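To make that matching idea concrete, here is a minimal, purely illustrative Python sketch. It is not Apple’s actual NeuralHash, which is built on a neural network and additional cryptographic safeguards; the average_hash fingerprint, the scan_upload helper and the MATCH_THRESHOLD value are all invented for illustration. The point is only that photos are reduced to fingerprints, compared against a database of known fingerprints, and a report is triggered only once the number of matches crosses a threshold.

```python
# Illustrative sketch only -- not Apple's NeuralHash. Function names and the
# threshold below are hypothetical.

from typing import List, Set

MATCH_THRESHOLD = 30  # hypothetical; Apple has not published an exact figure here


def average_hash(pixels: List[List[int]]) -> int:
    """Compute a toy perceptual hash ("serial number") from an 8x8 grayscale image.

    Each bit records whether a pixel is brighter than the image's average,
    so visually similar images produce similar fingerprints.
    """
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def scan_upload(photo_hashes: List[int], known_hashes: Set[int]) -> bool:
    """Return True if enough uploaded photos match the known-hash database.

    Mirrors the article's description: the scan compares fingerprints against
    a database, and only crossing a threshold of matches triggers a report.
    """
    matches = sum(1 for h in photo_hashes if h in known_hashes)
    return matches >= MATCH_THRESHOLD
```

In the real system the database of fingerprints ships to the device in an unreadable form and the comparison happens on the phone itself, which is exactly the design choice the rest of this column objects to.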
CSAM is the most disgusting and objectionable material in the world. It’s hard to argue that a system that roots out and reports people who distribute it is a bad thing. But while this system has a noble goal, it’s ripe for abuse.
Bruce Schneier, a cybersecurity expert and board member at the Electronic Frontier Foundation, said, “It opens the door for all sorts of other surveillance,” and described the new features as a “security disaster.” His sentiments have been echoed by dozens of other cybersecurity experts since Apple announced these features.
It’s important to understand what Apple is doing here, because while they are initially deploying this technology to be used against CSAM, it could be applied much more broadly.
Google, Facebook and other major cloud storage providers already scan for CSAM on their servers. Apple doesn’t. Server-side CSAM scanning is more than a decade old and uncontroversial. When you upload a photo to these services, they use software similar to NeuralHash to check for CSAM, but they don’t look at the content on your phone, just the content you upload.
Instead of deploying server-side CSAM scanning, Apple built a backdoor into iOS. It’s as if, instead of scanning you at the airport, the TSA were inside your house rifling through your stuff.
Starting with iOS 15, Apple will include a black box database of offensive material on your phone. If you use Apple’s cloud services, your phone will scan your data to see if it matches this offensive material, and if it does, Apple will report you to the authorities. Currently the scan only happens if you use iCloud Photos, but that’s a policy choice, not a technological limitation. The scan happens on your phone, using your phone’s processor, against a database stored on your phone. Apple could decide at any time to change the system to scan the contents of your phone regardless of what you upload.
Apple has said they will simply refuse to add content besides CSAM to the database, and refuse to change the system to scan content that isn’t being uploaded to iCloud. That’s not good enough. They’re debuting this tech in the U.S. but plan to roll it out in other countries. Apple has consistently bent to the will of China to undermine security and privacy features in that country, and now we have to take their word that they won’t let China or other authoritarian regimes abuse this feature.
The problem with this technology is that it makes users completely dependent on Apple to behave ethically and honestly about how it is deployed, while turning their phones into surveillance tools.
Apple has spent the last few years aggressively adding privacy controls to their products and marketing those features. This change is a complete reversal of that commitment to user privacy. Starting with iOS 15, if you use iCloud Photos, your iPhone will actively surveil the content of your device, on your device. That’s a paradigm shift in user privacy.
Ben Kaplan is a freelance photographer and writer who lives in Cedar Rapids.