Strong encryption provides privacy and security for everyone online. We can’t have private conversations, or safe transactions, without it. Encryption is critical to democratic politics and reliable economic transactions around the world. When a company rolls back its existing commitments to encryption, that’s a bad sign.
In August, Apple made a startling announcement: the company would be installing scanning software on all of its devices, which would inspect users’ private photos in iCloud and iMessage.
This scanning software, intended to protect children online, effectively abandoned Apple’s once-famous commitment to security and privacy. Apple has historically been a champion of encryption, a feature that would have been undermined by its proposed scanning software. But after years of pressure from law enforcement agencies in the U.S. and abroad, it appears that Apple was ready to cave and provide a backdoor to users’ private data, at least when it comes to photos stored on their phones.
At EFF, we called out the danger of this plan the same day it was made public. There is simply no way to apply something like “client-side scanning” and still meet promises to users regarding privacy.
Apple planned two types of scanning. One would have used a machine learning algorithm to scan the phones of many minors for material deemed to be “sexually explicit,” then notified the minors’ parents of the messages. A second system would have scanned all photos uploaded to iCloud to see whether they matched a photo in a government database of known child sexual abuse material (CSAM). Both types of scanning would have been ripe for abuse, in the U.S. and especially abroad, in nations where censorship and surveillance regimes are already well-established.
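To see why “client-side scanning” is scanning of private data no matter where the match happens, consider a minimal conceptual sketch of hash matching on the device before upload. This is not Apple’s actual implementation (which used a proprietary perceptual hash and additional cryptographic protocols); the hash database, the “Photos” folder, and the use of an exact SHA-256 fingerprint here are illustrative assumptions only.

```python
# Conceptual sketch of client-side hash matching -- NOT Apple's actual system.
# Real deployments use perceptual hashes (which match visually similar images)
# rather than exact cryptographic hashes, and wrap the comparison in further
# cryptographic protocols. The database and folder below are hypothetical.
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of known prohibited images, shipped to the device.
KNOWN_IMAGE_HASHES = {
    "placeholder-hash-value-1",
    "placeholder-hash-value-2",
}

def image_fingerprint(path: Path) -> str:
    """Return a fingerprint of the image file (exact SHA-256 for simplicity)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_before_upload(photo_paths: list[Path]) -> list[Path]:
    """Flag any photo whose fingerprint appears in the on-device database."""
    return [p for p in photo_paths if image_fingerprint(p) in KNOWN_IMAGE_HASHES]

if __name__ == "__main__":
    flagged = scan_before_upload(list(Path("Photos").glob("*.jpg")))
    print(f"{len(flagged)} photo(s) matched the database")
```

The key point the sketch makes clear: whoever controls the contents of the on-device database controls what gets flagged, and nothing in the mechanism itself limits that database to CSAM.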
EFF joined more than 90 other organizations to send a letter to Apple CEO Tim Cook asking him to stop the company’s plan to weaken security and privacy on Apple’s iPhones and other products. We also created a petition where users from around the world could tell Apple our message loud and clear: don’t scan our phones!
More than 25,000 people signed EFF’s petition. Combined with petitions circulated by Fight for the Future and OpenMedia, nearly 60,000 people in total told Apple to stop its plans to install mass surveillance software on its devices.
In September, we delivered those petitions to Apple. We held protests in front of Apple stores around the country. We even flew a plane over Apple’s headquarters during its major product launch to make sure its employees and executives got our message. After the unprecedented public pushback, Apple agreed to delay its plans.
A Partial Victory
In November, we got good news: Apple agreed to cancel its plan to send notifications to parent accounts after scanning iMessages. We couldn’t have done this without the help of tens of thousands of supporters who spoke out and signed the petitions. Thank you.
Now we’re asking Apple to take the next step and not break its privacy promise with a mass surveillance program to scan user phones for CSAM.
Apple’s recent ad campaigns, with slogans like “Privacy: That’s iPhone,” have sent powerful messages to its more than one billion users worldwide. From Detroit to Dubai, Apple has said it in dozens of languages: the company believes privacy is “a fundamental human right.” It has sent this message not just to liberal democracies, but also to people who live in authoritarian regimes, and countries where LGBTQ people are criminalized.
It’s understandable that companies don’t want their cloud-based systems misused, including to store illegal images. No one wants child exploitation material to spread. But rolling back commitments to encryption isn’t the answer. Abandoning encryption to scan images against a government database will do far more harm than good.
As experts from around the world explained at the EFF-hosted Encryption and Child Safety event, once backdoors to encryption exist, governments can and will use them to go well beyond scanning for CSAM. These systems can and will be used against dissidents and minorities. We hope Apple will sidestep this dangerous pressure, stand with users, and cancel its photo scanning plans.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2021.