Your Phone Is Your Private Space

Privacy is a set of curtains drawn across the windows of our lives. And technology companies are moths that will chew through more of the fabric every year if we let them, and especially if we encourage them.

An American who stores accumulated photographs in a spare bedroom or attic or self-storage space correctly presumes that those albums of visual keepsakes are off-limits to other people. Though an exhaustive inventory of the nation’s snapshots would include a minuscule percentage showing illegal acts, it would be outrageous if a photo-album manufacturer or a storage-facility employee insisted that, like it or not, their business would be investigating all of your photos, just to make sure that nothing in your collection was unlawful or immoral. We expect to be presumed innocent of nefarious acts, absent evidence to the contrary—and to be spared public or private agents rifling through our personal effects.

But that norm may not survive the digital era. For now, Apple has backed off its plan, announced last month, to install scanning software on all U.S. iPhones to help identify the vile but tiny subset of people who keep child pornography on their devices. The company will take time to “make improvements” on the idea, The Wall Street Journal reported. But ultimately, whether to proceed remains at the company’s discretion.

Apple’s original plan was to scan the images of every customer who had the company’s iCloud storage service activated. Suspected sex offenders were to be flagged and reported to the relevant authorities. “Most cloud providers already scan user libraries for this information,” TechCrunch explained after the announcement. “Apple’s system is different in that it does the matching on device rather than in the cloud.”

The civil libertarians who objected to the plans raised questions that remain important: Do we want a society in which the devices that we purchase—and that we must carry to navigate the modern world—are equipped to continuously and remotely monitor us for criminal acts? Or should Americans enjoy undiminished privacy in our personal effects, including photos, when we keep them in digital rather than physical form?

Even if technology companies load all consumer devices with intrusive spyware, sex crimes won’t go away, alas. Images of child sex abuse long predate digital photography, and even in the digital sphere, the technological arms race between criminals and cops will never end. It will grow only more sophisticated and distant from the computing practices of the masses. But the typical person’s sphere of personal privacy will shrink dramatically if we acclimate to a future in which everyone is regularly subject to digital stop-and-frisks.

“We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content,” Will Cathcart, the head of the Facebook-owned messaging app WhatsApp, declared on Twitter. “It’s not how technology built in free countries works.”

Under the system that Apple outlined last month, the initial intrusion on privacy would be minimal. If a user deactivated iCloud, the software on their device wouldn’t process their images at all, because its ostensible purpose would be to stop child-sex-abuse images from being uploaded to iCloud. And for the majority of iPhone owners who use iCloud, images would be made known to Apple only if the software flagged them as matching ones in nonprofit groups’ databases of material known to be exploitative.
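To make the mechanics concrete, here is a minimal sketch, in Python, of the gating logic described above. The names (toy_perceptual_hash, should_flag, KNOWN_ABUSE_HASHES) and the simple hash-set lookup are illustrative assumptions, not Apple’s design; the actual proposal relied on a proprietary perceptual hash (NeuralHash), a blinded on-device copy of the database, and a reporting threshold, none of which is reproduced here.

```python
import hashlib

# Hypothetical stand-in for the nonprofit groups' database of hashes of
# known abuse material; in Apple's proposal the device would hold a
# blinded copy of such a database.
KNOWN_ABUSE_HASHES: set[str] = set()

def toy_perceptual_hash(pixels: list[int]) -> str:
    """Toy stand-in for a perceptual hash: threshold each grayscale
    pixel against the image's mean brightness, then hash the resulting
    bit pattern. Real perceptual hashes are engineered so that visually
    similar images produce the same code; this toy version is not."""
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p >= mean else "0" for p in pixels)
    return hashlib.sha256(bits.encode()).hexdigest()

def should_flag(pixels: list[int], icloud_enabled: bool) -> bool:
    # With iCloud deactivated, the image is never examined at all.
    if not icloud_enabled:
        return False
    # Otherwise the image's hash is checked against the database of
    # known material; only a match would ever be surfaced to Apple.
    return toy_perceptual_hash(pixels) in KNOWN_ABUSE_HASHES
```

The sketch exists to show the two branches that matter for the argument: turning iCloud off takes a photo out of scope entirely, while leaving it on routes every photo’s hash through the comparison.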

If the matter ended there, it would be of little consequence, good or bad. All but the least technologically savvy predators would simply deactivate iCloud and avoid getting caught, sharply reducing the public benefit of this change. As for future costs, however, Apple’s proposed approach embraced at least three worrisome premises: that we don’t fully own the devices that store so much private information about us; that tech giants that sell us those devices can ethically load them with spyware; and that the evil deeds of a tiny fraction of users justify the mass surveillance of data that millions of totally innocent users put on their phones.

If Apple accepts those premises, and most of its customers go along without objecting, then future iPhones will almost inevitably scan for more than child porn. The logic of catching a few evil actors by denying the cloak of privacy to everyone will inexorably expand to more and more areas that powerful societal factions want to target. Some of those factions will themselves be evil. Many are likely to be illiberal or repressive.

As Edward Snowden, who famously revealed the NSA’s mass surveillance of innocent people, asked on Substack,

What happens when a party in India demands they start scanning for memes associated with a separatist movement? What happens when the UK demands they scan for a library of terrorist imagery? How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering “extremist” political material, or about your presence at a “civil disturbance”?

Geopolitical pressures of this sort are unavoidable. Multiple governments are already moving to obliterate or significantly undermine digital-privacy rights, even in liberal democracies. This summer alone, analysts at the Electronic Frontier Foundation, a leading digital-civil-liberties advocacy organization, have raised alarms about new or proposed laws in India, Brazil, and Canada.

And as Apple itself has argued on past occasions, “the only way” to guarantee that a powerful tool for surveilling devices isn’t abused and doesn’t fall into the wrong hands is “to never create it.” Yet now, Apple has effectively announced to every government in the world: We have, or can develop, the technical capacity to scan all iPhones for anything, and under sufficient pressure, we may use that power against our own customers to placate outside authorities. (The company did insist that if governments ask it to expand its spying, “Apple would refuse such demands.”)

Another risk is that the technology would flag some of the innocents that it spies on. The computer scientists Jonathan Mayer and Anunay Kulshrestha, who wrote a peer-reviewed paper on how to build a child-porn detection system like the one Apple proposed, published a Washington Post op-ed that explains why they came to believe the method is dangerous. “The content-matching process could have false positives,” they wrote, “and malicious users could game the system to subject innocent users to scrutiny. We were so disturbed that we took a step we hadn’t seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides.”

Like-minded critics of the technology have gone so far as to post open-source demonstrations of how bad actors could trick innocents into possessing innocuous images that have been manipulated to match the numeric codes, known as hashes, that nonprofit groups’ databases use to identify known child-sexual-abuse material. Using such techniques, a sophisticated foreign-intelligence service (or perhaps a teenage troll) could theoretically cause trouble for American iPhone users—which is to say, a substantial percentage of both average citizens and business, government, and academic leaders.
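The demonstrations themselves target real perceptual-hash systems, but the underlying idea can be shown with a hedged toy example. The hash function below repeats the toy sketch from earlier, and the pixel values are invented purely for illustration.

```python
import hashlib

def toy_perceptual_hash(pixels: list[int]) -> str:
    # Same toy hash as in the earlier sketch: threshold each pixel
    # against the mean brightness, then hash the bit pattern.
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p >= mean else "0" for p in pixels)
    return hashlib.sha256(bits.encode()).hexdigest()

# Two visibly different "images": one high-contrast, one nearly flat.
# Both threshold to the bit pattern 0101, so they collide under the
# toy hash. Attacks on NeuralHash-style systems are far subtler, but
# the principle is identical: only the hash has to match, not the
# picture a human would see.
benign_looking = [10, 200, 10, 200]
crafted        = [90, 110, 90, 110]

assert toy_perceptual_hash(benign_looking) == toy_perceptual_hash(crafted)
```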

MIT Technology Review reported that Apple told journalists “the new systems cannot be misappropriated easily by government action—and emphasized repeatedly that opting out was as easy as turning off iCloud backup.” (The company posted a lengthy response to the broad criticism it faced after its initial announcement.)

But even if the company shelves the proposed system for now, advocacy groups, democratic governments, and dictatorships alike will continue to pressure tech companies to play a much greater role in suppressing numerous kinds of material. Without the bulwark of control over our own devices and the attendant expectation of privacy in our personal effects, users will have a far harder time protecting the zone of privacy that individuals enjoy in just societies—or conserving the democratizing effect technology has on information in authoritarian societies. Yes, government agents should aggressively pursue people who create child porn and infiltrate networks of people who sell or trade it. Facebook, YouTube, Pornhub, and every other platform on which content is publicly posted should marshal technology––perhaps this technology––to identify and remove child porn. But spyware on our personal devices goes too far.

This article was originally published in The Atlantic.

source: NextGov