In the aftermath of the Paris attacks, encryption technology has once again faced widespread criticism for potentially aiding terrorist actions.
But all too often the debate around digital privacy and security frames cryptography as a choice between two extremes: either we have strong cryptography that enables truly secret communications or we have safety from terrorists through pervasive Internet surveillance.
Given our current discourse on the topic, it will be nearly impossible to strike the right balance of privacy and lawful access to information. The myths and fears stemming from both sides of the debate make useful discussion impossible because we are left with a choice between a mythical state of privacy and a dystopian present where security trumps online rights.
Over the last few years, advocacy for strong encryption has prompted companies to encrypt communications by default. Left unchecked, it seems that encryption will continue to prevent legitimate criminal and national security investigations. With a public still angry about the surveillance overreach revealed by Edward Snowden, the government’s worries about the unchecked expansion of cryptography have largely fallen on deaf ears.
That's why there's a desperate need to reframe the debate around encryption – and that starts with rejecting advocacy for pervasive and ubiquitous cryptography as well as the overreaching state demands for wholesale surveillance. Instead, solutions should leverage strong democratic controls and collective decision-making.
So-called "cyberlibertarians" have fabricated a lie that our digital communications infrastructure was built and thrived as a public sphere – free from security and surveillance, and fundamentally open to all. Mr. Snowden reminisced about using the Internet before the intrusion of the state – a historical fiction. Organizations such as the Electronic Frontier Foundation promote manifestos about the "independence of cyberspace," which are part of the same mythology.
The truth is that there has always been a negotiation between Internet users, state actors, and private firms over how Internet infrastructure will be put to use, and what kind of privacy resources can be drawn on.
Government actors have claimed that strong cryptography will enable terrorists to operate freely behind a veil of encrypted communication. After the Charlie Hebdo attacks earlier this year, British Prime Minister Cameron and President Obama sought a legally mandated "backdoor" in cryptographic technologies – a mechanism that would permit lawful surveillance. Mr. Obama later backed down from this position, saying that intelligence agencies would work with technology companies to find a balance of privacy and lawful access. Now, in the wake of the Islamic State attacks in France, lawmakers around the world are renewing campaigns against the growing use of encryption.
In many ways, the cryptography battle parallels the divisive gun control debate. The problem with both of these arguments is that they are premised on the illusion that technology is neutral.
This view is epitomized by the National Rifle Association's claim that "guns don't kill people, people kill people," when, in fact, technologies embody social and human values and encourage particular uses. We can choose the technologies we live with, and determine how we want to interact with them, through democratic controls.
We should approach security and privacy technologies with this mindset.
There's already been a shift in the gun debate toward more reasoned solutions and compromises. The cryptography debate, however, has not even progressed this far (the best accounts at least recognize that the privacy and security binary is a false dichotomy). Instead, the EFF looks like an outdated NRA with its unflinching promotion of cryptography, even as many state actors actively work to attack and undermine its efforts. Worse still, the recommendation for a cryptographic "backdoor" misses the target entirely, advocating a technological fix for a social problem.
The idea that we can solve the problems cryptography raises by creating better or subtler cryptography is akin to asking gun manufacturers for more accurate guns.
So, what's the answer? First, we should recognize that privacy and security have historically been understood as trade-offs amidst efforts to sustain peaceful society in a dangerous world. Our countries have spies, population statisticians, emergency powers laws, and police capable of search and seizure. Only fringe elements would argue for dismantling these mechanisms that help assure a social contract and good behavior. But when it comes to digital communications, and when it comes to these same trade-offs in cyberspace, we retreat to fantasy worlds.
Second, we should recognize that the Internet may give the appearance of a public sphere, but it is also, and more importantly, an infrastructure. It is an infrastructure created by and for governments, subsequently transferred to the private sector, and maintained by engineering task forces and other technocratic groups. The notion that there is a "cyberspace" or a "cloud" somehow separate from this infrastructure is unrealistic. In fact, online rights are an illusion that masks marketing decisions. The Internet's routers, protocols, fiber optic cables, and platforms are all owned – and in no way free.
This doesn't mean we should slide into political apathy and merely accept what private and state actors are selling, but we should not expect an "independence of cyberspace" either. Resisting this illusion means understanding how social, economic, and state actors actively constitute communications infrastructure.
Reframing the cryptography debate would mean thinking about privacy not as a fundamental property of some kind of ethereal "cyberspace," but rather as an ongoing negotiation of built infrastructure and social values.
There are no easy solutions. But moving the debate away from the current divisiveness suggests some possibilities.
Immediately, and practically, there is good evidence that "old fashioned" police work is effective. Surveillance dragnets and remote code execution are convenient, but some things are meant to be hard, and this is OK. When a water main bursts it takes time, money, and human effort to fix, but rather than seeing such work on our infrastructure as evidence of government overreach, we recognize its immediate need and public benefit.
Similarly, we do not tend to tolerate failures within – or attacks on – our critical infrastructure. Public investment to strengthen and protect Internet infrastructure should be no different. And, as an infrastructure like any other, control over the Internet should be subject to democratic control. We do not expect a "hands-off" approach to our water or roads, so why expect it with digital communications?
Finding a balance between private interests, state control, and democratic interventions should be the goal of modern Internet governance.
Quinn DuPont studies cryptography and technoculture at the University of Toronto’s Faculty of Information. Previously, he worked at IBM. Follow him on Twitter @quinndupont.