Modern field guide to security and privacy

Opinion: Why the US government must lose cryptowars 2.0

Law enforcement’s argument is just as flawed now as it was in the 1990s. We cannot bend software or cryptography to our will – technology is science, not magic. 

Kevin Lamarque/Reuters/File
FBI Director James Comey testifies during a Senate Judiciary Committee hearing on 'Going Dark: Encryption, Technology, and the Balance Between Public Safety and Privacy,' July 8.

In the 1990s, I was in high school. I coded my first website as a college freshman in 1997. Two years later, I joined Geekcorps as an R&D intern. We sent tech geeks to Ghana, where they would volunteer with local businesses. I thought that was so cool. 

I had no idea that in Silicon Valley, and in Senate hearings in Washington, technologists were fighting for the right to create the basic building blocks of the Internet.

My older friends in the security world have told me countless battle stories about fighting "the cryptowars.” Now we chat openly at hacker conferences or their fancy corporate offices. But back then, they were building Pretty Good Privacy, known as PGP, which became one of the most widely used tools for encrypting communications. They would take their servers home at night, fearing the FBI would break into their offices and seize the code. Export controls made it illegal to ship crypto code overseas, so they printed the PGP source code in book form, and senior executives mailed it to a bookstore in Europe. As e-commerce and other online activities became more mainstream, the restrictions – and security pros' paranoia! – relaxed. 

Yet these stories from battle-scarred friends and advisors especially resonate because I recently founded a security company of my own. I also focus on building encryption software. Why? Because I believe that strong encryption protects valuable American intellectual property from hackers and adversaries overseas – one of the most critical problems facing both startups and large corporations. Because the American economy is now powered by online tools, and those tools need to be secure. 

But now, with FBI and National Security Agency leaders pushing Silicon Valley technologists to weaken their encryption so the US government can more easily access the protected data, it’s clear that while I may have missed the drama of the '90s, I won’t be able to escape the cryptowars redux of the 2010s. In fact, it's already affecting how I build my business. My conversations with lawyers and potential investors inevitably address the strong possibility that I will move my company (or big chunks of it) overseas. The lawyers tell me that it is the safest approach. Seasoned security executives tell me such a move would be reassuring to my customers.   

The battlefield landscape has changed since the '90s: Back then, encryption for commercial use was just starting to take off. These days, strong encryption powers our banking and e-commerce, and is increasingly implemented by major consumer tech companies. Apple said that devices running its new software would be encrypted by default. Even the company itself is unable to gain access to its customers' protected data. And Google made headlines last year when it announced that "full-disk encryption," which protects user information on its Android devices, would be enabled by default.

The technology ecosystem may have changed over the last twenty years, but the ask from the national security establishment is, essentially, the same demand it made during the first round of the cryptowars. Calling it “exceptional access” or a “golden key,” US officials want law enforcement offices to have special access to encrypted messages. They have relied on intercepting our communications as a way to find and prosecute criminal activity – missions they say strong encryption could thwart.

The FBI and NSA want tech companies, such as Apple and Google, to design their encryption so that the government would have a set of keys to access the otherwise secure data. Insisting that groups such as ISIS, foreign state spies and criminals here at home are taking advantage of secure communications employing encryption, FBI Director James Comey wants a "secure golden key" for law enforcement to access the content of encrypted communications if officers get a court-ordered warrant. NSA Director Adm. Michael Rogers has been more technically specific in his request, proposing a "split key” which would require the cooperation of multiple government agencies in order to use the key and decrypt the data. 
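Rogers has not published a technical design for his proposal, but the general idea resembles classic secret splitting: divide a key into shares so that no single holder can decrypt anything alone. As a minimal illustration – my own sketch, not the NSA's scheme – here is a two-share XOR split in Python, where both shares must be combined to recover the key:

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; both are required to reconstruct it."""
    share_a = secrets.token_bytes(len(key))               # uniformly random pad
    share_b = bytes(a ^ k for a, k in zip(share_a, key))  # key XOR pad
    return share_a, share_b

def combine_shares(share_a: bytes, share_b: bytes) -> bytes:
    """Recover the original key by XORing the two shares together."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = secrets.token_bytes(32)        # e.g., a 256-bit device key
a, b = split_key(key)
assert combine_shares(a, b) == key   # both shares together recover the key
```

Because each share is indistinguishable from random noise on its own, an attacker who steals one agency's share learns nothing – which is precisely why the proposal requires multiple agencies to cooperate. Real schemes (such as Shamir secret sharing) generalize this to any threshold of shares.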

I sympathize with this. If bad actors are using the encryption provided by my own company – criminals such as, say, child pornographers or violent terrorists – I would not wish to grant them safe harbor. 

But law enforcement’s argument is just as flawed now as it was in the 1990s. We cannot bend software or cryptography to our will. Technology is science, not magic. 

Government officials’ requests to weaken encryption are based on a fantasy of what technology could be – not the reality of what software is actually like in practice. And their backers, such as The Washington Post editorial board, are also swayed by it. Even President Obama, the same leader who has recruited top Silicon Valley talent to join him in the White House, wants to find a compromise.

The problem? It is not technically possible. There’s no such thing as a secure back door. The idea that the US government can have built-in access to encrypted data – while maintaining consumers’ security and privacy, and preserving American business – is flawed. Here’s why:  

1. The technical solution of a “golden key” would break the security of any sites or apps that are currently using best security practices. We cannot ignore the implications of weakening our websites and applications at a time when new data breaches are happening all the time. Alex Stamos, who is now Facebook's chief security officer, challenged Admiral Rogers on this issue earlier this year. “Bruce Schneier and Ed Felten and all of the best public cryptographers in the world would agree that you can’t really build back doors in crypto,” said Mr. Stamos, who at the time was Yahoo's chief information security officer. “That it’s like drilling a hole in the windshield.” 

2. Implementing a “golden key” would require tremendous resources from tech companies. It would add significant complexity to software. Complex software takes longer to build, and is much harder to test. Creating a secure software product that has insecurities deliberately built into its design is ... complex, to put it mildly. Since this “back door” is designed to be hidden rather than transparent (if it were easy to find, any hacker could use it), the testing process will be even more cumbersome. Even if we had a multimillion dollar budget and a long time frame, I’m not sure how we could accomplish this at my own company. And even if we were given complex technical specs to follow while building our products, this would be a huge burden not just for me, but for other startups and large companies alike.

What would happen to our economy if software across the industry became 150 percent more difficult and more expensive to build? Let’s consider the impact on innovation, jobs and the state of our economy. 

3. It is a tremendous security risk to store these "golden keys" within the government. What assurance do we have that this data will fare better than that stored at the White House or the Office of Personnel Management, which were both recently breached by hackers?

Rogers, the NSA director, proposes splitting up the keys to make it harder to hack. But that replaces this security risk with a bureaucratic nightmare. Will every startup be required to file keys with multiple government agencies? Most startups can barely stop putting out their own fires long enough to file their taxes. 

4. The OPM breaches have given countries such as China an ideal data set for turning American government agents into double agents. Even if we could protect our “golden keys," what would protect the agencies holding them from the ever-fallible human element? What if China succeeds in turning well-placed American personnel into double agents? How arrogant must we be to believe that any high-value target would be impenetrable? 

Let’s consider one more wrinkle in this debate. 

So far, the framing by the US government assumes a kind of patriotism and loyalty from US corporations. Yet for how many years have some of the largest US companies maintained headquarters in Dublin as a way of evading taxes here in the US? Companies are beholden to their shareholders – without national loyalties, responsible primarily to their bottom lines. What assurances do we have that companies would share their golden keys only with the US? 

And if other countries start to demand their own back doors, it would put US companies in a difficult position. “If we’re going to build defects/back doors or golden master keys for the US government, do you believe we should do so – we have about 1.3 billion users around the world – should we do for the Chinese government, the Russian government, the Saudi Arabian government, the Israeli government, the French government? Which of those countries should we give back doors to?” Stamos asked Rogers earlier this year. 

It’s a complicated problem, with no easy solution. Technologists are the first to admit this. 

But the FBI and the NSA must also concede that any entry into encryption available for their offices would also be an access point that could be abused by the country's adversaries.

To his opponents on this issue, the FBI's Mr. Comey has suggested technologists have simply not tried hard enough to reach a solution. “A whole lot of good people have said it’s too hard … Maybe that’s so,” he told the Intelligence Committee on July 8. “But my reaction to that is: I’m not sure they’ve really tried.” 

To Comey, I say this: Silicon Valley is full of "try."

The technology industry has responded to threats to customers’ security and privacy with agility, by building and implementing new tools to adapt to an ever-changing world.

Encryption is one of these tools. 

Famous technologists such as Stamos, Bruce Schneier, Johns Hopkins University's Matthew Green, and the University of Pennsylvania's Matt Blaze are hardly lazy. 

So, does law enforcement have a difficult job to do? Yes. Would it be helpful to the FBI if it had blanket access to all the information on the planet? Of course. 

But as long as security systems are hackable and humans are fallible, we must safeguard the integrity of software that powers our banking, our business and our medical communities. 

What's more, the fact that encryption is becoming mainstream – with end-to-end encryption now integrated into widely used applications like WhatsApp and iMessage – is a victory for the cybersecurity movement, and for this country’s security going forward. 

We cannot turn back the clock to a time when communications online were not safe. As the standard for secure communication gets higher, law enforcement must respond to this challenge with creativity, agility, and – above all – realism about the options available.

It is now law enforcement’s turn to "try harder."

Elissa Shevinsky is a serial entrepreneur and chief executive officer of Jekudo Privacy Company. She is also the editor of "Lean Out," published by OR Books, and the cofounder of the SecretCon security conference. Follow her on Twitter @ElissaBeth.

 
