Modern field guide to security and privacy

Opinion: How to defuse a simmering crypto war

In an Op-Ed provided by our partners at the Information Technology and Innovation Foundation, the director of the Cyber Security Policy and Research Institute at the George Washington University argues that engineering trust can help avoid a new battle over data encryption.

Customers at the opening of an Apple Store in China. (Reuters)

Watch a debate on balancing national security with privacy featuring White House Cybersecurity Coordinator Michael Daniel, Lance Hoffman, and others live on CSMonitor.com starting at 9 a.m. Thursday, March 12. If you are in Washington, you can register for the event here.

What is a nation's optimum cryptography policy? In 1993, the US government proposed it be allowed back-door access to communications via an encryption chip – the Clipper chip – built into computer systems. Under suitable conditions, the government would be able to decrypt any communication and thus thwart criminal activity. That proposal did not sit well with privacy advocates, civil libertarians, and others who saw it as overreach on the government’s part and a serious infringement on civil liberties.

After the Edward Snowden leaks and subsequent efforts by Google, Apple, and others to build strong encryption that makes it almost impossible for governments to break into their products, the federal government has reopened the cryptography discussion.

Many of the same issues that arose in the Clipper chip debate are being raised now – and many of the same solutions are being proposed, this time by the Obama administration. The tensions between national security, law enforcement, and civil liberties are now more obvious in a post-9/11 world, but the basic question is not new. Controlling the actions of people in positions of power was discussed by Plato in "The Republic."

A system with a back door built in is also a system with a built-in vulnerability, exploitable by friends as well as foes. Sometimes even systems thought to be secure have flaws that aren’t discovered or fixed for years, such as the recently disclosed FREAK exploit, which takes advantage of deliberately weakened “export-grade” encryption introduced in the 1990s to comply with US cryptography export regulations and which affected several popular web browsers.

Governments should examine the political, economic, and social costs of effectively mandating insecure operating systems, hardware, and standards. Witness governments such as Germany’s demanding “more secure” systems built in their own countries (with personal data kept there as well). Apple, Google, and other companies hear this. They know that trust, once lost, is hard to rebuild.

Technological responses by themselves are not solutions. But they can provide part of the solution to the puzzle, just as encryption already does in verifying nuclear test-ban treaty compliance. For example, technological methods exist that require a majority of "trusted" parties to agree before a key is made available. These trusted parties could include the phone manufacturer, the police, a civil liberties organization, a news organization, and others. Just how many parties should be involved and who they should be are difficult questions that require many viewpoints – global viewpoints – to be considered. 
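The threshold approach described above has a well-known mathematical basis in Shamir’s secret sharing, where a key is split into shares such that any k of n trustees can reconstruct it but fewer than k learn nothing. Below is a minimal illustrative sketch, not a production escrow system; the trustee labels and the prime field are assumptions chosen for the demo:

```python
# Sketch of k-of-n threshold key release via Shamir secret sharing:
# the key is recoverable only when at least k trustees combine shares.
import random

PRIME = 2**127 - 1  # a Mersenne prime large enough for a demo key

def split_secret(secret, n, k):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):  # evaluate polynomial via Horner's rule
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789
# Hypothetical trustees: manufacturer, police, court, civil liberties
# group, news organization -- any 3 of the 5 can release the key.
shares = split_secret(key, n=5, k=3)
assert recover_secret(shares[:3]) == key
assert recover_secret(shares[2:5]) == key
```

Note that the hard part is not the mathematics but the governance: deciding who holds shares and under what legal process they may combine them, which is precisely the multi-stakeholder question the article raises.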

And here is where much more dialogue among technologists, lawyers, and policymakers should be encouraged. Cross-disciplinary thought is woefully underfunded; that’s one reason for the current chaotic state of affairs. Research by independent parties on the economic, political, and social costs and benefits of surveillance mechanisms, not only on the mechanisms themselves, should provide a starting point and will help realize the Internet’s potential for building bridges, as opposed to barriers, between cultures.

Lance J. Hoffman is director of the Cyber Security Policy and Research Institute at The George Washington University in Washington. Follow him on Twitter @LanceHoffman1.
