Hacker or spy? In today's cyberattacks, finding the culprit is a troubling puzzle

The Sony hack revealed the challenges of identifying perpetrators of cyberattacks, especially as hackers can masquerade as government soldiers and spies, and vice versa. It's a dangerous new dynamic for foreign relations, especially as what governments know about hackers – and how they know it – remains secret.

The vigorous debate after the Sony Pictures breach pitted the Obama administration against many of us in the cybersecurity community who didn't buy Washington's claim that North Korea was the culprit. 

What's both amazing – and perhaps a bit frightening – about that dispute over who hacked Sony is that it happened in the first place.

But what it highlights is the fact that we're living in a world where we can’t easily tell the difference between a couple of guys in a basement apartment and the North Korean government with an estimated $10 billion military budget. And that ambiguity has profound implications for how countries will conduct foreign policy in the Internet age.

Clandestine military operations aren’t new. Terrorism can be hard to attribute, especially the murky edges of state-sponsored terrorism. What’s different in cyberspace is how easy it is for an attacker to mask his identity – and the wide variety of people and institutions that can attack anonymously.

In the real world, you can often identify the attacker by the weaponry. In 2007, Israel attacked a Syrian nuclear facility. It was a conventional attack – military airplanes flew over Syria and bombed the plant – and there was never any doubt who did it. That shorthand doesn’t work in cyberspace.

When the US and Israel attacked an Iranian nuclear facility in 2010, they used a cyberweapon, and their involvement was a secret for years. On the Internet, technology broadly disseminates capability. Everyone from lone hackers to criminals to hypothetical cyberterrorists to nations’ spies and soldiers is using the same tools and the same tactics. Internet traffic doesn’t come with a return address, and it’s easy for an attacker to obscure his tracks by routing his attacks through some innocent third party.
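
To make the “no return address” point concrete, here is a minimal sketch – not from the original essay – in Python using the third-party Scapy library. It assumes Scapy is installed and the script has raw-socket privileges, and the addresses are placeholders drawn from documentation ranges; it simply shows that the source field of an IP packet is whatever the sender chooses to write into it.

```python
# Illustrative sketch only: assumes the third-party Scapy library is installed
# and the script runs with raw-socket (root) privileges. Addresses are
# placeholders from the documentation ranges 203.0.113.0/24 and 198.51.100.0/24.
from scapy.all import IP, TCP, send

# The "return address" (src) is just a field the sender fills in. Nothing in
# the protocol verifies it, so it can point at any innocent third party.
packet = IP(src="203.0.113.5", dst="198.51.100.7") / TCP(dport=80, flags="S")

send(packet, verbose=False)  # the packet is transmitted exactly as written
```

Any reply goes to the spoofed address rather than to the real sender, which is the same property attackers exploit when they launder their traffic through compromised third-party machines.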

And while it now seems that North Korea did indeed attack Sony, the attack it most resembles was conducted by members of the hacker group Anonymous against a company called HBGary Federal in 2011. In the same year, other members of Anonymous threatened NATO, and in 2014, still others announced that they were going to attack ISIS. Regardless of what you think of the group’s capabilities, it’s a new world when a bunch of hackers can threaten an international military alliance.

Even when a victim does manage to attribute a cyberattack, the process can take a long time. It took the US weeks to publicly blame North Korea for the Sony attacks. That was relatively fast; most of that time was probably spent trying to figure out how to respond. Attacks by China against US companies have taken much longer to attribute.

This delay makes defense policy difficult. Microsoft’s Scott Charney makes this point: When you’re being physically attacked, you can call on a variety of organizations to defend you – the police, the military, whoever does antiterrorism security in your country, your lawyers. The legal structure justifying that defense depends on knowing two things: who’s attacking you, and why. Unfortunately, when you’re being attacked in cyberspace, the two things you often don’t know are who’s attacking you, and why.

Whose job was it to defend Sony? Was it the US military’s, because it believed the attack to have come from North Korea? Was it the FBI, because this wasn’t an act of war? Was it Sony’s own problem, because it’s a private company? What about during those first weeks, when no one knew who the attacker was? These are just a few of the policy questions that we don’t have good answers for.

Certainly Sony needs enough security to protect itself regardless of who the attacker was, as do all of us. For the victim of a cyberattack, who the attacker is can be academic. The damage is the same, whether it’s a couple of hackers or a nation-state.

In the geopolitical realm, though, attribution is vital. And not only is attribution hard, providing evidence of any attribution is even harder. Because so much of the FBI’s evidence was classified – and probably provided by the National Security Agency – it was not able to explain why it was so sure North Korea did it. As I recently wrote: “The agency might have intelligence on the planning process for the hack. It might, say, have phone calls discussing the project, weekly PowerPoint status reports, or even Kim Jong-un's sign-off on the plan.” Making any of this public would reveal the NSA’s “sources and methods,” something it regards as a very important secret.

Different types of attribution require different levels of evidence. In the Sony case, we saw that the US government was able to generate enough evidence to convince itself. Perhaps it had the additional evidence required to convince North Korea it was sure, and provided that over diplomatic channels. But if the public is expected to support any government retaliatory action, it will need enough of that evidence made public to be convinced. Today, trust in US intelligence agencies is low, especially after the 2003 Iraqi weapons-of-mass-destruction debacle.

What all of this means is that we are in the middle of an arms race between attackers and those who want to identify them: deception and deception detection. It’s an arms race in which the US – and, by extension, its allies – has a singular advantage. We spend more money on electronic eavesdropping than the rest of the world combined, we have more technology companies than any other country, and the architecture of the Internet ensures that most of the world’s traffic passes through networks the NSA can eavesdrop on.

In 2012, then US Secretary of Defense Leon Panetta said publicly that the US – presumably the NSA – has “made significant advances in ... identifying the origins” of cyberattacks. We don’t know if this means they have made some fundamental technological advance, or that their espionage is so good that they’re monitoring the planning processes. Other US government officials have privately said that they’ve solved the attribution problem.

We don’t know how much of that is real and how much is bluster. It’s actually in America's best interest to confidently accuse North Korea, even if it isn’t sure, because it sends a strong message to the rest of the world: “Don’t think you can hide in cyberspace. If you try anything, we’ll know it’s you.”

Strong attribution leads to deterrence. The detailed NSA capabilities leaked by Edward Snowden help with this, because they bolster an image of an almost-omniscient NSA.

It’s not, though – which brings us back to the arms race. A world where hackers and governments have the same capabilities, where governments can masquerade as hackers or as other governments, and where much of the attribution evidence intelligence agencies collect remains secret, is a dangerous place.

So is a world where countries have secret capabilities for deception and deception detection, and are constantly trying to get the best of each other. This is the world of today, though, and we need to be prepared for it.

Bruce Schneier is a security technologist and author. He is the chief technology officer of Resilient Systems and his latest book is "Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World." He blogs at schneier.com and tweets at @schneierblog.

 
