Can Tim Cook fight the government and win?
Apple's chief executive officer has promised to appeal a federal court ruling demanding his company help the FBI get around built-in security features on an iPhone used by one of the San Bernardino, Calif., shooters.
But this case, insist US security and privacy experts, is not just about allowing investigators to find out more information from one Apple device seized from a slain terror suspect. Instead, they say, the outcome could affect anyone who uses American tech products because it could open the door for the government to insist on new mechanisms for weakening security on all consumer devices.
"While it seems like this case is only about one iPhone, it’s become the flash point in the fight over encryption and the security of the Internet," says Kevin Bankston, the director of New America's Open Technology Institute. "They are asking a company to build new tools to bypass the security of its own product. That’s a problem that would go far beyond Apple. It’s not even a slippery slope – it’s a sharp cliff."
If a US court can indeed compel Apple to do this, Mr. Bankston continued, "It can likely legally compel any other software providers to do the same thing."
Law enforcement has long complained that the trend of increasingly strong encryption deployed by companies such as Apple and Google is stymieing their pursuit of terrorists and criminals. Apple insists its encryption is a best practice that protects consumers from any unwanted parties, whether they are criminal hackers or snooping governments, and that it does not have the technical means to access the protected data.
The ongoing encryption debate – which has pitted many in the security industry against intelligence and law enforcement officials – may find its first major test case here, as the world’s largest tech company promises to go head-to-head with the US government in a highly sensitive investigation for an attack on US soil.
The FBI says Apple’s built-in iPhone security features have blocked agents from accessing data on the iPhone used by Syed Rizwan Farook, which they believe may contain crucial evidence relating to the December terrorist attack that killed 14 people.
In a win for the government's side, a magistrate judge in Riverside, Calif., ordered Apple to write new software to make it easier for federal agents to access the phone’s data by "brute force," in which investigators could guess the passcode by using programs designed to try millions of combinations quickly. To do this, Apple would have to turn off the feature that would erase the iPhone’s data after 10 incorrect tries.
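The mechanics here are simple arithmetic: a four-digit passcode has only 10,000 possible combinations, trivially exhaustible by software, so the auto-erase limit rather than the passcode itself is the real barrier. The sketch below (the passcode value and check function are hypothetical stand-ins, not how iOS actually verifies codes) illustrates why disabling that limit matters:

```python
from itertools import product

SECRET = "7294"   # stand-in for the device passcode (illustrative assumption)
MAX_TRIES = 10    # the auto-erase threshold described above

def brute_force(check, limit=None):
    """Try every 4-digit code in order; give up if a retry limit is hit."""
    for attempt, digits in enumerate(product("0123456789", repeat=4), start=1):
        if limit is not None and attempt > limit:
            return None  # the device would have erased itself by now
        code = "".join(digits)
        if check(code):
            return code
    return None

check = lambda code: code == SECRET

print(brute_force(check))              # unrestricted search recovers the code
print(brute_force(check, MAX_TRIES))   # the 10-try erase feature defeats it
```

With no limit, the loop finds the code after at most 10,000 guesses; with the erase feature in place, the search almost certainly wipes the phone first, which is why the court order centers on turning that feature off.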
The Department of Justice says accessing the phone is necessary to fully investigate the San Bernardino attacks. "We have made a solemn commitment to the victims and their families that we will leave no stone unturned as we gather as much information and evidence as possible," said United States Attorney Eileen M. Decker in a statement. "These victims and families deserve nothing less."
Apple has "no sympathy for terrorists" and has helped investigators in this case, Apple’s Mr. Cook said in a public letter. But, he said, building a new version of the iPhone operating system that circumvents key security features and installing it on the seized device would "undeniably create a backdoor" into his products. That’s something, Cook said, "we simply do not have, and something we consider too dangerous to create."
Even though the government suggests this tool could only be used in this case, Cook says, "that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks – from restaurants and banks to stores and homes. No reasonable person would find that acceptable."
But if the FBI can force Apple to hack into one user's device, worries American Civil Liberties Union staff attorney Alex Abdo in a statement, “then so too can every repressive regime in the rest of the world."
It could be tough for companies to retain the moral high ground on bulletproof security if they cooperate with the US government but refuse similar demands by other countries in which they operate. "Imagine if this was China asking for the same thing? And who is to say they won't ask for the same tomorrow?" says Cris Thomas, a strategist at the cybersecurity firm Tenable who is known by his hacker name Space Rogue.
"Once this 'feature' is created it will not be contained, it will eventually trickle down to local police forces and foreign governments," he said. "The potential for abuse is huge."
Compliance for many companies could also prove enormously cumbersome. A company like Apple might be able to write another, weaker operating system without tanking its business. But Chris Finan, chief executive officer of Manifold Security, notes, "We can't afford to build several versions."
Like Apple, he says, "We similarly try to build one product that works in international markets and focus on empowering our customers. We never want any suggestion we could undermine their security."
For its part, Apple has five days to respond to the order and detail whether compliance would be "unreasonably burdensome" on the company.
But compliance could cost the company more than money – instead damaging consumers’ trust, especially in the wake of the Edward Snowden revelations that showed widespread government surveillance.
"People around the world will perceive this as the US government bullying US companies into taking its side," says Mr. Finan. "That can have huge economic consequences for not just Apple but other US-based companies doing business overseas."
Already, rallies organized by progressive digital rights advocacy group Fight for the Future are planned outside Apple stores across the country.
Amie Stepanovich, policy manager at digital rights group Access, worries that if Apple complies with the FBI request, it could lead to government agencies trying to remotely access devices belonging to criminal suspects or anyone the police are investigating – not just devices they have in custody. This, she says, could potentially compromise citizens' rights.
"It opens a big can of worms,” she says.
At its core, this case brings up an important question of responsibility, says Jon Callas, chief technology officer of encrypted communications firm Silent Circle. Even if law enforcement has a warrant, the Constitution does not automatically guarantee that a legal search will be successful, he notes.
"If there's a building full of file cabinets, there's no guarantee [agents] will find what they're looking for even if they know it's there," Mr. Callas says. A warrant, he adds, also "doesn't obligate the target to help search."