Should Silicon Valley be liable for cybersecurity?
The global ransomware attack that affected an estimated 300,000 computers in 150 countries is tied to a Windows vulnerability. Do Microsoft and other software makers bear responsibility for keeping products secure or is it up to users?
When automakers have shipped cars with bad brakes, they’ve faced multimillion-dollar government fines. Appliance companies have paid hefty legal settlements for selling flawed coffee pots. And the government brought criminal charges against food executives for contaminated peanut butter.
But the multibillion-dollar US software industry has so far been immune to civil or criminal liability for serious – and growing – problems that result from bad code. When it comes to keeping computer systems safe from malware and viruses, warding off criminal hackers, or simply updating buggy programs, the burden largely rests on consumers – even when the underlying technology may be flawed.
Now that a so-called “ransomware” attack has affected more than 300,000 computers worldwide, according to US officials, encrypting thousands of victims’ data until they pay ransoms to unlock files, cybersecurity experts are asking whether software makers should be held to a standard similar to other industries as a way of ensuring their products are safer from serious and costly computer attacks.
“The solution is going to be regulation. We need to change the incentives right now,” says Bruce Schneier, a noted cryptographer and chief technology officer at IBM Resilient. “We've picked fast and cheap. Wait until this happens to your car, or your refrigerator, or airplane avionics, or when your internet-enabled lock has locked you out.”
Indeed, say Mr. Schneier and many other technology experts, while the worst software vulnerabilities may have allowed malicious hackers to cripple business and government systems or compromise sensitive personal data, cyberattacks may soon have far costlier consequences as software becomes increasingly embedded in automobiles, medical devices, utilities, and other critical systems. Therefore, experts say, there’s a growing urgency to ensure faulty code can’t be so easily exploited or manipulated.
When a patch isn't enough
Typically, software companies will alert their users when they detect a vulnerability in their products and push out a software update that fixes the security hole. That’s what Microsoft did when it learned about a serious flaw in Windows that could give criminals an opening to execute a ransomware attack.
Yet it’s unclear if that message reached all the victims targeted in Friday’s attack, which continued through the weekend. This particular strain of ransomware called WannaCry (also known as WannaCrypt or WanaDecrypt) appears to have spread through a malicious email campaign that installed the virus on victims’ computers through attachments. According to the White House, the criminals behind WannaCry have made off with less than $70,000 from victims.
The Windows software exploit that WannaCry utilized surfaced on internet forums as part of a cache of cyberweapons linked to the National Security Agency.
A hacker group calling itself the Shadow Brokers dumped the hacking tools on the web earlier this year. In a blog post Sunday, Microsoft President Brad Smith called out the NSA for stockpiling such digital vulnerabilities, equating the problem to the US government “having some of its Tomahawk missiles stolen.”
But some experts are less enthusiastic about chastising spy agencies, which have long taken advantage of software flaws in operating systems and mobile phones to spy on their targets. “It’s unfair to single out the NSA,” says Patrick Wardle, a computer expert who worked at the NSA and now serves as chief security researcher at the firm Synack. “Why aren’t we blaming Microsoft? They developed and deployed buggy code. They should take some share of the blame.”
Software makers held to different standards
Unlike companies in many other industries, such as health care or electronics, software makers aren’t subject to the same legal standards when it comes to product safety. In a series of New Republic articles in 2013 on the software liability debate, the Hoover Institution’s Jane Chong wrote that software companies have traditionally avoided claims of liability over faulty code because of their user license agreements. “Software providers typically shunt all the risks associated with their products onto users through these license agreements, which the courts have generally treated as enforceable contracts,” wrote Ms. Chong, a national security and law associate at the institution.
When users have tried to sue software companies over data breaches, the cases are often thrown out of court, she noted. For instance, a California court dismissed a class action brought by LinkedIn users who alleged that the social media company suffered a major hack because it failed to take industry-standard security precautions.
For courts to begin holding software companies responsible for cybersecurity lapses would require tougher federal regulations governing the quality of code. It would also require judges who understand the complex issues surrounding software vulnerabilities and how they can lead to cyberattacks.
In this case, some of the Microsoft Windows systems affected were old versions that hadn’t been updated or patched, noted Ross Schulman, codirector of the Cybersecurity Initiative and senior policy counsel at New America's Open Technology Institute. The company has already supported those systems “for a really long time compared to best practices; they gave everybody ample warning that they were going to stop supporting it.”
What’s more, many experts have noted, Microsoft did act responsibly in this case and alerted its customers about the vulnerability. Instead of blaming Microsoft, says Tom Cross, chief technology officer for the cybersecurity firm OPAQ, "regulators should be asking why certain organizations were not prepared, particularly if those organizations are in critical infrastructure sectors.”
US officials are beginning to take steps to deal with the spread of WannaCry, whose victims include a handful of US companies, such as FedEx. At Tuesday's White House press briefing, Homeland Security adviser Tom Bossert said intelligence agencies are engaged in an ongoing investigation into the hacks – but don't yet know who's behind them.
"This was not a – a tool developed by the NSA to hold ransom data," Mr. Bossert said. "This was a tool developed by culpable parties, potentially criminals or foreign nation-states, that have put it together in such a way so that they deliver it with phishing e-mails; put it into embedded documents; and cause infection, encryption, and locking."
As experts and officials attempt to unravel who was behind the attack, it could also be a moment for the industry and government to reevaluate whether there's a way to encourage software companies to ship products with code that's more secure and resilient against these kinds of attacks, says Joshua Corman, director of the Cyber Statecraft Initiative at the Atlantic Council, a Washington think tank.
"I certainly think it’s a watershed moment,” says Mr. Corman. “There will be a much clearer case for the argument for some form of software liability.... I’d love to see some sort of trigger for corrective action."