Companies’ lawsuits aimed at preventing public disclosures of vulnerabilities will have a chilling effect on security research, a majority of Passcode Influencers said.
Earlier this month, security firm FireEye obtained a court injunction in Germany to prevent another firm, ERNW, from releasing details about vulnerabilities its researchers found in a FireEye product. That move, along with a recent controversial blog post by Oracle’s chief security officer complaining about researchers who reverse engineer the company’s software (potentially in violation of its terms of service) in the name of finding bugs, sparked a heated debate over whether protecting companies’ intellectual property should take precedence over researchers’ freedom to alert consumers to security weaknesses.
In a survey of experts from across government, the tech and security industry, and privacy advocacy community, 74 percent of Passcode’s Influencers voiced concerns about the companies’ legal threats against researchers.
“Lawsuits have already had a chilling effect on security research,” said Tenable Security strategist Cris Thomas (aka Space Rogue). “Numerous researchers have either stopped looking for bugs, or worse, have stopped reporting them. The bugs are still there, possibly being used by the bad guys, but fear of prosecution, for what amounts to telling the truth, stops many researchers from reporting bugs.”
Or, as Charlie Miller — a security lead at Uber’s Advanced Technology Center who recently made headlines for wirelessly hacking a Jeep Cherokee — put it: “Finding vulnerabilities is hard enough without getting sued.”
Passcode’s Influencers Poll is a regular survey of more than 120 experts (listed below) in digital security and privacy. They have the option to comment on the record or anonymously to preserve the candor of their responses.
Potential costs associated with legal action can also be a strong deterrent for security researchers, some Influencers said.
“I have discovered several security vulnerabilities in commercial products, and the first thing I do before any research project is contact a lawyer,” said Matthew Green, cryptologist at Johns Hopkins University. “I need to do this to protect both myself and the graduate students I work with. This dramatically increases the cost of research and makes it much more risky for researchers to perform this essential work. That hurts everyone, since the vulnerabilities don’t go away when researchers are prevented from looking at products — we just don’t find out about them.”
There are costs to stifling security research, some Influencers said. If the vulnerabilities go undisclosed due to the threat of lawsuits, they said, then consumers are put at risk.
“Security research is vital to ensuring the highest level of data security for users,” said Amie Stepanovich, senior policy counsel at digital rights group Access. “Lawsuits brought by companies which impact the ability for researchers to conduct or present that research betray the trust of the users of those companies.”
A 26 percent minority of Influencers, however, said lawsuits will not have a strong negative effect on the security community or its work. Instead of deterring researchers, said Rick Howard, chief security officer of security company Palo Alto Networks, the increased attention could help draw more eyes to a security problem.
“It might stop some security companies from publishing their results, but threatening a lawsuit will not stop researchers from doing what they do,” he said. “In addition, lawsuits have a tendency to poke the bear. You may think you are compelling the research community to stop looking at your product. What you might be doing instead is drawing a bull’s eye on your product. The research community is funny like that. Before, a researcher might have no interest in your product. After a threatened lawsuit, that might convince a certain element in the community to see what all the kerfuffle is about.”
Others think it’s too soon to tell the long-term effects on security research.
“Legal action isn’t a constructive, long-term solution,” one Influencer, who chose to remain anonymous, said. “The Commerce Department is expected to convene a multistakeholder meeting on vulnerability research and disclosure. This powwow will draw together security researchers, academics, companies, and tech vendors to identify areas of common ground. Commerce officials are attempting to increase trust and reduce friction among the various stakeholders — which is a good thing.
“Officials understand that there is no one-size-fits-all solution — IT systems will never be completely bug free — but coordination is crucial,” the Influencer continued. “Stakeholders will hopefully begin to develop high-level principles leading to smart policy or best practices that are workable for multiple parties. Such an approach to uncloaking vulnerabilities in predictable ways is a much better way to proceed than litigation.”
“Lawsuits silencing security researchers rarely get vulnerabilities fixed. It’s time for vendors to step up their maturity when it comes to responding to vulnerability reports, and focus on the engineering and coordination to get vulnerabilities fixed, rather than shooting (or silencing) the messenger.” — Katie Moussouris, HackerOne
“This will impact those who want to look for vulnerabilities. I am concerned this will also have companies ‘hide’ vulnerabilities they cannot patch for longer periods of times, leaving customer networks vulnerable.” — Influencer
“Time and time again, we have seen that openness, transparency, and peer-review are the hallmarks for building the best security possible. A company’s attempts to prevent public disclosure of meaningful vulnerabilities are ‘security through obscurity’ — it does not work, and it leaves us all vulnerable. Enforced public ignorance is never a good solution to security problems, and the idea that companies would spend time and money to make us less informed consumers is particularly egregious. Not only should companies disclose their vulnerabilities, they should be mandated to disclose these problems, since it is the best way for us, their clients, to make informed decisions about which companies we wish to do business with.” — Sascha Meinrath, X-Lab
“Lawsuits have a chilling effect. Access to the particulars of vulnerabilities must be granted for security research and good-faith testing to take place. There should be transparency and information available to those within the white-hat security field for the identification, disclosure, and repair of code errors or malfunctions, security flaws, or vulnerabilities.” — Influencer
“Yes, but not nearly as much as the extension of the Wassenaar rules to cyberspace will.” — Martin Libicki, RAND Corp.
“This recent joint statement on the chilling effect on security researchers, of which I’m one of many signatories, lays out my concerns (and those of dozens of other computer scientists and policy experts) quite nicely.” — Kevin Bankston, Open Technology Institute
“Security researchers should engage in responsible disclosure, but these types of lawsuits stifle the publication of useful information that consumers and businesses need to keep their computer systems secure. A number of states, and the federal government, are considering legislation that would better protect individuals from lawsuits designed to hamper legitimate public speech. Security researchers may be among the beneficiaries of such legislation.” — Daniel Castro, Information Technology and Innovation Foundation
“The threat of litigation often inhibits researchers from disclosing a vulnerability even to the companies that may be responsible for the vulnerability, or to companies that have unknowingly incorporated the vulnerability in their products and services. The threat of litigation may also chill information sharing within the larger research community, which negatively impacts our understanding and response to cybersecurity threats.” — Nuala O’Connor, Center for Democracy and Technology
“Not only do lawsuits to prevent public disclosures of vulnerabilities have a chilling effect on security research, but they almost guarantee a grotesque financial advantage to criminal groups that prey on users running insecure software, and to the vendors, who will have little incentive to fix issues that are understood by criminals, nation states and security researchers but not the general public. Security researchers who identify and exploit software bugs play a little understood but absolutely crucial role in the software ecosystem, and hindering their efforts is a lose-lose proposition for consumers and researchers alike.” — Nick Selby, StreetCred Software
“I’ve advised hundreds of security researchers and for many, especially those connected with large institutions, educational or otherwise, even the possibility of a lawsuit has been enough to chill important research. Yet our security is too important to be left to those willing to withstand litigation; we desperately need more secure systems. Laws like the CFAA and the DMCA anti-circumvention provisions, and the overuse of restrictive contractual arrangements, have left security research too often in a legal gray zone. It’s time we supported people who are trying to find out if our tools are safe rather than forcing them to run legal risks.” — Cindy Cohn, Electronic Frontier Foundation
“As a researcher myself, I’ve personally experienced this.” — Influencer
“Responsible vulnerability disclosure underpins the entire security ecosystem. Lawsuits against researchers who provide pre-release notification only encourage them to engage in uncoordinated full disclosure or drive them out of the industry altogether. Companies have a duty to their customers – and, in many cases, to the public – to remedy security holes in their software, and they should do so as expeditiously as possible, which responsible disclosure facilitates. The choice for businesses is clear: mitigate don’t litigate!” — Rep. Jim Langevin (D) of Rhode Island
“The effect is perhaps less than it was ten years ago, but any efforts to suppress will have some chilling effect. Now, though, the security research community has far more tools at its disposal (not least publicity at Wired) to make an end run around such pressure.” — Jay Healey, Columbia University’s School of International and Public Affairs
“As more and more commodity products include embedded computing platforms, we find ourselves at a turning point. If these manufacturers of the “Things” connected to the Internet of same choose to employ an adversarial approach, the cumulative risk and negative impacts on the Internet ecosystem will continue to grow. The computer software product world has been down this road before. While there are rational reasons for a vendor’s impulse to attempt to suppress information on security flaws in their products, these instincts do not usually serve the long term interests of the vendor, their customers, or others who might interact with both. Healthy, cooperative coordination of vulnerability disclosures between security researchers and vendors is the optimal case. In the IT/desktop software world, these processes, while not perhaps perfect, have evolved to include a recognition of incentives (which are not always monetary), and appear to have materially improved the time-to-fix for flaws that in prior years had sometimes quietly languished within shipping products, exposing users to attacks for years.” — Bob Stratton, MACH37
“Lawsuits will always have a chilling effect on any activity that is subject to lawsuit. And of course, there will be numerous perverse effects, such as driving security research away from those groups or companies that would conduct themselves legally and ethically, and towards groups or entities that will not be bound by such prosaic considerations. It’s high time Congress pass information sharing legislation that will enable appropriate and effective sharing of vulnerability information.” — Influencer
“You tacitly assume that said lawsuits will either have an outcome favorable to the plaintiffs or are the working equivalent of a [Strategic lawsuit against public participation] suit. Under that assumption, the answer to the question is that there will be no chilling of security research done by real enemies, i.e., the ones who don’t share their results. Such suits are more like a screen door — keeping out the harmless and the annoying, but irrelevant for any mortal dangers that come to the doorstep.” — Dan Geer, In-Q-Tel
“Don’t get me wrong: it’s possible that some research that would have otherwise been done, would not be done as a result. But this is hardly going to be the most important constraint on security research – that is about people and dollars. I’d spend my time fighting more important battles than this.” — Influencer
“It’s obvious by the timing and content of the question that this week’s poll is prompted by FireEye’s recent controversy. Two corrections to the question are needed, based on the facts as reported in this blog post. 1. FireEye didn’t pursue a “lawsuit” against anyone. FireEye sought an interim injunction in a German court to prevent release of FireEye’s intellectual property. 2. FireEye wasn’t “preventing public disclosures of vulnerabilities.” FireEye did not seek to prevent ERNW from disclosing the vulnerabilities themselves. FireEye cooperated with ERNW and approved their published report on the vulnerabilities.” — Richard Bejtlich, FireEye
“No, but it will mean that security researchers are more circumspect with notifications, which will be a net loss for the good guys.” — Influencer
“The market has spoken — vulnerabilities in software that requires frequent or constant patching are costing enterprises millions of dollars. The leading software vendors ARE NOT the ones talking about lawsuits or trying to squelch responsible vulnerability disclosures. In the long term, the same will be true in FireEye’s market of IPS appliances.” — John Pescatore, SANS Institute
“I don’t think ‘a chilling effect’ is necessarily the right way to characterize the security research end state. What I do see is that these lawsuits might have a motivating effect on security researchers to find vulnerabilities, report them to the responsible party for fixing, and then work out an arrangement to get publicly recognized for finding the flaw after it is fixed and deployed.” — Influencer
“Public disclosure or not, there’s no lid going back on the cybersecurity challenge.” — Chris Young, Intel
“Hopefully it will prompt legislation and resolve issues being forced into court.” — Influencer
“History has shown us that the security research community reacts poorly to what they consider corporate bullying.” — Amit Yoran, RSA
“They have always pulled that stuff and it just makes them look bad.” — Influencer