The secret world of vulnerability hunters
Spies, hackers, and cybersecurity firms compete to find and exploit software flaws, often to infiltrate criminal networks or track terrorists. A look into this complex ecosystem.
By day, John Bambenek is a successful, if pretty ordinary, cybersecurity professional. The 39-year-old father of four spends most of his days trying to safeguard a bevy of corporate clients from malicious hackers. He analyzes digital threats for the Bethesda, Md., firm Fidelis Cybersecurity and runs his own security consultancy from his suburban Champaign, Ill., home. He’s active on LinkedIn and Twitter and once even ran for the Illinois state senate – as a Republican.
But Bambenek is also a digital vigilante. Part of an exclusive but growing group of skilled coders who moonlight for the FBI, he quietly plays both defense and offense. When federal investigators need technical help pursuing targets, Bambenek infiltrates the targets’ networks, hacks their websites, and stalks their digital footprints. He spends hours hunting for vulnerabilities in commonly used software – and turning them into spy tools that can extract critical and compromising data.
He’s helped the FBI dismantle two crime rings by writing software tools that take advantage of so-called “zero-days,” or unknown vulnerabilities in digital products that manufacturers have not fixed. He says that investigators are currently using one of his surveillance tools to monitor the digital trails of 39 criminal operations.
In one case, in September 2015, Bambenek discovered a flaw in a website where a European criminal software syndicate sold surveillance technology to governments and nefarious hackers. Within minutes, he broke in and uncovered a virtual Rolodex of customers, payments, and products, all of which he turned over to the FBI.
Still, deploying vulnerabilities to hack digital networks, even those used by suspected criminals, raises serious legal questions. The spyware syndicate operation, he admits, is “definitely a gray area ... . Yeah, I work for a security company, but nobody’s protecting Joe Sixpack, and you really can’t protect them. All you can do is just go after the criminals and change the dynamics, hopefully.”
Much of Bambenek’s work is possible because the technology that underpins most of daily life – the websites we all use, the apps on our smartphones, the software on our laptops, and industrial systems running critical infrastructure – is flawed. It’s rife with vulnerabilities. It has bugs and shoddy code. For law enforcement, those flaws might provide a way of tapping a criminal’s smartphone. But for repressive governments, bugs could help them break into activists’ email accounts or spy on their Skype calls.
And in an age in which criminal hackers or intelligence operatives may be able to influence entire democracies by breaching and leaking sensitive emails, such as those in the Democratic National Committee’s mailbox, software flaws are increasingly valuable to both the people trying to break into digital networks and those trying to defend them.
“Much of cybersecurity can be reduced to a constant race between the software developers and security experts trying to discover and patch vulnerabilities, and the attackers seeking to uncover and exploit those vulnerabilities,” says a recent report from New America, a Washington think tank. “There is a wide range of actors seeking to discover security flaws in software, whether to fix them, exploit them, or sell them to someone else who will fix or exploit them.”
The vulnerability marketplace
That demand for vulnerabilities helps drive the lucrative marketplace for malicious software that generates as much as $100 million annually in sales, estimates Trey Herr, a coauthor of the New America report and fellow with the Belfer Center’s Cyber Security Project at the Harvard Kennedy School.
Even though some hackers are selling vulnerabilities back to tech companies in so-called “bug bounty” programs, much of the top-dollar trade in vulnerabilities takes place in the digital underground. “The most expensive, the most interesting transactions ... take place in the darkest corners and the farthest from public view,” he says.
The value of a software defect often depends on how much the seller has weaponized it, and how valuable the targeted system is to a particular mission, according to insiders. An exploit that can crack an iPhone without the victim noticing could be worth double the price of the flaw alone, says Jared DeMott, a former vulnerability exploit analyst at the National Security Agency (NSA) and government contractor Harris Corporation. He now serves as chief technology officer of Binary Defense Systems.
“If you want to maximize your profit, let’s say you have a good bug, like a really juicy bug. It’s a zero-day for the iPhone. That’s super juicy. That’d be worth a ton of money,” says Mr. DeMott.
In August, Apple launched a bug bounty program that would pay researchers as much as $200,000 for finding vulnerabilities in its products. But some third-party bug buyers, who sell software secrets to governments, might shell out more than seven times that amount. Zerodium, one of the most popular online marketplaces for vulnerabilities, is currently offering security researchers $1.5 million for a “fully functional zero-day exploit” that can crack iOS 10, Apple’s mobile operating system.
Because these vulnerabilities often provide the building blocks for hacking tools that government agencies use to carry out espionage activities, former exploit suppliers believe the NSA and other spy agencies around the world are active buyers.
“I have very little doubt that the US government, along with many other governments in the world, buys zero-days on the black market,” says Michael Borohovski, who worked in the intrusion operations division of ManTech International Corp., a government contractor, between 2009 and 2011.
The number of cyberexploits the US spy community buys in a single year can vary depending on the type of intelligence operations underway, say insiders. A single operation might require multiple zero-days.
“One would get you some level of access, the other would escalate access, and so on and so forth,” says Mr. Borohovski. For instance, he said, ManTech held a number of contracts with a range of government agencies where the order “was to perform vulnerability discovery, develop exploits and then weaponize them as needed.”
Often, by the time ManTech tried to deliver a software exploit to the customer, the security vulnerability had already been found, so the contractor would go back to the drawing board and try to find new entryways, he said. On “Patch Tuesday,” the monthly day when software makers such as Adobe and Microsoft release security updates for their products, “sometimes they’ll simply fix the bugs that you spent the last six months weaponizing,” says Borohovski.
“I always call it a cat-and-mouse game between the manufacturers and us. Us being able to find a vulnerability, being able to utilize that to assist law enforcement, before they close the door,” says Lee Reiber, chief operating officer of Oxygen Forensics, one of the many smaller firms in the growing market of phone hacking technology. “Then we start back over again to find it,” hunting for other vulnerabilities and zero-days.
Oxygen Forensics recently helped local police officers extract data from an Android smartphone as part of a kidnapping investigation. The firm cracked the phone in minutes. Across the forensics industry, physically extracting and decrypting a 16 GB Android device typically takes about six hours. A tool known as the “Android Jet Imager” decodes a smartphone’s contents within seven minutes, according to Oxygen.
The large federal defense contractor MITRE licenses the Android decoder to Oxygen. While building the technology, MITRE consulted numerous federal forensics laboratories, federal law enforcement agencies, and Pentagon officials, who all stressed a need for “rapid acquisition and analysis of Android devices,” Mark Guido, principal cybersecurity engineer at MITRE, said in an email.
The speed of MITRE’s method relies on, among other things, an algorithm that limits the amount of unnecessary data blocks sent from the smartphone to the receiving computer. “The applications for our approach include crime scenes, border crossings, and any other situation where performing a mobile forensic acquisition is time-sensitive,” according to an Aug. 7, 2016, article in the journal Digital Investigation.
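The details of MITRE’s algorithm are not public, but the basic idea of limiting unnecessary data blocks can be illustrated with a minimal sketch: skip blocks that are entirely empty (all zero bytes) rather than transferring them, while recording their positions so the image remains reconstructible. The function name, interface, and 4 KiB block size below are all illustrative assumptions, not MITRE’s implementation.

```python
BLOCK_SIZE = 4096  # bytes per block (assumed, not MITRE's actual value)

def image_device(read_block, num_blocks):
    """Collect (index, data) pairs only for blocks containing non-zero
    bytes; record skipped (empty) block indices so the full image can
    still be reconstructed on the receiving computer."""
    transferred, skipped = [], []
    empty = bytes(BLOCK_SIZE)  # a block of all zero bytes
    for i in range(num_blocks):
        data = read_block(i)
        if data == empty:
            skipped.append(i)          # nothing to send: a hole in the image
        else:
            transferred.append((i, data))
    return transferred, skipped
```

On a device where most storage is unallocated, skipping empty blocks like this cuts the transfer time roughly in proportion to the fraction of blocks that hold data, which is one plausible way an acquisition could drop from hours to minutes.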
Typically, the federal government doesn’t ask a cybersecurity contractor for a bug, say former suppliers. It approaches a contractor with a goal. The objective might be, hypothetically speaking, “we need to get past a certain antivirus tool” or “get into an internet cafe through some particular method,” says Borohovski, now an executive at cyberdefense firm Tinfoil Security.
As the US government’s interest in these kinds of digital operations increases, military contractors have scooped up boutique firms that concentrate on spy technology, industry insiders say. ManTech, for instance, bought exploit researcher Oceans Edge last June, and Raytheon purchased bug developer SI Government Solutions in 2008; both deals were for undisclosed sums. What’s more, defense contractors ManTech, Booz Allen Hamilton, Harris, and Raytheon have all landed formal US government contracts to infiltrate targeted software, say former employees.
ManTech did not respond to requests for comment and the other three corporations declined to discuss offensive cyber contracts or capabilities, most of which are classified.
While controversial, it’s legal to peddle security exploits to most governments, companies and individuals – as long as the seller doesn’t use them. “Selling software itself is not generally a crime,” says Peter Swire, a professor of law and ethics at the Georgia Institute of Technology. Still, he says, “hacking into a protected computer is a crime.”
A blanket US antihacking law, the Computer Fraud and Abuse Act, prohibits breaking into someone else’s computer without consent but doesn’t forbid playing around with code that has the potential to, say, hack a firewall.
What’s more, US intelligence agencies such as the NSA are free to break into computers, phones, and networks to pursue foreign espionage operations. And if local and federal law enforcement agencies such as the Secret Service or the FBI obtain a warrant, they can also use vulnerabilities to hack a suspect’s computer or mobile device.
Taming the vulnerabilities game
But there are limits on where cybersecurity firms can sell their digital exploits. Multinational companies that do business in the US can’t sell military equipment, including offensive software, to certain sanctioned countries such as Iran, Russia, or North Korea.
“Defense companies are regulated in what they can sell abroad, because everything they do is viewed as a defense service and thus restricted under ITAR,” or the International Traffic in Arms Regulations, which require government approval for an export license, says retired Rear Adm. Bill Leigher, a Raytheon cybersecurity director who has a background in Navy signals intelligence.
As with any tool designed for both military and civilian use, there is a danger of these hacking programs falling into the wrong hands. To be sure, the misuse of government-grade exploits unnerves many civil liberties groups.
“Governments shouldn’t be able to use them to crack down on free speech or dissidents,” says Andrew Crocker, staff attorney for the Electronic Frontier Foundation (EFF), which is suing the Ethiopian government on behalf of a blogger, now residing in Maryland, who alleges his Skype communications were tapped through malware made by German surveillance-tech company Gamma Group.
In an effort to better balance public safety with national security needs, Georgia Tech’s Swire has suggested that decisions about whether the government should inform tech companies about software flaws not be left solely to intelligence agencies.
Swire served on President Obama’s 2013 Review Group on Intelligence and Communications Technologies, a response to the Edward Snowden leaks about the NSA’s global surveillance operations. He recommended that the government weigh the costs and benefits of telling citizens about software holes through a series of multiagency reviews, a system now known as the Vulnerabilities Equities Process. While still a secretive deliberation, the process now includes representation from the Commerce Department and other civilian agencies.
The FBI and the Office of the Director of National Intelligence, which oversees the intelligence community, declined to comment for this story.
Even when it has a firm argument for keeping a software defect quiet, the government faces the possibility of its closeted cyberespionage arsenal escaping into the wild. Just this summer there was an online leak of alleged NSA hacking tools by a mysterious group calling itself the “Shadow Brokers.” The US government also identified some of the blueprints for the tools among the immense cache of classified material that ex-Booz Allen contractor Harold Martin allegedly took home from the NSA.
“When governments acquire these things they take a gamble that no one else is going to find out about them or that they won’t be stolen or leaked,” says Mr. Crocker of the EFF. “Now I’m not saying that they should never retain vulnerabilities in the first place, but that that’s a risk that has to be understood.”