
Machines v. hackers: Cybersecurity's artificial intelligence future

The US is short hundreds of thousands of information security professionals. But that gap is driving investments in artificial intelligence that may make armies of cybersecurity workers unnecessary.


It's a common refrain after any recent high-profile breach into federal computers and corporate networks: There aren't enough skilled cybersecurity professionals to outwit criminal hackers.

That message from officials, executives, and industry experts isn't just grousing, either. According to industry estimates, the US needs about 200,000 more workers to fill current cybersecurity roles. Globally, the gap is five times higher – an estimated 1 million workers.

The issue has become such a priority that President Obama made increasing the number of cybersecurity workers a key component of his multibillion-dollar Cybersecurity National Action Plan, which was introduced earlier this year. The White House said earlier this month it plans on boosting the federal cybersecurity workforce by 3,500 new hires by year's end.

But as businesses compete for scarce cybersecurity talent and policymakers weigh remedies for the digital security worker shortage, the ground underneath the profession is shifting.

Now, computers equipped with sophisticated learning algorithms are performing jobs that until recently required highly trained humans. Over time, experts say, the complexity of cybersecurity jobs performed by machines will increase, further reducing the demand for workers and changing the entire nature of cybersecurity work.

"If we fast forward … I think we will see a diminished role for humans," says Amir Husain, an authority on artificial intelligence and chief executive officer of SparkCognition, an artificial intelligence startup.

In fact, Mr. Husain and others note, the use of artificial intelligence to do information security work is already happening. For example, antivirus companies have long relied on algorithms – not humans – to determine whether a given file is malicious or not, based on patterns identified in previous malicious files. 
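The pattern-based file analysis described above can be sketched in a few lines. This is a minimal, hypothetical illustration of signature matching – the hash database and file contents are invented, and real antivirus engines use far richer heuristics and learned models:

```python
import hashlib

# Hypothetical signature database: hashes of files previously judged malicious.
# Real products ship databases with millions of entries, plus behavioral rules.
KNOWN_MALICIOUS_HASHES = {
    "d41d8cd98f00b204e9800998ecf8427e",  # placeholder entry, for illustration
}

def is_malicious(path):
    """Return True if the file's hash matches a known-bad signature."""
    with open(path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    return digest in KNOWN_MALICIOUS_HASHES
```

A match against a known signature needs no human judgment at all, which is why, as Husain notes, people only get involved with genuinely unknown threats.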

"Except in very rare cases, where you have an unknown threat, humans are not doing file analysis," he says.

Much of the investment that's going into the cybersecurity space to fuel the development of automation is directed at responding to cybersecurity incidents. Currently, humans are the ones who figure out how to respond to cyberattacks on networks, working to quickly block suspicious communications and analyze malicious behavior and software. But computers could perform the same functions – and do them much more quickly than people behind the keyboard.

In fact, the allure of machines quickly fixing vulnerabilities has led the Defense Advanced Research Projects Agency (DARPA), the Defense Department's technology lab, to organize the first-ever hacking competition that pits automated supercomputers against each other at next month's Black Hat cybersecurity conference in Las Vegas.

With the contest, DARPA is aiming to find new ways to quickly identify and eliminate software flaws that can be exploited by hackers, says DARPA program manager Mike Walker.

“We want to build autonomous systems that can arrive at their own insights, do their own analysis, make their own risk equity decisions of when to patch and how to manage that process,” said Walker. 

Technology firms large and small are already moving toward that goal. In May, IBM announced plans to train a new, cloud-based version of its Watson cognitive technology to detect cyberattacks and computer crimes. As part of its training, IBM fed Watson a dictionary of information security-specific terms such as "exploit" and "dropper" and taught it how to identify and respond to cybersecurity incidents.

Of course, cybersecurity isn’t the only work that will be affected by artificial intelligence and automation. A recent analysis by the consulting firm McKinsey concluded that automation will "affect portions of almost all jobs to a greater or lesser degree, depending on the type of work they entail."

That study analyzed more than 2,000 work activities across 800 different occupations and concluded that automation of work is already going beyond routine manufacturing activities and has the potential to transform sectors that "involve a substantial share of knowledge work."

Though the McKinsey study did not look at the field of information security specifically, aspects of its work would seem to make it an industry ripe for automation.

Much information security work boils down to picking needles of useful or important information out of a haystack of unimportant data – from network traffic to log messages generated by different products.

"It’s hunting," said John Pescatore, director of emerging security trends at the SANS Institute, a leading training organization for the information security sector. "You’re looking around your infrastructure and studying [network traffic] for machines that are talking to some [Internet] address or region that your network hasn’t talked to before."
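The kind of hunting Pescatore describes can be sketched as a simple baseline comparison: flag any internal host that starts talking to an address the network has never contacted before. The addresses and connection records below are hypothetical, for illustration only:

```python
def new_destinations(baseline, connections):
    """Yield (source, destination) pairs whose destination address falls
    outside the historical baseline of addresses this network has talked to."""
    for src, dst in connections:
        if dst not in baseline:
            yield src, dst

# Addresses seen in past traffic (invented for this sketch).
baseline = {"93.184.216.34", "151.101.1.69"}

# Today's connection log: (internal host, external destination).
today = [
    ("10.0.0.5", "93.184.216.34"),   # routine, previously seen destination
    ("10.0.0.7", "203.0.113.99"),    # never seen before -> worth a look
]

alerts = list(new_destinations(baseline, today))
```

A real system would baseline on regions and traffic volumes as well as individual addresses, but the core idea – compare new activity against history and surface the outliers – is exactly the repetitive sifting that machines do faster than analysts.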

Today, that work is inefficient and time consuming. IBM has reported that the average organization is presented with more than 200,000 “pieces of security event data” each day. Responding to “false positives” in that data is a huge and costly problem for organizations of all types.

The best security analysts are able to cancel out some of that noise and isolate unusual patterns that are suggestive. And, as Passcode recently reported, startups like PatternEx are already working on ways to use artificial intelligence to stem the flow of alerts to human operators, giving them the ability to do deeper analysis of a smaller number of suspicious incidents.
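The triage that startups in this space perform can be sketched as scoring and ranking: machines score every alert, and only the highest-scoring few reach a human. The scoring values below are invented, and this is not PatternEx's actual method – just a minimal illustration of the workflow:

```python
def triage(alerts, limit=3):
    """Return the `limit` highest-scoring alerts for human review,
    leaving the long tail of likely false positives to the machine."""
    return sorted(alerts, key=lambda a: a["score"], reverse=True)[:limit]

# Hypothetical alert stream with machine-assigned anomaly scores.
alerts = [
    {"id": 1, "score": 0.10},   # probably a false positive
    {"id": 2, "score": 0.95},   # strongly anomalous
    {"id": 3, "score": 0.40},
    {"id": 4, "score": 0.70},
]

for_humans = triage(alerts, limit=2)   # only the top two reach an analyst
```

Against the 200,000 daily events IBM cites, cutting the stream to a handful of high-confidence leads is what frees analysts for the deeper analysis the article describes.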

But data collection and data processing are two tasks that McKinsey’s study identified as the most susceptible to being automated. And refinements in artificial intelligence sometimes referred to as "deep learning" increasingly give machines the ability to mimic human intuition – a "sixth sense" that sees patterns others miss, said Husain of SparkCognition.

"Insofar as machines can sense and monitor the world in ways that go beyond our biological abilities, they will have greater insight – higher quality insight with more depth," he said.

But even though automation may play a more crucial role in improving digital defenses, humans will remain part of the picture – at least for the foreseeable future.

"There’s a huge need right now in the workforce and I don’t see that diminishing,” says Richard Forno, assistant director at the University of Maryland’s Center for Cybersecurity. "We have 10,000 or 12,000 open [positions] for security folks – and that’s just one state."

Jack Detsch contributed reporting.
