Hypersonic missiles may be unstoppable. Is society ready?

In this undated photo distributed by Russian Defense Ministry Press Service, an intercontinental ballistic missile lifts off from a truck-mounted launcher in Russia. Russia has developed a hypersonic weapon, joining China and the U.S. in a race to modernize missile capabilities.

Russian Defense Ministry Press Service/AP/File

March 31, 2020

Humans love things that go fast: race cars, speedboats, and cheetahs. Then there’s hypersonic, which leaves plain old fast blinking in the dust.

On March 19, the United States launched its first successful hypersonic test missile from a naval base in Kauai, Hawaii. The unarmed missile tore through the idyllic Pacific skies at Mach 5, five times the speed of sound.

The Pentagon aims to test a full hypersonic weapon system by 2023, seeking to draw level with Russia and China, which have touted their own development of hypersonic weapons technology and say they have the hardware to prove it.

Why We Wrote This

A new class of stealth missiles raises ethical questions about how militaries should use machine learning to respond to future threats.

Hypersonic missiles are not just very fast; they are also maneuverable and stealthy. This combination of speed and furtiveness means they can surprise an adversary in ways that conventional missiles cannot, while also evading radar detection. And they have injected an additional level of risk and ambiguity into what was already an accelerating arms race between nuclear-armed rivals.

To understand why, consider the falcon and the albatross. 


The peregrine falcon is the fastest animal on Earth. From a cruise altitude of more than 3,000 feet, it drills down through the air at more than 200 mph to snag its prey. Fast.

The sea-faring albatross can soar effortlessly for thousands of miles without a flap of its massive wings, hugging the water’s surface until it abruptly leans into the wind to gain altitude so it can alter its course and swoop down once more. And it does this again and again and again. Maneuverable.

If you could put the two birds together you would have one formidable bird of prey: fast and maneuverable. And that is what hypersonic weapon systems developers have done. There will soon be the capability to launch weapons – conventional or nuclear – that travel at many times the speed of sound and can change course unpredictably and very quickly, making them much harder to track and intercept.

“They (our adversaries) have systems that try to deny our domain dominance,” Mike White, assistant director for hypersonics in the Office of the Under Secretary of Defense for Research and Engineering, told a recent Pentagon press conference. “It really is those threats and those targets that’s driving our investment in hypersonic strike capabilities.” The Defense Department has requested $3.2 billion for hypersonic-related research in the 2021 fiscal budget.  

Crucial role for AI

As the hypersonics race heats up, a long stream of legal, ethical, and diplomatic questions trails in its swift wake, particularly about the critical role that artificial intelligence (AI) plays. This is because these systems must work together to deliver hypersonic missiles precisely – in theory, to any point on the globe.


Unlike conventional missiles, these rockets don’t follow a predictable trajectory. Only complex AI-based sensor systems are capable of detecting and intercepting them. And as the demands on the weaponry grow, so do the concerns about how much humans will have to rely on the AI’s set of ethics – trained by the developer – that informs the system’s “moral” choices.

One of these concerns is the ambiguity factor. In other words, what kind of warhead might be on that incoming hypersonic weapon: nuclear or conventional?

Added to that uncertainty is the short response time, which means that “AI or certainly at the very least automation comes into play” in defending against such missiles, says Douglas Barrie, a senior fellow for military aerospace at the International Institute for Strategic Studies in London.

Even if it’s not all dictated by AI and machine learning, he says, “there is going to be an awful lot of automation and that kind of decision chain to deal with these kinds of systems.”

Since AI plays a pivotal role in sensor detection systems, it is central to any ethical debate. Right now, the likelihood of a completely autonomous response to an incoming hypersonic missile seems remote. But that could change, say analysts. For an incoming conventional missile, military commanders may have 30 minutes to detect and respond; a hypersonic missile could arrive at that same destination in 10 minutes or less, forcing a decision faster than seems possible without AI. 
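The time compression is easy to see with back-of-envelope arithmetic. The sketch below is purely illustrative – the 1,000 km distance and sea-level speed of sound are assumptions for the example, not figures from defense sources, and real trajectories are not straight lines:

```python
# Illustrative flight-time arithmetic only. Assumptions (hypothetical):
# sea-level speed of sound ~343 m/s, straight-line flight, constant speed.
SPEED_OF_SOUND_MS = 343.0  # approximate speed of sound at sea level, m/s

def flight_time_minutes(mach: float, range_km: float) -> float:
    """Straight-line flight time at a constant Mach number, in minutes."""
    speed_ms = mach * SPEED_OF_SOUND_MS
    return (range_km * 1000.0) / speed_ms / 60.0

# At Mach 5 – the conventional threshold for "hypersonic" – a missile
# covers a hypothetical 1,000 km standoff distance in about 10 minutes,
# a fraction of the half-hour commanders might have against a
# conventional ballistic threat.
print(f"Mach 5 over 1,000 km: {flight_time_minutes(5.0, 1000.0):.1f} min")
```

Even this crude estimate shows why defenders see automation as unavoidable: the detect-decide-respond window shrinks by a factor of several.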

“Those time frames have sped up, which would put more pressure on a person,” says Dr. Gordon Cooke, director of research and strategy at West Point. As a society, he adds, we should be thinking about how to view these systems in the future. “What kind of society do we want to have, especially in regards to warfare?”

‘Technology will always fail’

And of course, the question of warfare itself raises its own massive ethical quandary. Some, like ethicist Patrick Lin, view it as a social problem, not a technological one.

“Technology will always fail,” says Dr. Lin, a professor of philosophy at California Polytechnic State University in San Luis Obispo. “That is the nature of technology.” Ethical guidelines, he adds, should be built within the design of the architecture itself so that they’re integral to the system, not an item added on later.

Sandia National Laboratories is home to Autonomy New Mexico, a network of academics working on autonomy research for national security missions, including hypersonic weapons. What they try to do, says a senior manager, is make sure that AI systems help human operators identify the best courses of action in fast-paced scenarios. As to the ethics used, “it’s up to the Department of Defense to come up with how to employ that,” the manager says.

The Kodiak Launch Complex in Kodiak, Alaska, is pictured in an undated handout photo from NASA. The U.S. military tested a hypersonic weapon at Kodiak on August 25, 2014, but the mission was aborted after controllers detected a problem with the system, the Pentagon said.
Photo: NASA/Reuters/File

The Pentagon recently released its own ethical guidelines regarding AI research. Its five headings are: Responsible, Equitable, Traceable, Reliable, and Governable. The last makes clear that humans must be able to override an intended AI decision. AI should keep humans “in the loop” by “possessing the ability ... to disengage or deactivate deployed systems that demonstrate unintended behavior,” the guidelines read.

Not just weapons

Dan DeLaurentis, an aeronautics professor who directs the Institute for Global Security and Defense Innovation at Purdue University in Indiana, says AI plays an integral part in the hypersonic system by helping militaries to piece together lots of information about incoming missiles. But any concerns about robots running amok are unfounded, he adds. The Pentagon isn’t out to create “the unsupervised automated killing machines” of Hollywood dystopias.

Then there are thorny questions about the potential weaponization of space, compounded by the absence of binding treaties to ensure some measure of stability. The arms treaties currently in force between the U.S. and Russia don’t cover hypersonic missiles; legal experts say the U.S. may propose adding them in the future.

“If we were in a situation where the arms control community and a number of treaties were still in place, this would be less troublesome than it is,” says Mr. Barrie. 

Dr. Lin argues that the benefits of hypersonic weapons compared to the risks they create are “widely unclear,” as are the benefits of the AI systems that inform them.

“I think it’s important to remember that diplomacy works and policy solutions work. ... I think another tool in our toolbox isn’t just to invest in more weapons, but it’s also to invest in diplomacy to develop community.”