Hypersonic missiles may be unstoppable. Is society ready?

Photo: Russian Defense Ministry Press Service/AP/File
In this undated photo distributed by the Russian Defense Ministry Press Service, an intercontinental ballistic missile lifts off from a truck-mounted launcher in Russia. Russia has developed a hypersonic weapon, joining China and the U.S. in a race to modernize missile capabilities.

Hypersonic technology represents a new frontier of missile warfare: fast, stealthy, and unpredictable in flight. The U.S. recently tested a prototype that puts it in a race with China and Russia to claim a capability that adds another layer of uncertainty to geopolitical competition, not least because of the complex computational systems on which hypersonic weapons rely.

Put simply, the assumptions of conventional missile warfare – that incoming attacks can be tracked and intercepted, and a proportionate response be weighed – don’t transfer easily to hypersonic weapons because they are so fast and stealthy. That means a greater reliance on artificial intelligence to track and respond, raising ethical questions about how such systems are programmed.

Why We Wrote This

A new class of stealth missiles raises ethical questions about how militaries should use machine learning to respond to future threats.

Even if it’s not all dictated by AI, “there is going to be an awful lot of automation and that kind of decision chain to deal with these kinds of systems,” says Douglas Barrie, a military aerospace analyst in London.

This technology raises profound ethical and legal dilemmas, says Patrick Lin, an ethicist who argues that society must consider the risks and rewards of having such awesome weaponry – and whether other policy tools should come first. “I think it’s important to remember that diplomacy works and policy solutions work.”

Humans love things that go fast: race cars, speedboats, and cheetahs. Then there’s hypersonic, which leaves plain old fast blinking in the dust.

On March 19, the United States launched its first successful hypersonic test missile from a naval base in Kauai, Hawaii. The unarmed missile tore through the idyllic Pacific skies at Mach 5, five times the speed of sound.

The Pentagon aims to test a full hypersonic weapon system by 2023, seeking to draw level with Russia and China, which have touted their own development of hypersonic weapons technology and say they have the hardware to prove it.


Hypersonic missiles are not just very fast; they are also maneuverable and stealthy. This combination of speed and furtiveness means they can surprise an adversary in ways that conventional missiles cannot, while also evading radar detection. And they have injected an additional level of risk and ambiguity into what was already an accelerating arms race between nuclear-armed rivals.

To understand why, consider the falcon and the albatross. 

The peregrine falcon is the fastest animal on Earth. From a cruise altitude of more than 3,000 feet, it drills down through the air at 200 mph to snag its prey. Fast.

The seafaring albatross can soar effortlessly for thousands of miles without a flap of its massive wings, hugging the water’s surface until it abruptly leans into the wind to gain altitude so it can alter its course and swoop down once more. And it does this again and again and again. Maneuverable.

If you could put the two birds together you would have one formidable bird of prey: fast and maneuverable. And that is what hypersonic weapon systems developers have done. Militaries will soon be able to launch weapons – conventional or nuclear – that travel at many times the speed of sound and can change course quickly and unpredictably, making them much harder to track and intercept.

“They (our adversaries) have systems that try to deny our domain dominance,” Mike White, assistant director for hypersonics in the Office of the Under Secretary of Defense for Research and Engineering, told a recent Pentagon press conference. “It really is those threats and those targets that’s driving our investment in hypersonic strike capabilities.” The Defense Department has requested $3.2 billion for hypersonic-related research in the 2021 fiscal budget.  

Crucial role for AI

As the hypersonics race heats up, a long stream of legal, ethical, and diplomatic questions trail in its swift wake, particularly about the critical role that artificial intelligence (AI) plays. This is because of how these systems work together to deliver hypersonic missiles precisely, in theory, to any point on the globe.

Unlike conventional missiles, these rockets don’t follow a predictable trajectory. Only complex AI-based sensor systems are capable of detecting and intercepting them. And as the demands on the weaponry grow, so do the concerns about how much humans will have to rely on the AI’s set of ethics – trained by the developer – that informs the system’s “moral” choices.
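To see why maneuverability breaks simple tracking, consider a toy calculation – our own illustration with invented numbers, not any real defense algorithm. A defender who extrapolates a target’s current heading in a straight line will miss badly if the target keeps turning, even gently:

```python
# Toy sketch (invented figures, no relation to any real defense system):
# why a target that can turn defeats naive straight-line extrapolation.
import math

SPEED = 1.7    # km/s -- roughly Mach 5 at altitude (assumed figure)
TURN = 2.0     # degrees per second the target actually turns
HORIZON = 10   # seconds ahead the defender must predict

# Straight-line guess: assume the target holds its current heading.
straight = (SPEED * HORIZON, 0.0)

# What actually happens: integrate a gentle constant-rate turn over the same window.
x = y = heading = 0.0
for _ in range(HORIZON):
    heading += math.radians(TURN)
    x += SPEED * math.cos(heading)
    y += SPEED * math.sin(heading)

miss = math.dist(straight, (x, y))
print(f"Prediction error after {HORIZON} s: {miss:.1f} km")  # ~3 km off
```

Even a gentle 2-degree-per-second turn puts the straight-line guess kilometers off within seconds, which is why tracking a maneuvering target demands adaptive, sensor-fused estimation rather than simple ballistic prediction.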

One of these concerns is the ambiguity factor. In other words, what kind of warhead might be on that incoming hypersonic weapon: nuclear or conventional?

Added to that uncertainty is the short response time, which means that “AI or certainly at the very least automation comes into play” in defending against such missiles, says Douglas Barrie, a senior fellow for military aerospace at the International Institute for Strategic Studies in London.

Even if it’s not all dictated by AI and machine-based learning, he says, “there is going to be an awful lot of automation and that kind of decision chain to deal with these kinds of systems.”

Since AI plays a pivotal role in sensor detection systems, it is central to any ethical debate. Right now, the likelihood of a completely autonomous response to an incoming hypersonic missile seems remote. But that could change, say analysts. For an incoming conventional missile, military commanders may have 30 minutes to detect and respond; a hypersonic missile could arrive at that same destination in 10 minutes or less, forcing a decision faster than seems possible without AI. 
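The arithmetic behind that shrinking window is simple. Here is a back-of-envelope sketch using an assumed 1,000-kilometer distance and an approximate sea-level speed of sound – illustrative figures, not data on any particular weapon:

```python
# Back-of-envelope flight-time arithmetic (illustrative figures only).
MACH1_KMH = 1225                  # approx. speed of sound at sea level, km/h
hypersonic_kmh = 5 * MACH1_KMH    # Mach 5: about 6,125 km/h

distance_km = 1000                # assumed standoff distance, for illustration
minutes = distance_km / hypersonic_kmh * 60
print(f"{distance_km} km at Mach 5: ~{minutes:.0f} minutes to respond")
# -> roughly 10 minutes, versus ~30 for a slower conventional missile
```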

[Graphic omitted. Source: The Economist | Jacob Turcotte/Staff]

“Those time frames have sped up which would put more pressure on a person,” says Dr. Gordon Cooke, director of research and strategy at West Point. As a society, he adds, we should be thinking about how to view these systems in the future. “What kind of society do we want to have, especially in regards to warfare?”

‘Technology will always fail’

And of course, the question of warfare itself raises its own massive ethical quandary. Some, like ethicist Patrick Lin, view it as a social problem, not a technological one.

“Technology will always fail,” says Dr. Lin, a professor of philosophy at California Polytechnic State University in San Luis Obispo. “That is the nature of technology.” Ethical guidelines, he adds, should be built into the architecture itself so that they’re integral to the system, not an item added on later.

Sandia National Laboratories is home to Autonomy New Mexico, a network of academics working on autonomy research for national security missions, including hypersonic weapons. What they try to do, says a senior manager, is to make sure that AI systems help human operators identify the best courses of action in fast-paced scenarios. As to the ethics used, “it’s up to the Department of Defense to come up with how to employ that,” the manager says.

Photo: NASA/Reuters/File
The Kodiak Launch Complex in Kodiak, Alaska, is pictured in an undated handout photo from NASA. The U.S. military tested a hypersonic weapon at Kodiak on August 25, 2014, but the mission was aborted after controllers detected a problem with the system, the Pentagon said.

The Pentagon recently released its own ethical guidelines regarding AI research. Its five headings are: Responsible, Equitable, Traceable, Reliable, and Governable. The last makes clear that humans must be able to override an intended AI decision. AI should keep humans “in the loop” by “possessing the ability ... to disengage or deactivate deployed systems that demonstrate unintended behavior,” the guidelines read.
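In software terms, “governable” boils down to a human gate and a kill switch in the decision chain. The following is a conceptual sketch of that idea – hypothetical code of our own, not the Pentagon’s implementation or any real system:

```python
# Conceptual sketch of a "governable" decision chain: the automated
# system only recommends; a human approves, overrides, or disengages.
# Everything here is hypothetical illustration.
from dataclasses import dataclass

@dataclass
class Recommendation:
    action: str
    confidence: float

def automated_assessment(sensor_track) -> Recommendation:
    # Placeholder for whatever model scores the track (assumed).
    return Recommendation(action="intercept", confidence=0.93)

def decide(sensor_track, human_approval_fn, kill_switch_engaged: bool) -> str:
    """No action is taken without an explicit human decision, and a
    kill switch can deactivate the deployed system outright."""
    if kill_switch_engaged:
        return "system disengaged"
    rec = automated_assessment(sensor_track)
    # The human stays "in the loop": the recommendation is not self-executing.
    if human_approval_fn(rec):
        return f"execute: {rec.action}"
    return "override: no action"
```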

Not just weapons

Dan DeLaurentis, an aeronautics professor who directs the Institute for Global Security and Defense Innovation at Purdue University in Indiana, says AI plays an integral part in the hypersonic system by helping militaries to piece together lots of information about incoming missiles. But any concerns about robots running amok are unfounded, he adds. The Pentagon isn’t out to create “the unsupervised automated killing machines” of Hollywood dystopias.

Then there are thorny questions about the potential weaponization of space, compounded by the absence of binding treaties to ensure some measure of stability. The arms control treaties currently in place between the U.S. and Russia don’t cover hypersonic missiles; legal experts say the U.S. may propose adding them in the future.

“If we were in a situation where the arms control community and a number of treaties were still in place, this would be less troublesome than it is,” says Mr. Barrie. 

Dr. Lin argues that the benefits of hypersonic weapons, weighed against the risks they create, are “widely unclear,” as are the benefits of the AI systems that inform them.

“I think it’s important to remember that diplomacy works and policy solutions work. ... I think another tool in our toolbox isn’t just to invest in more weapons, but it’s also to invest in diplomacy to develop community.” 
