The Navy's unmanned X-47B aircraft receives fuel from an Omega K-707 tanker plane (not shown) while operating in the Atlantic Test Ranges over the Chesapeake Bay, Maryland, in April. (Wolter/US Navy/Reuters)

Musk, Hawking, Chomsky: Why they want a ban on killer robots.

Leading researchers in robotics and artificial intelligence signed an open letter, published Monday, calling for a preemptive ban on autonomous offensive weapons.

A global arms race for killer robots? Bad idea.

That’s according to more than 1,000 leading artificial intelligence (AI) and robotics researchers, who have together signed an open letter, published Monday, from the nonprofit Future of Life Institute.

The letter calls for a ban on autonomous offensive weapons as a means of preventing just such a disaster, and is the latest word in the global conversation about the risks and benefits of AI weaponry.

Proponents of robotic weapons, such as the Pentagon, say that such technology could increase drone precision, keep troops out of harm’s way, and reduce emotional and irrational decisionmaking on the battlefield, The Christian Science Monitor’s Pete Spotts reported last month.

Critics, however, warn that taking humans out of the equation could lead to human rights violations as well as trouble around international laws governing combat, Mr. Spotts wrote.

The letter sides with the critics:

If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce.... Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group.

“We therefore believe that a military AI arms race would not be beneficial for humanity,” the letter goes on to say.

Among the signatories are renowned physicist Stephen Hawking, Tesla Motors Chief Executive Officer Elon Musk, cognitive scientist Noam Chomsky, and Apple co-founder Steve Wozniak, as well as top AI and robotics experts from the Massachusetts Institute of Technology, Harvard University, Microsoft, and Google.

Dr. Hawking in particular summoned images of the Terminator wreaking havoc on humans when he told the BBC in a 2014 interview, “The development of full artificial intelligence could spell the end of the human race. It would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded.”

Others are less dire in their pronouncements.

“We’re not anti-robotics and not even anti-autonomy,” Stephen Goose, one of the signatories and director of arms-control activities at Human Rights Watch, told the Monitor. “We just say that you have to draw a line when you no longer have meaningful human control over the key combat decisions of targeting and attacking.”

The problem is what is meant by “meaningful human control” – an idea that is “intuitively appealing even if the concept is not precisely defined,” according to the United Nations Institute for Disarmament Research.

To further complicate the issue, others point out that a preemptive ban, such as the one advocated in the open letter, could foreclose the development of AI technology that could save lives.

“It sounds counterintuitive, but technology clearly can do better than human beings in many cases,” Ronald Arkin, an associate dean at the Georgia Institute of Technology in Atlanta whose research focuses on robotics and interactive computing, told the Monitor. “If we are willing to turn over some of our decisionmaking to these machines, as we have been in the past, we may actually get better outcomes.”

One thing most experts do agree on is that further debate is critical to determining the future of AI in warfare.

“Further discussion and dialogue is needed on autonomy and human control in weapon systems to better understand these issues and what principles should guide the development of future weapon systems that might incorporate increased autonomy,” wrote Michael Horowitz and Paul Scharre, both from the Center for a New American Security.
