Unmanned drone attacks and shape-shifting robots: War's remote-control future

The Pentagon already includes unmanned drone attacks in its arsenal. Next up: housefly-sized surveillance craft, shape-changing 'chemical robots,' and tracking agents sprayed from the sky. What does it mean to have soldiers so far removed from the battlefield?

With the advent of the US wars in Iraq and Afghanistan, however, technology has once again rendezvoused with military necessity. A company called iRobot in Bedford, Mass., sent a prototype of its PackBot to Afghanistan, where soldiers began using it to clear caves and bunkers suspected of being mined. When the testing period was over, "The Army unit didn't want to give the test robot back," Mr. Singer notes.

While the use of robots that can detect and defuse explosives is growing exponentially, the next big frontier for America's military R2-D2s may parallel what happened to drones: They may be fitted with weapons – offering new fighting capabilities as well as raising new concerns.

Already, researchers are experimenting with attaching machine guns to robots that can be triggered remotely. Field tests in Iraq for one of the first weaponized robots, dubbed SWORDS, didn't go well.

"There were several instan­ces of noncommanded firing of the system during testing," says Jef­frey Jacz­kow­ski, deputy manager of the US Army's Robotic Systems Joint Project Office.

Though US military officials tend to emphasize that troops must remain "in the loop" as robots or drones are weaponized, there remains a strong push for automation coming from the Pentagon. In 2007, the US Army sent out a request for proposals calling for robots with "fully autonomous engagement without human intervention." In other words, the ability to shoot on their own.

"Let's put it this way," says Lt. Col. David Thomp­son, project manager of the Army's robotic office. "We've seen the success of unmanned air vehicles that have been armed. This [weaponizing robots] is a natural extension."

At the Georgia Institute of Technology in Atlanta, Ronald Arkin is researching a stunning premise: whether robots can be created that treat humans on the battlefield better than human soldiers treat each other. He has pored over the first study of US soldiers returning from the Iraq war, a 2006 US Surgeon General's report that asked troops to evaluate their own ethical behavior and that of their comrades.

He was struck by "the incredibly high level of atrocities that are witnessed, committed, or abetted by soldiers." Modern warfare has not lessened the strain on soldiers, he argues: it is as stressful as ancient hand-to-hand combat with axes because of the quick decisions that fighting with modern technology requires.

"Human beings have never been designed to operate under the combat conditions of today," he says. "There are many, many problems with the speed with which we are killing right now – and that exacerbates the potential for violation of laws of war."

With Pentagon funding, Dr. Arkin is looking at whether it is possible to build robots that behave more ethically than humans – machines that would not be tempted, for instance, to shoot someone out of fear or revenge.

The key, he says, is that the robot should "first do no harm, rather than 'shoot first, ask questions later.' "

Such technology requires what Arkin calls an "ethical adaptor" – a system that follows explicit, human-given orders rather than rules it learns on its own. Learning, he explains, is potentially dangerous when it comes to making decisions about whether to kill. "You don't want to hand soldiers a gun and say, 'Figure out what's right and wrong.' You tell them what's right and wrong," he says. "We want to do the same for these robotic systems."
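
To make that distinction concrete, the sketch below is a minimal illustration in Python – with invented rule and field names, not Arkin's actual architecture – of the basic shape of such a system: every proposed use of force is checked against fixed, human-authored constraints, nothing is learned, and the default answer is to hold fire.

# A minimal, hypothetical illustration of a rule-based engagement check in
# the spirit Arkin describes: the constraints are hard-coded by humans,
# never learned, and the default is to refuse. All names are invented for
# this example and do not reflect Arkin's actual system.

from dataclasses import dataclass

@dataclass
class EngagementRequest:
    target_confirmed_hostile: bool   # positive identification
    civilians_at_risk: bool          # collateral-damage estimate
    human_authorization: bool        # operator "in the loop" sign-off

# Human-authored rules of engagement: the robot is told what is right
# and wrong, just as a soldier is.
CONSTRAINTS = [
    lambda r: r.target_confirmed_hostile,
    lambda r: not r.civilians_at_risk,
    lambda r: r.human_authorization,
]

def may_engage(request: EngagementRequest) -> bool:
    """'First do no harm': engage only if every constraint is satisfied."""
    return all(rule(request) for rule in CONSTRAINTS)

# No human authorization, so the answer is to hold fire.
print(may_engage(EngagementRequest(True, False, False)))  # False

The design choice mirrors Arkin's point: if any condition fails, the machine simply does not act – the opposite of "shoot first, ask questions later."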

The aim, says Arkin, is not to be perfect, "but if we can achieve this goal of outperforming humans, we have saved lives – and that is the ultimate benchmark of this work."
