Google teaches ethics to driverless cars. Can they react better than humans?

Google has been able to program cars to avoid accidents, but what will the cars do when there is no good decision? That's why Google is teaching them ethics.


A large truck speeding in the opposite direction suddenly veers into your lane.

Jerk the wheel left and smash into a bicyclist?

Swerve right toward a family on foot?

Slam the brakes and brace for head-on impact?

Drivers make split-second decisions based on instinct and a limited view of the dangers around them. The cars of the future — those that can drive themselves thanks to an array of sensors and computing power — will have near-perfect perception and react based on preprogrammed logic.

While cars that do most or even all of the driving may be much safer, accidents will still happen.

It's relatively easy to write computer code that directs the car how to respond to a sudden dilemma. The hard part is deciding what that response should be.

"The problem is, who's determining what we want?" asks Jeffrey Miller, a University of Southern California professor who develops driverless vehicle software. "You're not going to have 100 percent buy-in that says, 'Hit the guy on the right.'"

Companies that are testing driverless cars are not focusing on these moral questions.

The company most aggressively developing self-driving cars isn't a carmaker at all. Google has invested heavily in the technology, driving hundreds of thousands of miles on roads and highways in tricked-out Priuses and Lexus SUVs. Leaders at the Silicon Valley giant have said they want to get the technology to the public by 2017.

For now, Google is focused on mastering the most common driving scenarios, programming the cars to drive defensively in hopes of avoiding the rare instances when an accident is truly unavoidable.

"People are philosophizing about it, but the question about real-world capability and real-world events that can affect us, we really haven't studied that issue," said Ron Medford, the director of safety for Google's self-driving car project.

One of those philosophers is Patrick Lin, a professor who directs the ethics and emerging sciences group at Cal Poly, San Luis Obispo.

"This is one of the most profoundly serious decisions we can make. Program a machine that can foreseeably lead to someone's death," said Lin. "When we make programming decisions, we expect those to be as right as we can be."

What right looks like may differ from company to company, but according to Lin automakers have a duty to show that they have wrestled with these complex questions — and publicly reveal the answers they reach.

Lin said he has discussed the ethics of driverless cars with Google as well as automakers including Tesla, Nissan and BMW. As far as he knows, only BMW has formed an internal group to study the issue.

Many automakers remain skeptical that cars will ever operate completely without drivers, at least in the next five or 10 years.

Uwe Higgen, head of BMW's group technology office in Silicon Valley, said the automaker has brought together specialists in technology, ethics, social impact, and the law to discuss a range of issues related to cars that do ever-more driving instead of people.

"This is a constant process going forward," Higgen said.

To some, the fundamental moral question doesn't ask about rare and catastrophic accidents but rather how to balance appropriate caution over introducing the technology against its potential to save lives. After all, more than 30,000 people die in traffic accidents each year in the United States.

"No one has a good answer for how safe is safe enough," said Bryant Walker Smith, a law professor who has written extensively on self-driving cars. The cars "are going to crash, and that is something that the companies need to accept and the public needs to accept."

And what about government regulators — how will they react to crashes, especially those that are particularly gruesome or the result of a decision that a person would be unlikely to make? Just four states have passed any rules governing self-driving cars on public roads, and the federal government appears to be in no hurry to regulate them.

In California, the Department of Motor Vehicles is discussing ethical questions with companies, but isn't writing rules.

"That's a natural question that would come up and it does come up," said Bernard Soriano, the department's point man on driverless cars, of how cars should decide between a series of bad choices. "There will have to be some sort of explanation."
