Why concerns about self-driving cars may require a more hands-on approach
With a demonstration of vehicles from BMW, GM, and Tesla on Tuesday, self-driving cars made a splash in Congress. But some lawmakers expressed concerns about the safety and security of autonomous vehicles as federal regulators mull creating a unified standard.
Congress’ first comprehensive look at self-driving cars was titled “Hands Off,” a seemingly appropriate moniker given carmakers’ and tech companies’ hopes for a unified, but also relaxed, approach to the regulations that could eventually help put the cars on roads across the country.
But during Tuesday’s hearing, some lawmakers from the Senate Commerce Committee indicated that the move toward self-driving cars may be more hands-on — literally in one case. A note of concern crept into questions about how the companies would address privacy issues and whether the cars could be remotely hacked.
“You can imagine in this world of cybersecurity and cyberattacks, imagine what would happen to autonomous vehicles to get hacked while they’re out on the road,” said Sen. Bill Nelson (D) of Florida. “One small defect could end up in a massive safety crisis.”
The lawmakers and representatives from Google, carmakers Delphi and GM, and ride-hailing firm Lyft – which recently partnered with GM – agreed that more robust federal standards were needed.
Anthony Foxx, the transportation secretary, has pledged to introduce federal standards in the next six months. Last month, vehicle safety regulators also told Google in a letter that its self-driving-car software could be considered a driver under federal law.
But as many states have considered allowing companies to test the technology, the carmakers have decried what they call “a patchwork” of conflicting state laws.
“If every state is left to go its own way without a unified approach, operating self-driving cars across state boundaries would be an unworkable situation and one that will significantly hinder...the eventual deployment of autonomous vehicles,” said Chris Urmson, head of Google’s self-driving car project.
The tech giant, which says it has driven more than 1.4 million miles on roads in California, Texas, and Washington state, has pushed back against California’s draft rules, which require the cars to have manual controls and a licensed driver in order to conduct tests.
During Tuesday’s hearing, questions about the cars' security remained.
Under harsh questioning from Sen. Ed Markey (D) of Massachusetts and Sen. Richard Blumenthal (D) of Connecticut about whether they would embrace mandatory standards to prevent cars from being hacked, many of the carmakers demurred or gave what the lawmakers said were non-committal answers.
“We are in support of well-thought-out cybersecurity principles; whatever principles are put in place need to be standard across the states,” said Joseph Okpaku, vice president of government relations at Lyft.
“I understand what you’re saying,” responded Senator Markey, “but witnesses sat here 30 years ago and said the same thing about airbags, that companies would protect people, and moving forward we just want to make sure that people are protected.”
Mary Cummings, a robotics professor at Duke University who studies interactions between humans and technology, called the cars “one big data-gathering machine.” It was unclear who would have access to data generated from cars with autonomous features, such as information about where a car has traveled, she said.
Dr. Cummings also expressed skepticism about the carmakers’ assurances of safety, saying that with the technology constantly evolving, the National Highway Traffic Safety Administration, which told Google its cars could be considered a driver, wouldn’t be able to keep up with the need for safety tests.
“There is no question that someone is going to die in this technology, but the question is when and what can we do to minimize that,” she said. “I think I speak for many in the robotics community to say we’re strong advocates of this technology but if a death, a fatality, were to occur soon, at the wrong time, it could really set back the integration of this technology which I fully think will help prevent those deaths on the road."
She said she would hope to see data on safety tests made available to academics or other independent groups.
The carmakers sought to keep the focus on the cars’ benefits, particularly their potential to reduce the rate of accidents, many of which are caused by human error, federal regulators say.
Mike Ableson, vice president of strategy and global portfolio planning at GM, told the lawmakers that ride-sharing and car-rental programs could be a key way to get a larger number of people exposed to the technology, particularly in under-served communities where people often rely on public transportation.
“The first five minutes are often a little tense, but after, people think, ‘That car drives better than me.’ We’re fairly confident that once people try it out, they’re going to enjoy it,” said Dr. Urmson of Google.
Senator Nelson, the Senate committee’s ranking member, described a different experience. During a demonstration of several partially autonomous vehicles, he said, his instincts kicked in while riding in a Tesla. As the car approached a sharp turn onto DC’s I-395 on the way back to the Capitol, he said, it began speeding up toward a concrete wall. Alarmed, he engaged the car’s manual controls and slowed it down.
“I said, ‘What would have happened,’ ” Nelson said during the hearing. “[Test engineers from Tesla] said ‘If you had left your hands off the wheel, it would have made that sharp turn and come on around.’ So I’m here to tell you that I’m glad I grabbed the wheel,” he said, drawing laughs. “But we know if this is working as it apparently is, then there are many lives that could be saved by preventing accidents,” due to distracted driving.