In most places, automated driving is limited – both by technology and by law – to human-supervised joyrides, but a rule change in California could pave the way for a future where you can nap while your car drives you to work.
Nearly 30 companies are testing automated cars on California’s roads, led by the original self-driving car advocate, Google. Until now, regulations have required an alert human to sit behind the wheel, ready to take over at any time, and have limited the technology to testing rather than commercial use. On Friday, the state released proposed amendments to Department of Motor Vehicles regulations that lay the legal groundwork for cars to operate with no driver.
But some question whether the technology – or the public – is ready for computers to shift from driving assistants to full replacements.
The public has 45 days to comment, after which a public hearing will follow. These changes will bring California, the country’s biggest car market, up to speed with Michigan, which passed a similar law last December.
“These rules expand our existing autonomous vehicle testing program to include testing vehicles where no driver is present,” said DMV Director Jean Shiomoto in a press release. “This is the next step in eventually allowing driverless autonomous vehicles on California roadways.”
But the Golden State won’t be letting these fledgling robot drivers out completely on their own just yet. One provision requires companies to monitor vehicles remotely, with a human ready to step in if something goes wrong.
For now, it’s just a learner’s permit.
While proponents such as Eric Nobel, president of consulting firm CarLab, consider the changes “necessary and timely,” not everyone's a fan of the plan, despite the safeguards. One sticking point? Who ensures the cars are up to the task.
The law reads, "The manufacturer shall certify that the autonomous technology is designed to detect and respond to roadway situations in compliance with all provisions of the California Vehicle Code and local regulation applicable to the operation of motor vehicles, except when necessary for the safety of the vehicle’s occupants and/or other road users."
Many doubt that the companies themselves are the best judge of proficiency. "That’s like me going to the DMV and saying, 'Believe me, I’m an excellent driver,' " Ryan Calo, who studies robotics law at the University of Washington School of Law, told Wired. "It makes me a little nervous, honestly."
While the DMV describes the new rules as an expansion of existing laws, they represent a significant shift: from treating the technology as an occasional aid for human drivers to trusting it as a full replacement for them.
Such a change casts the language of the law, which grants cars the right to overrule the California Vehicle Code when necessary for safety, in a new light. Until now the buck stopped with the human driver, but when a truly self-driving car gets in an accident, who is responsible?
An emerging consensus seems to point to the manufacturers. For now, cars like Tesla’s require drivers to take responsibility, reminding them to keep their hands on the wheel, but Volvo has declared it will pay all damages caused by its IntelliSafe Autopilot system, currently set for a 2020 release. A 2014 Brookings Institution report also found that current product liability laws could support this perspective, Scientific American reports.
Yet even if your car manufacturer picks up the bill, when will people be ready to trust their well-being to a self-driving machine? Advocates point to Google’s stellar safety record and two key statistics: Driving with Tesla’s Autopilot reduces crashes by 40 percent, and human error accounts for 94 percent of crashes.
But other reports suggest that such claims of machine superiority are premature.
A University of Michigan study of Google, Audi, and Delphi driving records found that those systems have a significantly higher crash rate per million miles than human drivers, even though those miles were logged under easy testing conditions such as sunny weather and slow speeds (although Google is expanding testing to rainy Seattle). What’s worse, limited data gives the comparison a bit of an apples-and-oranges problem.
As of last fall, Google’s fleet had logged a collective 2 million miles. Because the cars share learning experience like a hive mind, the fleet holds an advantage over the average isolated human, who drives only about 13,000 miles a year – but in terms of collective safety statistics, we still know a lot more about humans.
Americans log over three trillion miles a year, more than a million times the lifetime mileage of Google’s fleet. Up against those numbers, a RAND Corporation report found that statistically demonstrating self-driving cars are as safe as humans would require hundreds of millions of test miles – and that it would take existing fleets “tens and sometimes hundreds of years to drive these miles.” That’s a tall order for the many companies predicting cars on the market in the early 2020s.
"Even if autonomous vehicle fleets are driven 10 million miles, one still would not be able to draw statistical conclusions about safety and reliability," RAND senior statistician Susan Paddock summarized in a company statement.
But for many, these engineering problems must be overcome as a matter of public health. Advocates point to the 30,000 to 40,000 annual fatalities caused by traffic crashes.
Moreover, automated driving could improve the lives of millions by granting more autonomy to the elderly and disabled, not to mention all that extra productive time people could enjoy during their commutes.
The technology even has the potential to completely revolutionize our transportation system, making traffic, parking lots, and gas stations a thing of the past – because who would choose the burden of owning and maintaining a car when one could be summoned on demand?
Going forward, all eyes will be on states like California and Michigan, waiting to see how car companies respond to the more lenient regulations and if they can deliver the miracles they’ve promised.
As goes California, so goes the country, self-driving-car researcher Bryant Walker Smith told Wired. “The DMV’s rules are going to shift a big part of the conversation to the federal level.”