Google computer ruled a 'driver' in a big win for autonomous car technology
The National Highway Traffic Safety Administration has determined that, for regulatory purposes, the software controlling autonomous cars can be considered a 'driver.' What, specifically, does that mean?
There are plenty of obstacles impeding Planet Earth's transition to autonomous cars, but one of the biggest involves resolving the many legal questions associated with self-driving vehicles.
In a huge win for fans of autonomous cars, though, the National Highway Traffic Safety Administration has just determined that, for regulatory purposes, the software controlling those vehicles can be considered a "driver." In a long letter to Google, the agency said:
"NHTSA will interpret 'driver' in the context of Google's described motor vehicle design as referring to the (self-driving system), and not to any of the vehicle occupants. We agree with Google its (self-driving car) will not have a 'driver' in the traditional sense that vehicles have had drivers during the last more than one hundred years."
Which sounds interesting, but what, exactly, does it mean in practical terms?
- For starters, autonomous cars won't necessarily need a human behind the wheel -- at least not to satisfy federal regulations. States may require them for a time, as California does, but as far as NHTSA and the Department of Transportation are concerned, there's no need. In their eyes, the autonomous car software is the driver.
- Along those lines, autonomous cars won't need steering wheels at all. Again, individual states may have more stringent requirements, mandating the presence of brake pedals, accelerators, and other tools for drivers to use if they need to take control from the autonomous car. But given NHTSA's ruling, we'd be surprised if states held out for long. Pressure from automakers, who would clearly be daunted by the complexities of designing autonomous vehicles for a patchwork of state regulations, could speed up the streamlining process.
- Automakers and/or software engineers may be held responsible for accidents. Questions of liability have loomed large in discussions of self-driving vehicles. NHTSA's ruling makes it clear that an autonomous car's software can legally be considered a "driver," and therefore, the designer of that software could be liable in the case of an accident.
What does the ruling not do?
- It doesn't change a range of federal standards that were written before autonomous vehicles were conceived. For example, many of the warning lights on your car's dashboard were designed to alert drivers to problems with the battery, transmission, tire pressure, and so on. For autonomous cars, those warnings may become unnecessary -- though of course, they'll have to be sent to the software driving the car.
- It doesn't establish guidelines for makers of autonomous cars to prove that their self-driving systems are up to snuff. The means of testing autonomous car systems to ensure that they work well in a range of conditions -- heavy traffic, snow, rain, etc. -- has yet to be worked out.
Bottom line: there's still a lot of work to be done, especially on the regulatory front, but NHTSA's ruling could dramatically accelerate the roll-out of autonomous cars.
The Christian Science Monitor has assembled a diverse group of the best auto bloggers out there. Our guest bloggers are not employed or directed by the Monitor and the views expressed are the bloggers' own, as is responsibility for the content of their blogs.