For more than 100 years, the definition of a car’s driver was clear: it was the person sitting behind the steering wheel, operating the gas pedal and the brake. But with the advent of self-driving cars, that definition has become blurrier: is the “driver” the person behind the wheel, who may be ready to assume manual control of the car, or is it the software that guides the vehicle?
This definition matters because of the regulations governing how cars are constructed. If a car’s driver is necessarily human, the car has to have a steering wheel, pedals, and dashboard indicators for things like speed and tire pressure. But if a computer can be considered the driver, the car can be built without any of those features: as a closed, fully autonomous system that guides itself without direct input from the people inside.
Last November, Google, which has been working on self-driving tech since 2009, submitted a proposal to the National Highway Traffic Safety Administration (NHTSA) for a car without controls that a human could use to drive. In Google’s design, a human couldn’t take the wheel because there would be no wheel to take. And this week the NHTSA gave the company a thumbs-up on its design, ruling that a computer can be considered a “driver” for regulatory purposes.
Does that mean cars without steering wheels are legal now? Sort of. Until this week, self-driving car regulation in the US was the domain of individual states. Last December, for example, the California Department of Motor Vehicles released draft regulations for self-driving cars requiring that cars have full manual controls, and that a licensed driver be ready to take over in case self-driving software fails. California, Nevada, Michigan, and Florida, along with the District of Columbia, are currently the only jurisdictions that allow self-driving cars on public roads, and their laws are meant primarily to let companies such as Google test their cars in real-world situations.
The NHTSA’s ruling is the first step toward a set of federal rules governing self-driving cars. In January, Transportation Secretary Anthony Foxx said the government would do everything it could to help get self-driving cars onto public roadways in a safe but timely manner, and promised to release federal safety guidelines for self-driving technology by June 2016. Those federal regulations could override rules set by California and other states, freeing automakers from having to navigate a patchwork of potentially conflicting rules as they bring self-driving cars to market. States, however, would likely still be responsible for determining when and where self-driving cars are allowed.
There’s a long way to go before the rules governing self-driving cars are totally clear, and the NHTSA cautioned in its ruling that Google and other companies will still have to certify that self-driving software meets safety and reliability standards. But the regulatory pace surrounding self-driving cars may finally be quickening to match that of the technology itself.