Does Tesla's Autopilot feature mislead drivers?

Consumer safety watchdogs question the ethics of releasing an auto safety feature that is still in beta testing, and ask whether the 'Autopilot' branding has led drivers to place too much trust in the technology.

The interior of a Tesla Model S is shown in Autopilot mode in San Francisco, April 7. Consumer safety watchdogs are calling for Tesla to pull back on its Autopilot feature in light of crash concerns.

Alexandria Sage/Reuters

July 15, 2016

Tesla has been asked to brief the US Senate committee that oversees auto safety about the May crash that killed the vehicle’s occupant while the car was in Autopilot mode.

Consumer safety watchdogs now say that Tesla may be moving too fast with the introduction of its Autopilot software.

Since the crash, Consumer Reports magazine has urged Tesla to disable the automatic steering mechanism, saying that marketing the mechanism as “Autopilot” may lead drivers to believe that they do not have to retain control of the car, despite Tesla’s assertions to the contrary.

"We're deeply concerned that consumers are being sold a pile of promises about unproven technology,” Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports said in the statement. "'Autopilot' can't actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time."

The crash that led the US Senate to summon Tesla for testimony occurred on May 7. Driver Joshua Brown was killed when Autopilot failed to recognize a tractor-trailer turning across the car’s path.

Some analysts suggest that calls to disable the Autopilot system are premature, pending a conclusive investigation into the crash.

"Despite an avalanche of hit-pieces and misguided op-eds," writes tech journalist Yoni Heisler for BGR, a consumer electronics publication, "it’s far too soon to say with any certainty that Tesla’s Autopilot software has been the direct cause of any specific crash."

In a blog post about the accident, Tesla wrote, “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied,” leading to Mr. Brown’s death.

Autopilot uses several tools, including cameras and radar, to survey the vehicle’s surroundings and perform functions such as braking at stoplights and changing lanes on the highway. Yet despite its name, Tesla says the system is intended to reduce the burden of driving on the vehicle’s operator, not remove it entirely.

Several consumer watchdogs, however, say that the name creates a false sense of security, and that Tesla should scale back its Autopilot program before more people get hurt. Consumer Reports asked Tesla to change the name of the program, which Tesla says is still in public beta testing.

Two other recent crashes have increased public scrutiny of Autopilot. Most recently, a Pennsylvania crash involving an Autopilot-enabled vehicle prompted a US National Highway Traffic Safety Administration (NHTSA) investigation.

Tesla's chief executive officer, Elon Musk, wrote in a tweet that while the driver in the Pennsylvania crash claimed Autopilot was turned on, the investigation showed that it was not. Had it been on, Mr. Musk said, the car would not have crashed.

A Montana crash on Sunday, however, occurred while Autopilot was engaged and the driver’s hands were off the wheel.

This week, NHTSA asked Tesla for records showing how often the Autopilot system warned drivers to keep their hands on the wheel.

Consumer Reports also says that Tesla should never have allowed consumers to purchase vehicles with a feature that, as the company itself says, is not yet out of beta testing.

“Consumers should never be guinea pigs for vehicle safety 'beta' programs,” said Ms. MacCleery. “At the same time, regulators urgently need to step up their oversight of cars with these active safety features. NHTSA should insist on expert, independent third-party testing and certification for these features, and issue mandatory safety standards to ensure that they operate safely."

Tesla said that while it appreciates recommendations from critics such as Consumer Reports and Computerworld, it will make decisions about the future of its products, including Autopilot, based on “real-world data.”