Will fatal Tesla Model S crash put the brakes on self-driving cars?

Federal highway officials are investigating a fatal accident involving a Tesla Model S that was operating in Autopilot mode.

The interior of a Tesla Model S is shown in Autopilot mode in San Francisco. Federal investigators have launched a probe into a fatal crash involving a Tesla Model S on Autopilot, the National Highway Traffic Safety Administration announced Thursday.

Alexandria Sage/Reuters

July 1, 2016

Federal highway officials are investigating a fatal accident involving a Tesla Model S that was operating in Autopilot mode when the system failed to automatically apply the car's brakes, allowing it to crash into a tractor-trailer.

The National Highway Traffic Safety Administration (NHTSA) said it is investigating 25,000 Model S sedans equipped with the Autopilot system, which allows for hands-free driving, a feature that has been both admired for its innovation and criticized as having been launched too early. If the investigation finds that the vehicles are unsafe, the agency could order a recall.

The safety of features that take over some of the steering and braking from drivers is widely debated at a time when many automakers, and other companies such as Google, are working furiously to build self-driving cars. Some believe these vehicles will begin appearing on US roads in 2020.


The May 7 accident killed the car's 40-year-old driver, identified as Joshua Brown of Canton, Ohio. Mr. Brown was driving in Williston, Fla., using Tesla's Autopilot feature, launched in October, when a tractor-trailer made a left turn in front of his car at an intersection, according to preliminary reports from the Florida Highway Patrol.

Tesla, maker of popular luxury electric cars, said in a blog post Thursday that "neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

To help keep regulations on pace with innovation, the NHTSA said in December that it is "raising the bar" on its 5-Star Safety Ratings, which appear on the window stickers of new cars. The change is meant to encourage development of even more sophisticated safety technologies, the same technologies that will also help make driverless cars possible and safe.

Tesla updated the Autopilot feature in its Model S sedans in January, limiting its hands-free capabilities.

The carmaker restricted the feature on residential roads, or roads without a center divider, so that the car cannot drive faster than the posted speed limit plus 5 miles per hour.


When Tesla launched Autopilot in October, company founder Elon Musk cautioned that the highly anticipated function was in beta, or test mode, and did not recommend full "hands-off" driving.

"Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert," the company said on Thursday. "Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."

Despite the promise of better safety, getting drivers used to the idea of giving up control of their cars is one of the biggest barriers to widespread adoption of driverless cars.

According to a recent University of Michigan survey, two-thirds of drivers said they are moderately or very concerned about riding in a self-driving vehicle.

This report includes material from Reuters and the Associated Press.

[Editor's note: An earlier version misstated Mr. Brown's surname in one instance.]