Google says its self-driving cars are actively learning to recognize potentially dangerous situations, even while the cars are parked.
On Saturday, groups of children sporting a variety of costumes celebrated Halloween at the company’s Mountain View, Calif., headquarters. Google took the opportunity to ask the kids to walk around a series of parked self-driving cars, the company related in a blog post.
This procession of Halloween costumes served as a valuable learning tool for the cars, which the company has been actively testing on roads in California and Texas.
“This gives our sensors and software extra practice at recognizing children in all their unique shapes and sizes, even when they're in odd costumes,” Google wrote.
It’s also particularly valuable because children can move erratically – especially while in pursuit of candy – making them difficult to see behind parked cars and other objects. In response, the company says the cars have been “learning” to drive more cautiously around children.
Google has often touted the cars as a safer, more predictable alternative to human drivers, pointing out that human error is responsible for 94 percent of all accidents, according to the National Highway Traffic Safety Administration.
The company, which has logged about 1.7 million miles of self-driving and manual driving combined – roughly 10,000 miles a week – says its software is constantly learning to recognize a variety of dangerous situations on the road and figure out how to avoid them.
“As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer,” wrote Chris Urmson, director of the self-driving car project, in a post in July 2014. “As we’ve encountered thousands of different situations, we’ve built software models of what to expect, from the likely (a car stopping at a red light) to the unlikely (blowing through it).”
But the cars have been involved in a number of accidents – including one in July where a self-driving Lexus SUV was rear-ended by another car at an intersection, causing the driver in Google’s car to suffer “minor whiplash” and tearing the front bumper off the other car, according to a post by Mr. Urmson. Google says the accidents were caused by other drivers, noting that a key mission of the self-driving car project is to prevent distracted driving.
A recent study by researchers at the University of Michigan’s Transportation Research Institute found that self-driving cars were five times as likely to get into accidents as cars driven by humans, though the self-driving cars were not at fault. In the three-year study, researchers Brandon Schoettle and Michael Sivak found that self-driving cars had 9.1 crashes for every million miles driven, compared to 1.9 for human-driven cars.
But with just 11 reported self-driving car accidents and a total of only 50 self-driving cars on the road, compared with 269 million conventional vehicles, the researchers noted that their conclusions were somewhat limited.
But, Mr. Schoettle told Vox, it’s also possible that self-driving cars are getting into more accidents because they are behaving in ways that can surprise humans used to interacting with other human drivers.
Google has often responded that drivers should pay more attention to the road – with Mr. Urmson noting in a recent post that Google engineers observed people reading books or even playing a trumpet while driving.
The company says efforts like having the cars “observe” how children and pedestrians move will help its self-driving project do better.
“So even if our cars can’t quite appreciate the effort the kids put in dressing as their favorite character from Frozen, they’re still paying full attention,” the company wrote in its post.