Google's cars drive better than humans, but humans keep crashing into them anyway.
Human drivers may be the real cause of the vehicles’ recent fender benders: the cars keep getting rear-ended, perhaps because they distract us.
A Google self-driving car on the streets of Mountain View, California. Photograph: Google/Handout/EPA
Google’s self-driving cars are having a rough time on the streets of Mountain View, California. But a look at the evidence suggests it is human error, not the robots, that is to blame.
In recent months, Google’s fleet of experimental self-driving cars has suffered five minor accidents while driving 200,000 miles around this sleepy Silicon Valley suburb. That is nearly ten times the national average for ‘property damage only’ fender benders, according to the National Highway Traffic Safety Administration.
Using a public records act request, the Guardian has obtained a report of the most recent incident, filed by Google in early June with the California Department of Motor Vehicles. In the report, Google notes that a self-driving Lexus was struck from behind 17 seconds after stopping at a traffic light. The other vehicle, a Honda Accord, simply drove into the back of it.
Since April, Google’s Lexus SUVs have also been rear-ended by a BMW, a Toyota Camry and a Ford Expedition. In each case, the Google vehicle was either stationary or travelling at less than five miles per hour, giving its robotic driver no chance to avoid an impact. When robot cars meet in unfavourable circumstances, by contrast, they seem not to collide.
Last week it was reported that a self-driving Audi owned by Delphi Automotive took “appropriate action” to avoid one of Google’s self-driving Lexus cars after the Google car cut the Audi off on a Californian road.
But if Google’s self-driving algorithms are not to blame, why are its cars experiencing so many accidents?
One explanation could be the spinning laser scanners on their roofs, says Raj Rajkumar, designer of several autonomous cars at Carnegie Mellon University, including the winner of a 2007 self-driving vehicle competition run by Darpa, a US military research agency. “It is a distraction, and when people get distracted, I can imagine behaviours changing,” he says.
“Another reason could be that Google cars have the Google logo splashed on them, saying they are self-driving cars. People looking at that could be distracted from their normal mode of operations,” he adds. Rajkumar is now CEO of Ottomatika, a company that helped develop technology for the first vehicle to complete a transcontinental self-driving road trip, from San Francisco to New York, in March. He noticed that passing drivers would often whip out a phone to take photos or videos of his car.
Of course, the promise of self-driving cars is that they will reduce – or even eliminate – road traffic fatalities. “About 33,000 people die on America’s roads every year. That’s why so much of the enthusiasm for self-driving cars has focused on their potential to reduce accident rates,” says Chris Urmson, director of Google’s self-driving car program. He also points out that minor fender-benders like the ones in Mountain View often go unreported.
The few dozen experimental self-driving cars currently operating on public streets are packed with laser, radar, sonar and video sensors. This gives them a 360-degree view of the road ahead (and behind) that a human driver could never match. After travelling over 1.8 million miles in California, they have managed to avoid any serious accidents – and may have even prevented some from happening.
However, there has been virtually no research on how human motorists respond to robotic vehicles, says Anuj Pradhan, a behavioural scientist at the University of Michigan Transportation Research Institute (UMTRI). “We do not fully understand the human reaction where self-driving cars are involved,” he says. “It’s an important question that we haven’t started looking at yet.”
Two of his colleagues at UMTRI, Michael Sivak and Brandon Schoettle, believe that driving is far more of a human interaction than you might expect. They found that in several types of car crashes, male-to-male accidents are underrepresented and female-to-female crashes are overrepresented, suggesting that our perceptions of fellow motorists are critical. “Furthermore, in many situations, drivers make eye contact and proceed according to the feedback received from other drivers,” they say. “Such feedback would be absent in interactions with self-driving vehicles.”
When self-driving cars do become available to buy, they will be sharing the road with humans for decades to come. “Self-driving cars may have a ‘better’ driving style but it may not be a human driving style,” says Pradhan. “And that could affect how we predict or react to them.” He says that many self-driving car companies are now actively trying to humanise their algorithms to match the way people drive, slowing right down for curves, for instance, or hesitating at traffic lights.
One phenomenon that may help to reduce accidents in the short term: the distinctive and potentially distracting lidar sensors on top of vehicles are disappearing. In Google’s latest generation of self-driving cars, which received their permits to operate on California’s roads last week, the laser scanner has shrunk to a barely noticeable dome. Many other autonomous vehicles, including a Cadillac SRX built by Raj Rajkumar, hide them altogether. “We specifically made sure there was nothing on the car that makes it stand out,” he says.
Anuj Pradhan thinks a better approach might be to identify autonomous vehicles so that motorists can give them leeway. “Should self-driving cars have a special marking so we can react accordingly?” he wonders. “If I see a learner driver, I give it a little more following distance. Perhaps that’s how regular drivers would react to a self-driving car.”
Ultimately, say Michael Sivak and Brandon Schoettle, we should be realistic about just how safe self-driving cars will make our highways. “It is not a foregone conclusion that a self-driving vehicle would ever perform more safely than an experienced, middle-aged driver,” they say. “And during the transition period when conventional and self-driving vehicles share the road, safety might actually worsen, at least for conventional vehicles.”
Google’s rash of rear-end collisions might just be a coincidence, then, but we shouldn’t expect people to stop driving into robots anytime soon.