After a year that saw traffic fatalities jump by the highest percentage in decades, it seems even the machines are having a hard time avoiding accidents.
One of Google’s self-driving cars drove into a municipal bus in Mountain View, California last month, resulting in a minor crash.
No one in either vehicle was injured during the February 14 incident, and both vehicles were moving very slowly at the time—the car at just 2 miles per hour and the bus at about 15 mph, according to reports.
Still, the accident is noteworthy as it may be the first such incident where the driverless vehicle may have been at fault.
Earlier this week, Google executives said the car bore “some responsibility” in the situation. They explained that the test driver, as well as the vehicle’s autonomous system, believed the bus would slow down or stop to let the car through while merging in traffic.
The accident occurred when the car re-entered the center of its lane and struck the side of the bus, damaging the car’s left front fender, front wheel, and a driver-side sensor.
The bus received minor damage in its “pivoting joint,” the flexible area in the middle of the vehicle.
In the wake of the incident, which is still under investigation by the Santa Clara Valley Transportation Authority, Google said it was working to refine the autonomous software, and that the incident could serve as a learning moment, allowing the cars to “understand” that larger vehicles like buses are less likely to yield.
It will also serve as a learning moment for insurance companies and the California Department of Motor Vehicles (DMV), which will need to parcel out the legal implications of a driverless vehicle causing an accident.
No official liability has yet been determined, and the California DMV has commented that it is not responsible for deciding who’s at fault in this situation, but that it would be working with Google to gather additional information.
Still, the incident comes at a pivotal moment for Google in particular and the self-driving car movement broadly. It occurred on the heels of an announcement by the National Highway Traffic Safety Administration (NHTSA) that an autonomous system could be considered the legal equivalent of a human driver.
The proclamation not only opens the door to federal funding for such projects but legitimizes an argument Google had been making that such vehicles would not need to include a steering wheel or pedals.
That idea was in stark contrast to regulations proposed by the California DMV requiring such functions to be in every self-driving car, in case a passenger needs to override the system.
Those regulations were based partially on information submitted to the state agency by all companies working on autonomous vehicle technology in California, showing the number of times the self-driving system had to be disengaged during test runs.
While a small number of those disengagements involved collisions with other cars, Google had never accepted fault for those crashes on behalf of its vehicles—until now.