An accident report has finally been released detailing what is believed to be one of the first accidents caused by a Google car, after it collided with a bus in Mountain View on Valentine’s Day.
Although the accident occurred more than 15 days ago, it was only disclosed yesterday in an accident report filed with the Department of Motor Vehicles and posted by the agency.
The bus was traveling at about 24km/h and the Google-driven car at about 3km/h at the moment of collision; no one was injured.
According to Google, the vehicle was attempting to manoeuvre around a few loosely placed sandbags on the street when it collided with the front left-hand side of the bus.
Under state law, the vehicle must have a human companion at all times when traveling on public roads in “autonomous mode”. The test driver, who is required to take control in emergency situations, believed the bus would yield and was not in control when the incident occurred.
If the Google car is found to be the cause of the incident, it would be the first time an accident was caused by one of the vehicles while in autonomous mode.
Department of Motor Vehicles spokesperson Jessica Gonzalez said the agency hoped to speak with Google on Monday about what went wrong.
In a detailed monthly report, Google said the accident occurred on a busy six-lane boulevard with many intersections.
The vehicle had been following a recent software update termed “Spirit of the road”, hugging the far right-hand side of its lane to allow other cars to pass on the left.
“It then detected sandbags near a storm drain blocking its path, so it needed to come to a stop. After waiting for some other vehicles to pass, our vehicle, still in autonomous mode, began angling back toward the centre of the lane at around 2mph – and made contact with the side of a passing bus traveling at 15mph. Our car had detected the approaching bus, but predicted that it would yield to us because we were ahead of it,” Google said.
“Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.”
Google says it has refined its software following the incident, acknowledging that buses and other large vehicles are less likely to yield. “In this case we clearly bear some responsibility because if our car hadn’t moved there wouldn’t have been a collision.”
“We hope to handle situations like this more gracefully in the future.”
Hilary Rowen, a partner in the insurance regulation practice at Sedgwick LLP and an expert on self-driving cars and legal responsibility, said the case is a good example of a conundrum that will soon be common.
“Here, the software didn’t avoid the accident, but the human could have taken over,” she said. “Who’s at fault – the driver, the bus driver, or the software?”
Rowen said that in real-world situations, both the driver and the injured party will be incentivized to blame the software: if it is found at fault, the driver’s record stays clean and the injured party will likely receive a higher payout.
“Everybody’s going to be blaming the software all the time,” Rowen said. “All the time.” She still thinks insurance for autonomous cars will be cheaper than insurance for human-driven cars, because humans aren’t very good drivers.
“At a very visceral level, people will accept a higher chance of being maimed or killed by a human being than they will by being maimed or killed by software,” she said. “The self-driving car will likely be able to make better risk calculations.”
If the DMV considers the Google car to be at fault for the collision, it could be seen as a setback for the company’s ambitious autonomous vehicle plans.
The bus crash came just four days after a legal breakthrough for the self-driving project – the US National Highway Traffic Safety Administration told Google it would likely give the self-driving computer the same legal treatment as a human driver.
That decision would pave the way for self-driving cars without any typical controls, such as a steering wheel or pedals.