It had to happen eventually: Google self-driving car partially blamed for bus accident
In a case of “it had to happen eventually,” a Google self-driving car has been partially blamed for an accident with a bus on Valentine’s Day, February 14.
In a statutory filing (pdf) submitted to the California Department of Motor Vehicles, Google Automotive, a division of Google’s parent company Alphabet Inc., said the accident occurred while the self-driving Lexus was operating in autonomous mode, traveling eastbound on El Camino Real in Mountain View in the far right-hand lane as it approached the Castro St. intersection.
“As the Google AV approached the intersection, it signaled its intent to make a right turn on red onto Castro St,” the report reads. “The Google AV then moved to the right-hand side of the lane to pass traffic in the same lane that was stopped at the intersection and proceeding straight. However, the Google AV had to come to a stop and go around sandbags positioned around a storm drain that were [sic] blocking its path. When the light turned green, traffic in the lane continued past the Google AV. After a few cars had passed, the Google AV began to proceed back into the center of the lane to pass the sandbags. A public transit bus was approaching from behind.”
“The Google AV test driver saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google AV to continue,” the report goes on. “Approximately three seconds later, as the Google AV was reentering the center of the lane it made contact with the side of the bus. The Google AV was operating in autonomous mode and traveling at less than 2 mph, and the bus was traveling at about 15 mph at the time of contact.
“The Google AV sustained body damage to the left front fender, the left front wheel and one of its driver’s-side sensors. There were no injuries reported at the scene.”
Human test
This isn’t the first time a Google self-driving car has been involved in an accident (the cars are actually so conservative that they have been pulled over for driving too slowly), but it is the first case in which the vehicle was at least partially at fault. And while it clearly had to happen eventually, the incident shows that when push came to shove, the car failed the human test.
Google can build as many artificial intelligence algorithms into its vehicles as it likes, but humans don’t always behave logically. In this case the vehicle presumed that the much larger bus would yield, which is a logical assumption given that the Google car was in front of it. Many human drivers, however, know that larger vehicles such as trucks and buses often don’t yield: they have the advantage of size and dominance on the road, and they are far harder to maneuver in quick or emergency situations.
The one good thing to come out of the accident is that Google now has a concrete example of the fallacy of presuming logical behavior from human drivers, and it may be able to program its self-driving vehicles to be even safer than they currently are.
Image credit: markdoliner/Flickr/CC by 2.0