On February 14th, Google’s self-driving car was involved in a road traffic incident in California, according to the Inquirer. For some reason, the autonomous vehicle felt it had road priority over an oncoming bus. Could Google’s self-driving car be feeling a little too much self-importance on the road?
An accident report filed by Google said that the self-driving car was moving at about 2 mph when it collided with the side of the bus, which was traveling at about 15 mph. Thankfully no one was hurt, although the converted Lexus did suffer a bit: there was cosmetic damage to the front and side of the vehicle, and one of the sensors was also damaged.
The other vehicle involved was a human-controlled series 2300 New Flyer articulated bus. Most of us drivers know that buses, trucks and other large vehicles are less likely to give way on the roads; apparently, Google’s self-driving Lexus did not. In light of this, Google reviewed the accident and modified the software to “more deeply understand” that large vehicles, like buses and trucks, are less likely to yield to cars.
In its February monthly report, Google said: “This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements.”
Critics of self-driving cars have not been so forgiving of the incident. John Simpson, privacy project director at Consumer Watchdog, said in a release: “This accident is more proof that robot car technology is not ready for auto pilot and a human driver needs to be able to take over when something goes wrong.” Simpson later added, “The police should be called to the site of every robot car crash and all technical data and video associated with the accident must be made public.”
This isn’t the first bump in the road for self-driving cars. We covered the possible traffic and environmental issues that could arise if self-driving cars were to hit the streets — the very problems self-driving cars promised to fix. Now we’re dealing with a problem at the other end of the spectrum: the software itself.
We live in a very exciting time: we’re on the cusp of having a self-driving road network, with all of the advantages it promises to bring. That’s exactly why we need to be careful with our next steps. Accidents like these should be made public knowledge and researched intensely — not so we can halt the production of self-driving cars or prohibit them, but so we can ensure that when they do hit the roads, we’re as safe as possible.