There is incredible transparency, of a sort we have seen from no other team. Indeed, a gauntlet is now thrown in front of all other teams - if you're not this transparent, we will presume you are not doing so well. They have the ability to be that transparent because the numbers are good.

- In 6.1 million miles they report 30 "dings" with no injury expected, 9 with a 10% chance of injury, and 8 with airbag deployment but still a 10% chance of injury, suggesting fewer than 2 modest injuries.
- All the events had the other driver/road user at fault in some way under the vehicle code, according to Waymo.
- There were no single-vehicle incidents (i.e. driving off the road), which are pretty common with human drivers.
- Nationally, 6.1 million miles of driving by a good driver should result in about 40-60 events, most of which are small dings; 22-27 of them would involve an insurance claim, 12 would get reported to police, and 6 would be injury crashes.

With no at-fault events in 8 lifetimes of human driving, Waymo's performance is significantly superior to a human's, even in an easy place like Chandler.

Note that 62% of the events described did not actually happen. Rather, a safety driver intervened, and careful replay of the event in a simulator concluded some "contact" would have happened without that intervention. They play out all potential contacts to find the total crashes the vehicle would have had with no safety driver aboard. 18 of the contacts were real, 29 showed up in simulation. (65,000 of Waymo's miles were with no safety driver. One of the incidents took place in such a vehicle, which was rear-ended while it was decelerating.)

47 contacts in 6.1M miles, or 130,000 miles per contact, is slightly better than the estimated level for humans when you track every single contact, including those that cause no damage and never get reported to insurance or police. Waymo included things like a pedestrian walking into their stopped vehicle and even an impact with somebody who appears to have deliberately cut in front of their vehicle to test it. The pure number of 47 would be slightly better than human even if half of them had been the fault of the Waymo, rather than none.

Below, I'll get into some of the specific incidents and claims. Waymo also released a new document describing their safety processes and thinking. It's a more evolutionary document without grand new revelations, but it is another example of good transparency, and may be suitable for a later article.

Waymo believes that every contact involved improper action by the other party. Most of these were not handled by police with a formal declaration of fault. For example, they did hit a vehicle which cut in front of them and braked for no reason, but they believe the driver did it deliberately. Reading Waymo's descriptions, it's possible some blame might be attributed to Waymo on a couple of them, so Waymo in their formal claims says "nearly every contact."

Nonetheless there are two key possible factors of fault. First, in the 29 simulated contacts, the human safety driver was able to avoid the accident, even though fault lay with the other driver. And indeed, on the road it's quite common for a person to be the one who prevents an accident somebody else would have triggered. Clearly the Waymo software can still improve - it's not as good as the safety drivers.

Waymo responds that their safety drivers are trained to a level well above the typical driver. I did Waymo safety driver training 8 years ago, and it involved a special defensive driving and accident avoidance program and more; one presumes it's even better today. That's good, and a worthwhile bar to aim for, but it isn't a bar you have to meet in version 1.0. The Waymo safety drivers (two of them) are not driving; they are doing nothing but preparing to intervene. They are watching the road (and the software console) to be ready for potentially dangerous situations in a way regular human drivers aren't. The software will, hopefully, learn from every simulated contact what the safety drivers did to avoid the accident, so that the software becomes as good as that trained driver.

Perhaps the scariest example involved a driver coming the wrong way into the lane of the Waymo, on track for a head-on collision - a simulated head-on crash. In simulation, the Waymo came to a full stop in the lane but was still hit by the oncoming car in the virtual world. (They don't reveal how the crash was avoided in real life - perhaps the Waymo driver swerved in a way the software would not, or perhaps the oncoming driver came to his/her senses and swerved away.)

A few of the incidents might suggest partial Waymo fault. In two simulated crashes, the Waymo sideswiped another vehicle.
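The headline arithmetic in the report is easy to sanity-check. Here is a quick back-of-envelope sketch in Python, using only the figures quoted in the text; the miles-per-lifetime number at the end is merely what the "8 lifetimes of human driving" remark implies, not a figure from Waymo's report:

```python
# Back-of-envelope check of the figures quoted in the article.
# All inputs come from the reported numbers as cited in the text.

miles = 6_100_000            # total miles driven
real, simulated = 18, 29     # real-world vs. simulation-only contacts
contacts = real + simulated  # 47 total contact events

print(f"{miles / contacts:,.0f} miles per contact")          # ~129,787, i.e. ~130,000
print(f"{simulated / contacts:.0%} of contacts simulated")   # 62% "did not actually happen"

# Injury expectation: 9 events with ~10% injury chance, plus 8 airbag
# deployments also at ~10% -> "less than 2 modest injuries".
print(f"expected injuries: {9 * 0.10 + 8 * 0.10:.1f}")       # 1.7

# Implied by the "8 lifetimes of human driving" remark (an inference,
# not a number from the report):
print(f"{miles / 8:,.0f} miles per driving lifetime")        # 762,500
```

Each quoted ratio checks out against the raw counts, which is itself a small point in favor of the report's internal consistency.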