The recent deaths of a motorist and a pedestrian in separate incidents in California and Arizona, respectively, both of which are under National Transportation Safety Board investigation, raise the question of whether autonomous driving technology has become safe enough for day-to-day roadway use.
A man from the San Francisco Peninsula died when his Tesla Model X crashed into a concrete barrier on Highway 101 in Mountain View, California; data from the vehicle log showed the car was in Autopilot mode at the time.
Other Tesla drivers reported similar experiences near this same freeway barrier and at others like it, corroborating accounts that the Autopilot system is unreliable near such dividers. At least one Tesla owner who drove past the barrier in question posted several videos showing Autopilot steering to the left, straight toward the divider.
Unless the Autopilot technology happened to malfunction independently in each of these cases, they collectively raise the question of why a correctly functioning car would behave this way, and whether the Autopilot feature should exist at all if it is not always safe to operate. The Tesla Autopilot system has been under review since the California man’s death.
A similar conversation has been underway in Arizona since a driverless Uber SUV, a 2017 Volvo XC90, struck and killed a pedestrian as she walked her bicycle across the street in Tempe. Police video shows the vehicle, in self-driving mode, mowing down 49-year-old Elaine Herzberg without slowing down or changing course.
An engineering analysis of the incident, based on still frames from both the onboard vehicle video and Google Street View, determined that Ms. Herzberg walked approximately 40 feet into the roadway before the accident. Based on a normal walking speed of 3 miles per hour, this would mean that she had been in the roadway for at least nine seconds—well beyond the 1½ seconds that experts have testified an alert driver needs to recognize and react to danger by stopping.
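The timing claim above can be checked with a few lines of arithmetic. The sketch below uses only the figures cited in the analysis (40 feet of roadway, a 3 mph walking speed, a 1½-second reaction time); the variable names are illustrative.

```python
# Rough check of the timing claim: a pedestrian covering ~40 feet of
# roadway at a normal walking speed of 3 miles per hour.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

distance_ft = 40.0   # distance walked into the roadway (from the analysis)
speed_mph = 3.0      # assumed normal walking speed

speed_ft_per_s = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR  # 4.4 ft/s
time_in_roadway_s = distance_ft / speed_ft_per_s               # ~9.1 s

reaction_time_s = 1.5  # alert driver's recognize-and-react time (expert testimony)
margin_s = time_in_roadway_s - reaction_time_s                 # ~7.6 s

print(f"Time in roadway: {time_in_roadway_s:.1f} s")
print(f"Margin over an alert driver's reaction time: {margin_s:.1f} s")
```

At 4.4 feet per second, Ms. Herzberg would indeed have been in the roadway for roughly nine seconds, about six times the reaction window an alert driver needs.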
In other words, an ordinary driver could have avoided the incident, yet the vehicle’s radar and lidar never “saw” Ms. Herzberg in time to avert the tragedy. This poses a stiff challenge to those who say driverless cars are a technology ready for widespread adoption by Uber or anyone else.
In reaction, Arizona Governor Doug Ducey sent a blistering letter to Uber Technologies Inc., saying that the police video of the collision “raises many questions about the ability of Uber to continue testing in Arizona. Improving public safety has always been the emphasis of Arizona’s approach to autonomous vehicle testing, and my expectation is that public safety is also the top priority for all who operate this technology in Arizona.”
In response, Uber announced via its Communications Team’s Twitter account that it had suspended testing of all self-driving vehicles not only in Phoenix but also in San Francisco, Pittsburgh and Toronto, the other three cities where such testing had been underway.
Stay tuned for the NTSB’s conclusions on each of these incidents.