Driverless Danger?

The Tempe, Ariz., death raises questions about the operational readiness of autonomous vehicles and how to determine responsibility for system failures


A self-driving SUV owned and operated by Uber struck and killed a pedestrian in Tempe, Ariz., March 18 in the first recorded fatality involving an autonomous vehicle (AV). Uber has since halted testing of self-driving cars on public roads.

The vehicle, a Volvo, was in full autonomous mode but was occupied by a human “safety driver” able to take manual control in the event of a system malfunction. The local police department, the National Transportation Safety Board, and the Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) have all launched investigations of the collision, hoping to discover exactly why the car was unable to detect and react to the pedestrian. The victim was identified as 49-year-old Elaine Herzberg.

On Wednesday, Tempe police released graphic video footage captured by the Uber vehicle just before the crash. In the video, Herzberg emerges from a shadow directly in front of the vehicle. With the lighting shown in the footage, it would have been difficult for a human driver to see Herzberg in the road with enough time to avoid a collision, though a human may have been able to swerve or slam the brakes at the last second. The AV didn’t change course or speed, indicating it detected no reason to react and suggesting a failure of the vehicle’s LiDAR and/or radar sensors, which should have detected Herzberg even in the dark.

Police also released internal dash-cam footage of the vehicle’s safety driver in the seconds before the collision. The driver appears to be distracted and can be seen looking down and away from the windshield for a considerable amount of time before looking up at the moment of impact. This raises questions about the practices of Uber’s safety drivers and suggests self-driving technology may still be too immature to operate without a constantly attentive human co-pilot.

Legally, because Herzberg crossed in front of a moving vehicle, in the dark, and outside of a crosswalk, it would be difficult to find Uber solely responsible.

“I suspect preliminarily it appears that the Uber would likely not be at fault in this accident,” said Tempe Police Chief Sylvia Moir in an interview with the San Francisco Chronicle. But, she added, “I won’t rule out the potential to file charges against the (backup driver) in the Uber vehicle.”

The fatal accident raises many questions that venture into new legal, and perhaps ethical, territory. 

But what if the AV’s autonomous system is found at fault? The underlying question is: at what level do we begin holding such systems, or their creators and owners, responsible for their actions? How would such a situation be prosecuted? Lawmakers will surely receive pressure from the tech community and the public to provide answers before self-driving cars become widely operational.

Additionally, the crash highlights the need for redundant safeguards when testing imperfect, emerging technologies in public spaces, and it is a reminder of the progress still needed in the driverless domain. CNET reports that some regulators (and many civilian drivers) are growing more concerned, questioning whether autonomous cars are fit for public roads at all.

While this incident was tragic, it likely doesn’t mark the end of the driverless revolution. According to NHTSA data, there were 37,461 vehicle-related fatalities in 2016—all with humans behind the wheel. Some proponents of AVs argue the technology is actually more reliable than a human at the wheel and may help reduce these statistics. Driverless vehicles don’t necessarily need to achieve perfection—but they should improve upon the safety standards set by human operators.

Human drivers possess a defensive instinct that protects us on the road. We intuitively know how to react when humans behave unpredictably. We are taught in driver’s education to anticipate that the child playing in the yard up ahead could dash into the street after a soccer ball. Until AVs can adjust as reliably as humans or better, human drivers will continue to rule the world’s highways. Once these vehicles earn human trust, however, the public may grow more accepting of occasional accidents, as is the case with traditional vehicles.

Photo Credit: Natalie Behring/Reuters
