The NTSB report goes on to say that the active safety features that come standard in the XC90 being tested, such as forward collision detection and automatic braking, likely would have reacted in time to reduce the severity of the collision that killed 49-year-old Elaine Herzberg. She was struck by an Uber self-driving car while crossing an unlit roadway at night in March 2018 in Tempe, Arizona.

But Uber’s software overrides the vehicle’s built-in safety features, a step the ride-sharing company says it takes “to reduce the potential for erratic vehicle behaviour,” according to the NTSB report. Instead, an Uber worker riding along in the driver’s seat is charged with monitoring the car’s performance, watching for potential crashes and taking control of the car in the event of an impending collision.

But that didn’t happen in the fatal March crash: the Uber staffer in the Volvo had been watching a TV show on her smartphone for the 42 minutes prior to the crash, according to a June 2018 report by the Tempe, Arizona police department.

The NTSB report shows that Uber’s sensors did detect Herzberg before the impact. About six seconds before the crash, the software classified her as an “unknown object,” then reassessed her as a vehicle and, finally, as a bicycle. At 1.3 seconds before the car hit Herzberg, Uber’s software indicated a need for evasive action, but because the computer controls don’t relay that warning to the vehicle’s operator, the woman behind the wheel didn’t know Herzberg was there until less than a second before impact, and didn’t apply the brakes until after making contact.

The Insurance Institute for Highway Safety (IIHS) says that in its testing, the Volvo XC90’s safety systems have proven capable of avoiding a pedestrian in the car’s path at speeds around 60 km/h. The Volvo that struck Herzberg was travelling just over that speed at the time of impact, which suggests that had Volvo’s safety features been active, they could have at least mitigated the severity of the impact and probably would have saved Herzberg’s life.

“What’s chilling is that the engineers behind Uber’s software program disabled the system’s ability to avoid a life-or-death scenario while testing on public roads,” says David Zuby, the IIHS’s chief research officer. “Uber decided to forgo a safety net in its quest to teach an unproven computer-control system how to drive.”

He called that “unacceptable,” adding that, “To be better than human drivers, automated systems have to make safer choices.”