I take it back. Way back. In my May column I wrote about the Arizona fatality in March involving an Uber vehicle in autonomous mode. The facts were “a bit sketchy,” I admitted, but suggested that “in a sense they don’t matter. It’s about the optics.”
Well, it certainly was about the optics in the larger context of public acceptance of autonomous vehicles. And they were undeniably bad, even though it was the first fatal accident involving what people call a "robot" vehicle.
“I’d venture a guess that autonomy actually has little to do with this accident, that nothing could have prevented the woman’s death,” I wrote back in May. “There simply wasn’t time for any reaction, human or otherwise.”
But there actually was time, apparently, according to a 300-page report released by the Tempe Police Department in late June. And in fact the report blames the crash on distracted driving. Sound familiar?
To remind you of the circumstances: the Uber vehicle, a Volvo XC90 SUV, was doing 44 mph on a multi-lane roadway at night, apparently in Level Four autonomous mode, and simply failed to "see" Elaine Herzberg crossing the road while walking her bicycle. She was not at a crosswalk; she was jaywalking, in other words. A so-called "backup" human driver, Rafaela Vasquez, was present, though not actively driving. Worse than that, the police report says Vasquez was watching 'The Voice' on a cell phone, and that in the roughly 20 minutes before the crash her eyes were off the road some 32% of the time.
As a video the police released on Twitter makes clear, the driver saw the woman crossing the road only about half a second before impact. The car did not brake at all.
"Tempe Police Vehicular Crimes Unit is actively investigating the details of this incident that occurred on March 18th. We will provide updated information regarding the investigation once it is available. pic.twitter.com/2dVP72TziQ" (Tempe Police, @TempePolice, March 21, 2018)
But police say Vasquez could have seen the victim 143 feet away and stopped the Volvo some 43 feet before hitting Herzberg had she been paying attention.
“This crash would not have occurred if Vasquez would have been monitoring the vehicle and roadway conditions and was not distracted,” the report stated.
Confusing the issue, the vehicle's native collision-avoidance system had been partly disabled, according to a report by the National Transportation Safety Board. The system "saw" Herzberg with six seconds to spare but did not automatically apply the brakes as it ordinarily would, nor did it issue a warning to the driver. The automatic braking function had been turned off, the NTSB report said, "to reduce the potential for erratic vehicle behavior"; braking depended instead on human intervention. And therein lies Uber's big mistake, it would seem, alongside an apparent failure to screen its "backup" drivers effectively.
So I was pretty much correct in writing back in May that autonomy itself wasn't to blame here, but rather its management. What I didn't see was the egregious human error at play.
The most glaring lesson, as if we need to hear it again, is that even a couple of seconds of distraction can be deadly. This is the most dramatic example we might imagine, but in the end it came down to seconds. Like the time it takes to check text messages on your own cell phone.