
The aftermath of a collision between a backing tractor-trailer and an autonomous shuttle bus in Las Vegas, Nevada on Nov. 8, 2017.

Screenshot: Kathleen Jacob/KVVU-TV image

The lunatics have taken over the asylum, folks. Taking responsibility for the operation of your own truck is no longer enough. It appears drivers may now be held at least partially responsible for any "interactions" they might have with driverless, autonomously controlled vehicles. The National Transportation Safety Board has concluded that the actions of a human truck driver were the "probable cause" of a fender-bender crash that occurred between a backing tractor-trailer and a driverless shuttle bus in Las Vegas back in November 2017.

The NTSB released its Highway Accident Brief on the collision last week, detailing the agency’s investigation of the Nov. 8, 2017 collision between a commercial truck and an autonomous shuttle in Las Vegas. In the Brief, the NTSB noted, "... the probable cause of the collision between the truck-tractor and the autonomously operated shuttle in Las Vegas, Nevada, was the truck driver’s action of backing into an alley, and his expectation that the shuttle would stop at a sufficient distance from his vehicle to allow him to complete his backup maneuver. Contributing to the cause of the collision was the [shuttle] attendant’s not being in a position to take manual control of the vehicle in an emergency."

Press reports from the time of the collision were already pointing fingers at the truck driver, and he was cited for "illegal backing," though the reasons for the charge were never fully explained. In a commentary I wrote at the time, I hypothesized that the driver had probably assumed the shuttle would stop before getting too close to the backing truck, and the NTSB report confirms that's exactly what happened.

Comments from the driver contained in the full NTSB report (see partial transcript below) indicate that he saw the shuttle halfway down the street while he was setting up for the backing maneuver and, as might be expected, was focused on cars and pedestrians in his immediate vicinity. "[The driver] stated that he assumed the shuttle would stop a 'reasonable' distance from the truck. The driver said that he looked back to the left and saw a pedestrian in the alley. He waited until the pedestrian cleared and then turned his attention to the right, which was when his truck hit the shuttle."

Partial transcript of NTSB's interview of the truck driver:

"I wanted to make sure that it wouldn't skim any of the cars that were parked diagonally in front of me, so I turned my head back to the front of the vehicle, looked down at the nose of the truck. I also glanced to the right really quick and I noticed the shuttle was about halfway, still halfway down from where my truck was backing into the alleyway.

So I figured okay, any reasonable assumption made would be that he would stop a reasonable distance from a backing tractor trailer. And so I turned -- okay, so I looked back to the left, and when I was looking back to the left in the alleyway, I noticed there was a pedestrian that was walking up the alleyway. I made eye contact with him. He made eye contact back. Once he was clear of the alleyway, I turned back to the right in anticipation of the truck straightening out, and that's when the collision occurred with the shuttle. It had impeded on the traffic lane that I had already set beforehand, and it was in my blind side as well. So there was no warning."

According to data downloaded from the shuttle, its sensor array detected the truck when it was nearly 150 feet away and tracked the truck continuously while it backed up. The shuttle, which was programmed to stop about 10 feet from any obstacle in its path, began to decelerate when it was about 100 feet from the truck. When the shuttle was about 10 feet from the truck and at nearly a complete stop, the shuttle attendant pressed one of the emergency-stop buttons inside "as a precaution."
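
To make the reported behavior concrete, here's a minimal, purely hypothetical sketch of a fixed-threshold stopping policy like the one described in the brief. The 100-foot and 10-foot figures come from the report; everything else, including the names and the linear speed ramp, is my own assumption, not the shuttle's actual software.

```python
# Hypothetical sketch of a fixed-threshold stopping policy like the one the
# report describes. Not the shuttle's actual software; the 100 ft and 10 ft
# figures come from the brief, everything else is assumed for illustration.

DECEL_START_FT = 100.0  # begin slowing when the obstacle is this close
STOP_AT_FT = 10.0       # stop and hold this far from the obstacle

def speed_command(distance_to_obstacle_ft: float, cruise_speed: float) -> float:
    """Return a target speed based only on range to the nearest obstacle."""
    if distance_to_obstacle_ft <= STOP_AT_FT:
        return 0.0  # stop and hold, even if the obstacle keeps approaching
    if distance_to_obstacle_ft <= DECEL_START_FT:
        # ramp the speed down linearly between the two thresholds
        fraction = (distance_to_obstacle_ft - STOP_AT_FT) / (DECEL_START_FT - STOP_AT_FT)
        return cruise_speed * fraction
    return cruise_speed
```

Notice what a rule like this ignores: the command depends only on the current distance, so once the vehicle stops, it simply holds position no matter what the obstacle does next.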

The report states the truck driver continued in reverse for another 11 seconds before the collision occurred, striking the stopped shuttle bus on its left fender with the truck's right steer tire. The bump left a few scuff marks on the truck's tire and a dent in the shuttle's fiberglass fender.

Here's where it gets bizarre: there was an attendant on board the shuttle whom the NTSB says was a Class-B-licensed former motor coach driver with three years' experience. The report says the attendant pressed the emergency stop button just 10 feet from the backing truck, while the shuttle was still moving slowly. There was also a manual control device on board the shuttle, similar to a video-game controller, which the attendant used to operate the bus when it was not on its predefined route, such as when steering around parked cars or driving it onto a tow truck so it could be repositioned.

At the time of the collision, the controller was stowed in a compartment the attendant could not conveniently access.

"The attendant told investigators that several seconds passed after the shuttle stopped before the truck collided with it. The attendant believed the shuttle was visible to the truck driver in the right-side mirror from the time the shuttle stopped until the collision. The attendant said that he considered switching to manual mode to move the shuttle, but that he had very little time. He further stated that manual mode was not designed or intended to be used as an emergency mode," the report states.

So, if I have this right: we have an autonomous shuttle bus equipped with all sorts of sensors and cameras that could (and did) detect the truck, yet the shuttle's programming took no evasive action to avoid the collision. A former motor coach driver acting as an attendant on board the shuttle watched the event unfold but made no attempt at evasive action until the truck was right on top of the shuttle. And still, the truck driver's actions were cited as the probable cause of the collision.


The shuttle's sensor array detected the truck about 150 feet away but waited until it was about 10 feet from the truck before it stopped.

Photo: NTSB

Dump Your Expectations

Driving is all about expectations and our reactions to those expectations. For example, when approaching an intersection on a green light, we may see a driver opposite us setting up to make a left turn across our path. We expect that driver to wait until we have cleared the intersection before proceeding. Similarly, when driving at highway speed, we expect the driver ahead of us to maintain his or her speed and not arbitrarily slam on the brakes for no obvious reason.

I know the world around us today seems generally less predictable than it once was, for all kinds of reasons, but it seems we now have to start driving with the assumption that the autonomous vehicles we will soon be sharing the road with will be just as unpredictable as motorists whose attention might be elsewhere, like on a phone or a billboard sign.

Will we have to start with a clean sheet of expectations — and thus assumptions — that autonomous vehicles might behave in ways their programmers deem reasonable, but which defy decades of human experience?   

The NTSB missed the obvious on this one, or maybe chose to ignore it. The agency faulted the truck driver for assuming the shuttle would behave as a human driver would and stop before getting too close to the truck, and the shuttle attendant for not being in a position to take manual control of the vehicle in an emergency.

And besides, if this was a truly autonomous vehicle, what the heck was the attendant doing there in the first place, aside from helping passengers on and off the bus? If the attendant could be found partially to blame in this crash, he should have had complete control of the vehicle. Which raises the question: if you have an attendant on board, why not just let the attendant drive the thing?

That the shuttle did not stop and reverse on its own is troublesome. But that the attendant could have stopped and reversed the shuttle, even if it was not company policy, yet failed to do so is, to my mind, a bigger problem than truck drivers' long-held belief that vehicles will behave in certain ways because humans are controlling them.

The NTSB did not find any culpability on the part of the shuttle's programmers for not having it anticipate such a situation. Yes, the shuttle stopped about 10 feet from the truck, but it failed to consider that the truck, just an obstacle as far as the shuttle was concerned, was still moving. Was the shuttle incapable of shifting into reverse and getting the heck out of the way, as a human might have? I'd call that a shortfall in the programming and worthy of at least some responsibility.
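
For illustration only, the kind of behavior I'm describing might look something like the sketch below. Again, this is hypothetical, with assumed names and thresholds; nothing here comes from the report or the shuttle's real software.

```python
# Hypothetical sketch of a "retreat" behavior: back away from an obstacle
# that is still closing instead of stopping and waiting to be hit.
# All names and thresholds here are assumed for illustration.

STOP_AT_FT = 10.0          # hold position this far from a static obstacle
RETREAT_TRIGGER_FT = 15.0  # assumed: react earlier when the obstacle is closing
RETREAT_SPEED_FT_S = -2.0  # assumed creep-in-reverse speed

def speed_command(distance_ft: float, closing_rate_ft_s: float,
                  path_behind_clear: bool, cruise_speed_ft_s: float) -> float:
    """Target speed that retreats from an obstacle that keeps approaching."""
    if distance_ft <= RETREAT_TRIGGER_FT and closing_rate_ft_s > 0.0:
        # The obstacle is still closing: get out of the way if the path
        # behind is clear, rather than stopping directly in its path.
        return RETREAT_SPEED_FT_S if path_behind_clear else 0.0
    if distance_ft <= STOP_AT_FT:
        return 0.0  # obstacle appears static; hold position as before
    return cruise_speed_ft_s
```

The difference from the fixed-threshold rule is a single extra input, the closing rate, which is exactly the information the shuttle's sensors were already tracking, according to the report.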

It should be noted that the NTSB does not assign fault or blame for an accident or incident; rather, as specified by NTSB regulation, "accident/incident investigations are fact-finding proceedings with no formal issues and no adverse parties . . . and are not conducted for the purpose of determining the rights or liabilities of any person." So be it, but when you're considered a probable cause, that's pretty darned close to being blamed, I say.

The NTSB often issues recommendations with its crash reports, but it didn't with this one. I can't say why, and the agency doesn't say either. Maybe it's all too new, unfamiliar territory and all that. Maybe it doesn't want to impede the development of autonomous technologies. But if something as basic as a low-speed people-mover can fail to recognize the threat posed by a reversing tractor-trailer, I can only imagine what a complex autonomous tractor-trailer traveling at highway speed might do when confronted by a situation its programmers hadn't considered.

Any driver returning from a run today who could claim he or she did everything right 99.9% of the time would be doing well by anyone's standards. But these autonomous vehicles will have to be 99.999% right, 100% of the time. Apparently, we're not there yet, and that's why I'm really uncomfortable with turning these things loose in the real world. I don't think humans are ready for them yet. We obviously have some reprogramming to do first.

About the author
Jim Park

Equipment Editor

A truck driver and owner-operator for 20 years before becoming a trucking journalist, Jim Park maintains his commercial driver’s license and brings a real-world perspective to Test Drives, as well as to features about equipment spec’ing and trends, maintenance and drivers. His On the Spot videos bring a new dimension to his trucking reporting. And he's the primary host of the HDT Talks Trucking videocast/podcast.
