TopNews

NTSB to Investigate Latest Autonomous Vehicle Crash

November 10, 2017

NTSB will investigate the latest autonomous vehicle crash, which occurred on the first day on the job for this AAA/Keolis electric van. Photo: Twitter

Despite human error appearing to be the cause of the crash of an autonomous shuttle van in Las Vegas earlier this week, the National Transportation Safety Board is sending investigators to look at the accident more closely, according to a report by Reuters News Agency.

According to news reports, an autonomous, electric-powered van called Navya Arma, operated by Keolis North America, went into service in Las Vegas on Nov. 8. A few hours into its maiden voyage, a delivery truck backed into the stopped shuttle, according to a reporter aboard the shuttle and one of the project's sponsor companies. The American Automobile Association, which is co-sponsoring the autonomous van's deployment with Keolis, said it would assist NTSB in the investigation.

Past accidents involving autonomous vehicles have been blamed on the inflexibility of the programming in a dynamic driving environment and its inability to adapt to minor situational variations as a human driver would. That seems to be the case in this instance as well.

According to Reuters, reporter Jeff Zurschmeide, who was on the shuttle at the time of the crash, said the self-driving vehicle did what it was programmed to do, but not everything a human driver might have done.

“That’s a critical point,” Zurschmeide wrote on digitaltrends.com. “We had about 20 feet of empty street behind us (I looked), and most human drivers would have thrown the car into reverse and used some of that space to get away from the truck. Or at least leaned on the horn and made our presence harder to miss.”

Indeed, it seems a fundamental flaw in autonomous vehicle programming is the technology’s inability to deal with unpredictable human drivers. According to Reuters, the Las Vegas crash follows a rising number of incidents involving human drivers behaving improperly or recklessly around self-driving cars.

There have been 12 such crashes in California alone since Sept. 8 involving General Motors Co’s self-driving unit, Cruise Automation. All were the fault of human drivers in other vehicles, GM told regulators.

Comments

  1. TEF [ November 12, 2017 @ 07:43AM ]

    The autonomous bus needs to learn when to honk when a collision is imminent. When you are slowly backing blind out of a parking spot, you need other drivers to either stop or make their presence known.

 

