On a bright, sunny day in early May, the beta-test "autopilot" feature on a Tesla Model S sedan failed to "see" a white semitrailer, failed to brake, and ran under the trailer as the truck turned in front of the car at an intersection. The accident killed the driver, making it the first fatal crash of an autonomous car in the United States.
The crash underscored that autonomous technologies are no longer the stuff of science fiction, and it intensified a debate about the safety of "self-driving" vehicles, a debate that could also affect the future of autonomous technologies in commercial trucks.
Daimler Trucks has for the last several years been steadily advancing its research on autonomous commercial vehicles, and in fact last month unveiled an autonomous city bus. Peterbilt has pursued its own direction on what it calls “driver assist” technologies, and both Peterbilt and Volvo are involved in tests of truck platooning, which many in the industry view as the first practical use of some of these autonomous technologies.
And now Tesla founder Elon Musk wants to get into the commercial vehicle arena as well. Last month he revealed a "master plan" that includes trucks, buses and ride sharing. In a blog post on the automaker's website, he said heavy-duty electric trucks are in the early stages of development and should be ready for unveiling next year.
But we’re not just talking about an electric truck — we’re talking about a self-driving one as well.
“As the technology matures, all Tesla vehicles will have the hardware necessary to be fully self-driving with fail-operational capability,” Musk wrote.
Not ready for prime time?
One question about automated vehicle technology, whether in a car or a truck, is what level of involvement is required of the driver, and how alert he or she will be to the need to take over when the autonomous technologies disengage.
In a blog post, Tesla explained that the Autopilot feature in its Model S cars is "still in a public beta phase." Despite videos posted by users suggesting otherwise, drivers are not supposed to operate the cars entirely hands-free. In the case of the fatal Tesla accident, the truck driver reported seeing the car's driver watching what he believed to be a movie on the in-cab screen.
“When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot ‘is an assist feature that requires you to keep your hands on the steering wheel at all times,’ and that ‘you need to maintain control and responsibility for your vehicle’ while using it,” Tesla said.
Russ Rader, a spokesman for the Insurance Institute for Highway Safety, called the crash "a warning that drivers need to be vigilant, even if a self-driving feature is engaged. It also serves as a notice that these autonomous vehicles are not just around the corner as something we can buy."
Tesla’s Musk cautioned in unveiling the second part of his master plan that autonomy is still in its early stages.
“It is important to emphasize that refinement and validation of the software will take much longer than putting in place the cameras, radar, sonar and computing hardware … Even once the software is highly refined and far better than the average human driver, there will still be a significant time gap, varying widely by jurisdiction, before true self-driving is approved by regulators.”
Musk said he expects it will take some 6 billion miles of experience with the technology before worldwide regulatory approval is granted. That experience is currently accumulating at just over 3 million miles a day, he noted; at that rate, reaching 6 billion miles would take roughly five and a half years.
So why is Tesla deploying partial autonomy now?
"The most important reason is that, when used correctly, it is already significantly safer than a person driving by themselves, and it would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability," Musk wrote.
“According to the recently released 2015 NHTSA report, automotive fatalities increased by 8% to one death every 89 million miles. Autopilot miles will soon exceed twice that number and the system gets better every day. It would no more make sense to disable Tesla’s Autopilot, as some have called for, than it would to disable autopilot in aircraft, after which our system is named.”
Regulatory movement
At the fifth annual Automated Vehicles Symposium in San Francisco last month, government officials promised that new regulatory guidelines involving automated vehicles are imminent.
Transportation Secretary Anthony Foxx told the audience that the Department of Transportation will issue guidelines this summer on “highly automated vehicles.”
“We want people who start a trip to finish it,” Foxx said, according to published reports.
While the DOT has been working with companies that are developing automated vehicles to adapt existing safety rules to these new technologies, Foxx said those existing rules are not enough.
“We need clear lines of responsibility between industry, government and consumers,” he said.
Foxx acknowledged that “autonomous vehicles are coming,” whether the world is “ready or not,” reported Fortune.com, which also quoted him as saying, “We don’t want to replace crashes with human factors with large numbers of crashes caused by systems.”
While there are many reasons why the industry is moving toward autonomous vehicles, Foxx said, “if safety isn’t at the very top of the list, we’re not going to get very far.”
At that same conference, National Highway Traffic Safety Administration chief Mark Rosekind said in prepared remarks, "Of course we have to do everything we can to make sure new technology does not introduce new safety risks, but we also can't stand idly by while we wait for the perfect."
In fact, Rosekind pointed out, the DOT "has been exceptionally forward-leaning on automated vehicles," noting that on several occasions he has commented on the irony of hearing that the federal government is moving too fast.
Why? Echoing Musk's comments, he cited the number of highway fatalities. More than 35,000 people lost their lives on American roads last year, he said, the equivalent of a 747 crashing every week, and 94% of those crashes are attributed to human error. Whether that's driving drunk, speeding, or trying to catch Pokémon behind the wheel, he said, "if there was a way to account for all those human choices or behaviors, we would be talking about a world where we could potentially prevent or mitigate 19 of every 20 crashes on the road.
“That is the promise of automated vehicles, and that is, at its core, why NHTSA and the Department of Transportation have been so focused on doing what we can to accelerate the lifesaving promise of highly automated vehicles and connected vehicles.”
Safety advocacy groups, however, wrote a letter to Rosekind saying NHTSA should stop pushing the adoption of “robot car technology” until more testing is done and enforceable safety standards are written. Public Citizen, the Center for Auto Safety, and Consumer Watchdog said they were “dumbfounded” that the Tesla crash did not give the agency pause. They charged Rosekind and his colleagues with becoming “giddy advocates of self-driving cars, instead of sober safety regulators tasked with ensuring that new systems don’t kill people. Instead of seeking a recall of Tesla’s flawed technology, you inexcusably are rushing full speed ahead.”