A Tesla Semi truck was spotted taking a test drive on public roads in California earlier this week. Photo: Jack Roberts

There are no certainties in this life, although the universe does seem to have a kind of Yin and Yang vibe to it.

That was certainly the case with Tesla this past week. It began with news that a Tesla Semi all-electric truck was spotted running (quietly) on public roads, and ended with news that the National Transportation Safety Board is sending two investigators to look into a collision Monday in Culver City, California, between a Tesla Model S and a fire engine. Early reports indicate the Tesla was in autonomous driving mode when the accident occurred. And according to the Culver City Fire Department, “amazingly,” there were no injuries.

The Tesla Semi spotted in the wild last week is almost certainly the same matte black model I inspected up close at the truck’s official launch late last fall. That truck was fitted with rear-view cameras instead of mirrors, but even mighty Tesla has to bow to federal highway regulations. So this example was indeed wearing mirrors while out getting some crucial real-world evaluation miles under its belt.

The takeaway here is that Tesla is proceeding on pace with its plans to have the Semi ready to enter serial production early next year. And if you’re a CDL-holding truck journalist (ahem!), it's a promising sign that you may get to take one for a spin, too, before this young year wraps up.

And while the Tesla accident and the just-announced NTSB investigation into its causes are not exactly a step back for autonomous vehicle technology, they are a timely reminder that, for all the promises made by autonomous vehicle system developers, the technology is not quite ready for prime time yet.

Autonomous vehicle speed bumps

When the NTSB investigated the now-famous 2016 crash of a Tesla Model S in Florida, the agency ultimately placed some of the blame on the manufacturer, noting that “operational limitations” in the car’s autonomous control system played a “major role” in the accident. These limitations, according to the NTSB, included Tesla’s failure to ensure that drivers remained focused on the road and other traffic while the car was in autonomous driving mode, and its failure to restrict the system’s use to the types of roads where it performs best.

For its part, Tesla still insists that Autopilot, as it calls its autonomous vehicle controls, “significantly” enhances safety, citing an earlier government study that found crash rates dropped substantially while the system was in use.

Tesla also stressed again that Autopilot is “not a fully self-driving technology,” and added that “drivers need to remain attentive at all times” – even when the system is engaged.

All of which is a timely reminder that, as amazing as computers are today, they are still severely limited in their ability to react to unforeseen circumstances, and to learn from both the successes and failures that result from attempting to deal with them.

A lot of time, money and brainpower is currently going into efforts to create “deep learning” computers. And it’s likely that we’ll eventually get to the point where a vehicle system can do a much better job of processing a highly dynamic, constantly changing operating environment such as a public roadway.

While I’m no computing expert, it does seem at the moment that vehicle speed, and an autonomous system’s ability to keep up with incoming data at that speed, are the primary hurdles to widespread autonomous adoption. I say that because in both Tesla crashes, the cars were traveling at 65 mph. So it’s possible the systems cannot yet process data fast enough at those speeds.

There have also been several widely reported low-speed autonomous driving accidents. And in many of those cases, it appears the system was doing just fine on its own. The problem was interacting with, and adjusting to, the unpredictable actions of human drivers, who, as we all know, tend to take a somewhat "flexible" approach to circumstances such as yellow lights, stop signs, pedestrian crossings and the like.

Very few technologies ever spring right out of the box fully formed and ready to go. So the odds are that these accidents are simply part of the learning curve for all of us. I still believe we will see increased use of semi-autonomous cars and trucks over the next five years or so as the technology improves. And I still believe that many truck drivers will actually like these systems a great deal once they get familiar with them. Perhaps the lesson here and now is that we’re expecting too much from our computers too soon and need to remember to stay focused on the road, no matter how glitzy all this stunning new automotive technology is.

About the author
Jack Roberts

Executive Editor

Jack Roberts is known for reporting on advanced technology, such as intelligent drivetrains and autonomous vehicles. A commercial driver’s license holder, he also does test drives of new equipment and covers topics such as maintenance, fuel economy, vocational and medium-duty trucks and tires.
