While the results from crash investigations are still pending, two recent crashes of the Boeing 737 Max may serve as examples of what happens when humans and safety technologies are not on the same page. 

Photo via Aka The Beav on Flickr

My 16-year-old totaled my car last month. No injuries, thank goodness, but that and a few recent headlines got me thinking about safety technology.

I had to wonder, would this crash have been avoided or at least mitigated by forward collision warning and/or automatic emergency braking? It was the result of a moment of inattention, spending just a little too long looking for an unfamiliar HVAC control. Perhaps instead of adhering to the conventional wisdom of teens driving an older car so the damage is less costly when they wreck it, I should be buying my kid a new car with the latest safety technology.

But safety technology isn’t always everything it’s cracked up to be – because no matter how sophisticated it is, you still have to consider how it’s going to interact with the human behind the wheel.

That factor was evident in some recent headlines. One was a crash in Delray Beach, Florida, where a Tesla Model 3 underrode the side of a tractor-trailer that was making a left turn onto a divided highway; the Tesla's roof was torn off as it passed under the trailer, killing the driver. So far, reports have not indicated whether the Autopilot or automatic emergency braking systems were active at the time, but the crash was eerily like one that happened in May 2016 near Gainesville, Florida, where a Tesla on Autopilot failed to "see" a white semitrailer turning in front of it and ran under the trailer, killing the driver.

Tesla’s Autopilot is not fully autonomous technology, as the name might make some people think. It’s an advanced driver assistance system, with instructions for drivers to keep their hands on the wheel, monitor their surroundings, and be ready to take over at a moment’s notice.

If you were going to have this type of technology in a fleet vehicle, you’d want to be darn sure you had training and policies in place to make sure drivers were using it correctly.

And speaking of training and safety technology, based on one newspaper investigation, it looks like that may have been at issue with the recent highly publicized fatal crashes of the Boeing 737 Max. The U.S. and other countries grounded the planes following two fatal crashes in just six months.

The Dallas Morning News found that pilots of the Boeing 737 Max 8 voiced safety concerns to federal authorities, with one captain calling the flight manual “inadequate and almost criminally insufficient,” months before the first crash last October.

The Maneuvering Characteristics Augmentation System was included on the Max 8 as a safety mechanism that would automatically correct for a plane entering a stall pattern, The Dallas Morning News explained. If the plane loses lift under its wings during takeoff and the nose begins to point far upward, the system kicks in and automatically pushes the nose down.

We don’t know yet what caused those crashes. But you can imagine that if a pilot wasn’t expecting that behavior, and didn’t have the right training on how the system would respond in that situation, the pilot's reaction could make the problem worse instead of better.

“The pilots seemed kind of taken aback by the system taking over,” The Dallas Morning News reporter Cary Aspinwall said on the NPR program All Things Considered. “They weren’t sure what it was doing. And I think that shocked the pilots.”

So, whether it’s Tesla’s Autopilot or the Boeing 737 Max 8’s latest automated safety tech, in both cases it looks like the operators needed more training on how the system was supposed to work.

Are your drivers fully up to speed on all the latest safety technology you’ve put in their trucks?
