Fleets increasingly rely on sensors and telematics to track driver behaviors, from lane-departure warning and collision-avoidance systems to monitoring of hard braking, vehicle speed, and other data. There has been growing interest in augmenting these tools with video-camera systems. Cameras focused on the roadway, and sometimes on the sides of the vehicle or on the driver, can provide more context to the alerts other systems provide.
After all, hard braking could signal problems with a driver who is following too closely or driving distracted. But focusing on hard-braking events without the context of what was happening on the roadway at the time can lead to false positives. A video system can show whether, for instance, a car suddenly switched lanes and caused the driver to hit the brakes. Or, an inward-facing camera can identify driver behaviors that may have led to the hard braking, such as looking down at a cell phone or reaching for a drink.
A number of providers now offer such video systems. From established transportation management software and telematics providers to stand-alone camera systems, fleets have an array of options to choose from.
Most of these options are “managed” services that provide not only the hardware, but also sophisticated software that analyzes the video data, helping fleets identify driving skills and behaviors that contribute to on-road accidents.
Trends in Video-Based Safety
Among the new approaches in video-based safety systems is combining products “that have traditionally been on separate platforms into one solution, on one platform, using one cellular plan,” says Richie Howard, president and CEO of AngelTrax. AngelTrax’s stand-alone system combines elements of surveillance, event recorders, and safety systems, allowing managers to investigate events and provide the necessary coaching for drivers. Unlike some systems that only record when the truck is moving, Howard says, AngelTrax provides video of a driver’s entire trip, including not only on-the-road video, but also loading and unloading, pre-trip inspection, service activity, and backing views.
Stephen White, senior business development manager, Class 8 market, for Geotab, says by integrating various safety solutions into a comprehensive telematics platform, “fleet managers are able to gain greater visibility into both their drivers’ driving behavior and the external or situational conditions of particular road events.”
Systems such as the one from Samsara can be considered a “stand-alone,” says Ingo Wiegand, director of product management at Samsara. The system “can both capture the data and make it available” via a dashboard. But, like many other products, the system also can be integrated with other applications or a fleet’s transportation management system.
While many video systems can be integrated into a fleet’s TMS, most of the data generated currently goes to the safety department via a safety dashboard, and/or back to drivers via text messages or app-based scorecards. However, video data could be a “bridge” to close the operational divide between dispatch, safety, maintenance and human resources, says Chris Orban, vice president of data science at Trimble.
Some providers are using advanced artificial intelligence and machine learning technology “in the cab,” says Ryan Wilkinson, chief technology officer, IntelliShift. In-the-cab computing, also called edge computing, processes data on the device rather than sending it to a cloud-based server, enabling real-time alerts to aid in driver coaching.
Verizon Connect, for instance, uses AI to slot events into varying severity levels (e.g., collision versus near-miss) to help managers determine which incidents need more immediate attention. Users receive notifications to view a video within minutes of when a collision or other incident occurs.
Trimble’s Orban foresees more use of video analytics in automated driver coaching. Many fleets already use these videos for coaching after the fact. Automating the process, he explains, would entail analyzing videos, applying machine learning to them, then creating a coaching video that shows the driver the good and bad of the most recent trip or trips, to be viewed when not driving.
In-Cab Camera Benefits
One of the leading benefits of video-based systems is that they help integrate “safety solutions into a comprehensive telematics platform,” says Geotab’s White. This gives fleet managers greater visibility into both their drivers’ behavior and the external or situational conditions of particular road events.
These systems can measure key safety metrics such as speeding, harsh braking, seat belt usage and other data points that help fleets “develop and implement custom safety reports, advanced collision avoidance systems and driver coaching sessions,” he adds. This leads to a variety of paybacks, including improved driver performance, reduced insurance and maintenance costs and minimized risk.
Jonathan Bates, head of global marketing for Mix Telematics, says current systems benefit each stakeholder in fleet safety – drivers, fleet managers and safety directors. Drivers can monitor their performance via mobile apps. Safety managers gain insight into which drivers need more coaching and where that coaching should be focused.
He says the back-office analytics are as important as the videos themselves, if not more so. The analytic component helps customers make sense of all the data collected. “You can’t just say we had 1,700 events for this driver; you need to understand what that means.”
Samsara’s Wiegand points to three reasons fleets adopt such technologies: protecting employees through better training, saving money by reducing claims for “at-fault accidents,” and improving a company’s overall safety program.
For Store & Haul, a bulk food carrier based in Van Wert, Ohio, the decision to deploy a video system from Samsara came down to building a robust culture of safety, explains company CEO David Rager.
“We needed more tools that could provide solid analytics and visual evidence of driving experiences to learn and grow from,” Rager says. “The cameras allow us to do just that. We use videos of not-so-good driving behaviors to coach our drivers – and in numerous cases we have evidence of heroic maneuvers from our drivers deploying defensive driving techniques to save lives.”
The key benefit Store & Haul has seen so far has been in accident and equipment damage causality, Rager says. In one case, after a car had side-swiped one of the company’s trucks, the auto driver retained an attorney and claimed the truck was at fault. “But as soon as we sent over the footage of what actually happened, the suit was dropped.”
Fleet Adoption of In-Cab Cameras
Video-based safety systems are becoming more accepted in the marketplace, but the level of acceptance is uneven. Adoption rates for video-based systems come down to “how you break up the market,” says Jason Palmer, previously chief operating officer, SmartDrive, and currently responsible for integration at Omnitracs (which bought SmartDrive). “In Class 8, we are seeing a much stronger adoption rate,” with fewer fleets asking for pilot programs now than in years past. Instead, fleets are showing more interest in full rollouts.
For medium-duty fleets, the adoption rate is still relatively low, but Palmer and others say that market is one of the bigger growth areas.
In that Class 8 market, Trimble’s Orban says interest in forward-facing systems is high and that interest in driver-facing systems is growing. “For a long time, the adoption rate was pretty close to zero” for driver-facing cameras. “Now, there is a bit more interest.”
That’s a big change, he says, explaining that customers are showing interest in driver-facing cameras as a means not only to improve driver behavior, but also to monitor driver fatigue.
As an example of the growing interest, several companies have recently added driver-facing dashcams to their offerings. Verizon Connect launched a driver-facing dashcam as an extension of its integrated video platform, which fleets can access from its Reveal platform. Eroad’s new Clarity Dashcam can be used in front-facing or dual front/driver-facing modes and captures HD video, sending 20-second video clips back to Eroad in real time when triggered by safety events or by the driver.
While acceptance for outward-facing systems is growing, however, there is still significant push-back from drivers on driver-facing cameras. Geotab’s White thinks that “mostly stems from the misconception that implementing connected technology is a form of disciplinary action and an invasion of driver privacy.”
Clear, well-documented company policies and “complete transparency” from fleet managers about why the systems are being deployed and how they are to be used can help here.
Store & Haul found that “almost no drivers” had problems with outward-facing cameras, but the inward-facing cameras were a little bit more of a challenge. Its approach, which other fleets have also reported using with success, was to meet one-on-one with each driver before the cameras were installed.
“We walked them through exactly how the cameras worked, why we were implementing them, and gave them as much time as needed to ask questions and get answers,” Rager says.
Veteran drivers were given the option of asking for the inward-facing camera to be covered after a 30-day trial. Eighteen of 31 opted for the cover. For new drivers, however, both outward- and inward-facing cameras are mandatory at Store & Haul.
Another trend, Orban says, is more interest from fleets in cameras with more capabilities such as low-light and high-definition recording.
Safety Event Triggers
Most video systems record constantly while a truck is powered on, but the video is only saved and uploaded to the provider or fleet when triggered by a crash or an event such as hard braking, lane departure, or following too closely. Many systems also allow a driver to push a button to manually capture the most recent video.
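This capture-on-trigger behavior can be sketched as a rolling buffer that is flushed to storage only when an event fires. The sketch below is an illustrative simplification, not any vendor’s actual implementation; real systems buffer encoded video streams, and the class and parameter names here are invented:

```python
from collections import deque

class EventTriggeredRecorder:
    """Keeps a rolling buffer of recent frames; saves a clip only on a trigger.

    Hypothetical sketch: pre_event_frames of context before the event are
    retained, plus post_event_frames after it, then the clip is saved.
    """

    def __init__(self, pre_event_frames=300, post_event_frames=300):
        self.buffer = deque(maxlen=pre_event_frames)  # last N frames before any event
        self.post_event_frames = post_event_frames
        self._clip = None           # clip currently being captured, if any
        self._remaining_post = 0
        self.saved_clips = []       # stand-in for "uploaded to the fleet dashboard"

    def on_frame(self, frame):
        if self._clip is not None:
            # An event is active: keep appending until the post-event window closes.
            self._clip.append(frame)
            self._remaining_post -= 1
            if self._remaining_post <= 0:
                self.saved_clips.append(self._clip)
                self._clip = None
        else:
            self.buffer.append(frame)  # normal driving: frames age out silently

    def on_trigger(self, reason):
        # Hard braking, lane departure, or a driver-pressed button lands here.
        if self._clip is None:
            self._clip = list(self.buffer)  # pre-event context
            self._remaining_post = self.post_event_frames
```

The design choice worth noting is the bounded `deque`: video from uneventful driving is continuously discarded, so only the seconds surrounding a trigger ever leave the vehicle.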
More sophisticated systems use software and artificial intelligence to help determine when these events matter. For instance, if you trigger on hard braking without the capability to discern when that correlates to unsafe driving, “you end up with false positives on 90% of those,” says Omnitracs’ Palmer.
Vendors offering more sophisticated systems work at improving their trigger mechanisms. Omnitracs, for instance, is combining the trigger event with information coming from the engine, computer vision information, and other data to determine if what triggered the system is actually unsafe driver behavior.
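As a simplified illustration of that kind of signal fusion, a hard-braking trigger might be cross-checked against other data before it is flagged as unsafe. The signals, field names, and thresholds below are invented for the sketch and do not reflect any provider’s actual logic:

```python
def classify_hard_brake(event):
    """Decide whether a hard-braking trigger reflects unsafe driving.

    `event` is a hypothetical dict of fused signals: deceleration from the
    accelerometer, following distance and cut-in detection from computer
    vision, and speed from the engine bus. Thresholds are illustrative.
    """
    if event["lead_vehicle_cut_in"]:
        # A car swerved into the lane: the braking was defensive, not unsafe.
        return "false_positive"
    if event["following_distance_s"] < 2.0 and event["speed_mph"] > 45:
        # Braking hard while following closely at highway speed: coach the driver.
        return "unsafe_following"
    if event["deceleration_g"] > 0.5:
        # Severe but ambiguous -- route the video to a human reviewer.
        return "review"
    return "false_positive"
```

The point of the sketch is the ordering: context that exonerates the driver is checked first, so the raw trigger alone never generates a coaching event.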
While hard braking is a common example of an event that triggers the system to save video, lane departure is by far the most common trigger, Orban says – to the extent that many customers choose not to activate the cameras for lane departures. That may be a good choice, he says, as providers continue working on ways to distinguish a lane departure caused by driver distraction from one that road conditions require, such as road work or a stopped vehicle. Video often shows the driver leaving the lane deliberately and safely in those situations.
A number of trigger events can be traced to driver distraction, says Mix Telematics’ Bates, something he considers more of a problem than driver fatigue. The main problem for providers, however, is being able to understand what “distraction” really is. There is more work to be done in this area, he says.
In addition to cell phone use, “there are a lot of distractions,” Bates notes, including reaching down for something when the driver may take his eyes off the road for a second or two. But it gets complicated. “Reaching down is a series of events, and it’s almost like you have to build a more nuanced view of distraction.”
That might include compiling relational data that can break out individual parts of a specific distraction. For instance, smoking a cigarette is considered a possible distraction, but some parts of it may be more distracting than others. Video data may help in scoring the relationship between reaching for a cigarette, lighting it, and flicking the ash into an ashtray, better quantifying each of these actions in terms of its distraction potential.
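One simple way to picture such scoring is to assign each sub-action of a distraction episode its own weight and sum them. The actions and weights below are entirely hypothetical, purely to illustrate the idea of quantifying the components separately rather than counting one undifferentiated "event":

```python
# Hypothetical weights for sub-actions of a single distraction episode.
# Higher values stand in for longer or riskier eyes-off-road involvement.
SUB_ACTION_WEIGHTS = {
    "reach_for_item": 3.0,   # one hand and a brief glance off the road
    "light_cigarette": 5.0,  # both hands and eyes involved
    "flick_ash": 1.5,        # short glance away
    "glance_at_phone": 6.0,
}

def score_episode(sub_actions):
    """Sum weighted sub-actions into one distraction score for the episode.

    Unrecognized actions contribute nothing, so the score degrades
    gracefully as new behaviors are added to the vocabulary.
    """
    return sum(SUB_ACTION_WEIGHTS.get(action, 0.0) for action in sub_actions)
```

A real system would learn such weights from outcome data rather than hand-assign them, but the decomposition itself is the nuance Bates describes.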
While AI offers promise in this area, that is still no substitute for the human brain. Orban recalls an experienced fleet safety manager who, after looking at three anonymous videos, was able to identify a driver distraction, a new driver, and a false positive.
That is the challenge video providers face, he says – creating software capable of applying that sort of experience and knowledge to every truck and driver in the fleet.
In the final mix, cameras in the truck are but one tool fleets can use to improve safety and minimize risk. The key, as always, is how fleets use the data these tools provide in coaching drivers to be safer and more productive.