Fleets as Test Labs
Real world testing by savvy fleets makes for better products.
December 2007, TruckingInfo.com - Cover Story
"To us, the only real result is running it in the fleet in enough quantity to get enough data over a period of time to see if it really works," says Bruce Stockton, vice president of maintenance and assets at Contract Freighters Inc.
He recalls a tire test the company did several years ago. A tire vendor came to them with projections on a new tire that, if true, would make it a no-brainer to switch tires. "So we ran a few, and the early data told us the same thing — it held up really well," Stockton says.
But after further testing, they discovered that although the tire wore very evenly and was extremely durable, a higher rolling resistance caused a trade-off in fuel economy that negated the tire's other advantages.
There is perhaps no better illustration of the importance of real-world fleet testing of products under development than the truck engines designed to meet 2004 emissions regulations.
Because of a lawsuit alleging that engine makers had put "defeat devices" on earlier engines, the deadline for implementing the new models was moved up 15 months, to October 2002. As a result, those engines did not have as much fleet testing behind them as the engine makers and the trucking industry would have liked. Once fleets finally got their hands on them, there were complaints about excessive under-hood heat on some models, fuel mileage that was worse than expected, turbos and EGR valves breaking down, and other problems.
Many of these issues were eventually addressed, but with enough fleet testing, most of them likely would have been fixed before the engines went into full production.
At Detroit Diesel, for instance, the number of miles they were able to accumulate in real-world fleet testing for the 2002/2004 engines was a fifth of what they did for the 2007 engines, according to Chuck Blake, senior technical sales support manager.
Engines are only one of the many products that are tested by fleets, either before they go into production, or once they are on the market. Tires, anti-idling options, bypass filters, electrical components, fuel mileage enhancers, safety technology, wheel seals, mobile communication systems, telematics, fleet management software — just about any component or accessory that goes on a truck or is used in managing a fleet is tested in the real world.
In the IT world, it's known as "beta testing" — forward-looking customers test-drive a product and help the supplier get the bugs out before it is released to the general public.
For instance, J.J. Keller introduced FleetMentor, a web-based management tool targeted at small fleets, in October. Fleet managers have been invited to go to the site, www.fleetmentor.com, to sign up for a beta test of the new site beginning in January. According to Jacqui Jurmu, design manager for the site, the beta test will run for four to six months. After the test, fleet users will get a 30-day free trial before the subscription service begins. The beta test will allow site designers and editors to iron out any wrinkles that crop up as real fleets use the service in day-to-day operations.
While those not in the IT world may not call it beta testing, the concept is the same — get the product tested by real users in the real world.
"There's no way you can imagine everything a driver can put something through and then duplicate that in the lab," says Walter Madsen, beta testing manager at Xata Corp., which offers fleet optimization tools featuring both in-cab hardware and web-based software.
The fleet experience often catches situations that supplier testing did not, says Detroit's Blake. Although an engine is tested on stands in test cells, on test tracks and on test trucks operated by the truck and engine makers in a variety of conditions, "real world miles always add a new dimension."
For instance, he explains, for the 2007 engines, they spent months testing the engines in the desert, at high altitude, and in the extreme cold of a Canadian winter. But in fleet testing, a group of trucks got caught in a once-in-a-century snowstorm in Denver and idled for three days, plugging diesel particulate filters. "Nobody ever dreamed they'd be idling for three days," he says. "It was almost comical." But it exposed an area that could be improved before the production date.
Florida Power & Light has been testing hybrid electric utility trucks for several years. In the initial testing, FP&L compared three hybrid trucks with three baseline trucks with the same specs except for the hybrid components, all operating in the same area using the same telematics for a better comparison. In addition to discovering how much fuel the units could save in different types of duty cycles, the pre-production testing gave the utility the opportunity to troubleshoot the early units and sort out issues with operator error and equipment reliability. The early results were favorable, so they put more units in the fleet to test. Last month, International announced it was entering production with a hybrid the utility had tested.
"It's important (for manufacturers) to get that pre-production testing under their belts and get it tweaked to the hilt," Blake says, "so when it comes out the door, it's good."
And, of course, getting test products into major fleets is a marketing strategy, as well — "try it, you'll like it." If a fleet likes what they see during the testing process, they're more likely to buy the technology when it comes on the market. "If we put our system on a truck, we will have a sale," says Jerry Cook of Ecotech, which makes a device it says improves fuel economy.
The Inside Track
Being a tester of pre-production products offers several advantages to fleets. The biggest one is that if the new technology works well, you're among the first to reap the rewards.
"You've got a higher confidence level in making a change to your truck or trailer specification," CFI's Stockton says.
Pre-production testing also gives fleets a better idea of any challenges they may face in adopting the new product or technology, whether it's additional maintenance procedures, lower fuel economy or driver acceptance.
"Some products, it's hard to get drivers to accept," says Dean Newell, vice president of safety and training, who tests a lot of new safety technology in his job at Maverick Transportation, "but if they don't, then I know what kind of sales job I would have to do if I want to implement it in the fleet."
Carl Tapp, vice president of maintenance at P.A.M. Transport, tested 2007 engines on some trucks on a route between Irving, Texas, and Brownsville, Texas. They put a lot of miles on the trucks, running them seven days a week with team operations, in a high-temperature environment.
"We've ordered 2007-engine trucks, and we didn't have the unknowns that a user would have today if they were going out to buy them (without testing)," he says.
Fleets also like to know that their input is making a product better.
For instance, Xata added PTO functionality to its latest version. While the feature worked exactly as expected in the test fleets, the beta testers' feedback allowed Xata to improve the reporting feature so fleets could monitor active PTO time versus idle time.
That sneak peek at what's coming down the pike doesn't come for free, however. While most suppliers provide the tested product for free and may help in providing maintenance, there are other issues.
"It's not all a no-brainer on the aspect of the fleet owner," says Puradyn President and COO Kevin Kroger, who also was involved in fleet testing for years at two major engine manufacturers. "There are some requirements on his side and some management constraints on his side as far as that piece of equipment is concerned."
For one thing, pre-production units are not quite ready for prime time, so they may break down more frequently.
"In the worst case, the component may cause progressive damage to other components or systems on the truck," says Dennis Damman, director of engineering for Schneider. "So we try to evaluate the risk before agreeing to any evaluation or testing."
Or the supplier may want you to pull the truck off the road so they can evaluate how the product is doing, or make a change to it.
"Sometimes it's a pain in the butt," admits P.A.M.'s Tapp. "Sometimes you have to pull a truck out of service for a period of time. They want to overhaul it at a certain time and they can't get the right parts. It's a logistical nightmare for us sometimes to get the right truck to the right place at the right time."
Even if everything works perfectly, it takes time and manpower to accurately run and monitor testing. You need to be able to keep track of which trucks have test components on them, and maintenance personnel have to know what to do if those components fail. You may need to track consumables such as fuel and oil used on the test trucks.
"If you lose a product in test, it makes the test worthless," Tapp says. Currently, he's testing a prototype 300-amp-output alternator for one of his suppliers. "If it fails on the road, our road rescue department would be aware it was on the truck, and they have instructions on what to do if it fails."
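The kind of record-keeping described above can be sketched as a simple data structure. This is a hypothetical illustration only; the field names and example values are invented, modeled loosely on Tapp's alternator test, and do not describe any fleet's actual system.

```python
# Hypothetical sketch of a test-component record, so dispatch and road
# rescue can see at a glance which trucks carry test parts and what to
# do on a failure. All field names and values are invented.
from dataclasses import dataclass, field


@dataclass
class TestComponent:
    truck_id: str
    component: str
    installed: str            # install date, YYYY-MM-DD
    on_failure: str           # instructions for road rescue
    consumables: dict = field(default_factory=dict)  # e.g. fuel/oil used


# Example entry, loosely patterned on a prototype-alternator test.
record = TestComponent(
    truck_id="1427",
    component="300-amp prototype alternator",
    installed="2007-09-15",
    on_failure="Tag the failed unit and return it to the shop; do not discard.",
)

# Road rescue looks up the truck before working on it.
print(record.on_failure)
```

The point of the sketch is simply that the test flag travels with the truck record, so a failed prototype is preserved for the supplier instead of being lost on the road.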
Fuel economy testing is probably the biggest headache of all. It is very difficult to account for all the different variables that can affect mileage — equipment specs, speed, load, route, weather, driver, tire wear, cross-winds, etc.
"It is very difficult and expensive for fleets to perform controlled testing," says Bob Weber, chief engineer for International.
One of the problems, he says, is relying on data from the engine's electronic control module for fuel mileage. "What the engine's trying to do is estimate the amount of fuel consumed, so there are always accuracy errors."
There are some very specific test protocols that are used in fuel economy testing. SAE Type III testing, for instance, involves short runs, about 40 miles, using portable fuel tanks. A Type IV test, Weber says, is a more real-world test, but still involves a set, fairly controlled route of about 500 miles. It takes longer to run and to understand the results than the Type III test. Several large fleets have set up these types of test runs, he says. U.S. Xpress, for instance, has a route in the Chattanooga area they use for testing several times a year.
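Weber's point about ECM accuracy can be illustrated with a quick calculation. The numbers below are invented for illustration and are not from the article; they simply show how a small error in estimated fuel burn distorts the apparent fuel economy.

```python
# Hypothetical illustration of why ECM fuel estimates can mislead:
# compare miles-per-gallon computed from measured fuel (a portable
# tank, as in an SAE Type III-style run) against an ECM estimate.
# All figures are invented for illustration.

def mpg(miles_driven: float, gallons_used: float) -> float:
    """Fuel economy in miles per gallon."""
    return miles_driven / gallons_used

route_miles = 40.0        # a short Type III-style run
measured_gallons = 6.20   # metered from a portable tank
ecm_gallons = 5.95        # the ECM's estimate of fuel consumed

measured_mpg = mpg(route_miles, measured_gallons)
estimated_mpg = mpg(route_miles, ecm_gallons)

# Understating fuel burn by ~4% inflates apparent mpg by ~4% --
# easily larger than the gain many add-on products promise.
error_pct = (estimated_mpg - measured_mpg) / measured_mpg * 100
print(f"measured: {measured_mpg:.2f} mpg, ECM: {estimated_mpg:.2f} mpg, "
      f"error: {error_pct:+.1f}%")
```

This is why the controlled protocols that follow weigh or meter the fuel directly rather than trusting the engine's own estimate.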
Some of these challenges prompt OEMs to do a sort of hybrid test — using real-world equipment provided by fleets, but handling the actual testing themselves. For instance, ABF Freight System allowed Detroit Diesel to conduct controlled fuel economy testing of the new 2007 engines using its equipment for a couple of days. "It gives them an opportunity to find out if their engineering's working and kind of do some real-world testing," says Rick Preston, director of maintenance. "It showed us how our specific application was affected. It made us think about maybe changing some things and taking a little different approach with the newer engines."
Depending on what's being tested, the actual test protocols could vary greatly. The duration of a test can vary from a day or two to several years. SAE Type III fuel testing may take just a day. A software update might take a few months; software with an entire new functionality might take a year. Testing tires for wear over the life of the tire could take two or three years.
Fleets may be asked to test products at various phases in the development process. For instance, at Qualcomm, new products are tested with fleets in three stages, says Chris Silver, senior manager, product marketing, Qualcomm Enterprise Services: "As the product is being developed (through prototypes) to understand how the drivers interact with the mobile device and how the applications/services can best be developed to help improve safety and usability. Technical testing (alpha) — from a technical perspective... does the product work? Does it work as expected? And pre-release (beta), to ensure market readiness."
Picking and Choosing
Fleets that become known for testing get approached frequently by all types of companies.
"We get calls, letters, e-mails, and personal visits from people wanting us to test something at least once a week," says CFI's Stockton. "The type of testing we've done is typically driven by an area that we've identified as a problem or a high-cost area for us."
As Schneider's Damman puts it, "There needs to be a business case for the product ... an opportunity for payback or driver enhancement or a future mandate requiring implementation of technology."
Obviously, the new low-emission engine platforms are a high testing priority. "There's a huge cost associated with those, and the technology is mandated, so we need to get as much information on those as we can, including complete fuel economy and reliability tests," Damman says. Other products that get their attention include emerging safety technology, products that improve driver conditions, and products that may offer a fuel economy or other cost advantage.
Gary Edelman, director and co-founder of Engineous USA, which makes an oil additive it claims improves performance and mileage, says when companies such as his want fleets to do testing for them, "you have to show them the financial reward that would come from using the product; it has to be pretty clear."
Not everyone wants to test products that are not close to being ready for the market. Some fleets don't like to test pre-production because "they feel it cheats them of the real story," Blake says.
Fleets that are testing products in development have to realize that these engines or components are not going to be exactly the same as those that will roll off the production line — in some ways they may be worse, in some ways better.
For instance, Blake says, for some of the 2007 engine testing, they did not have exactly the same cooling packages. In some cases it was bolted to the engine instead of to the frame rails. During pre-production, EGR coolers were mostly hand-made. When they went into production, the factory-made ones weren't as good as the hand-made ones used in pre-production testing.
Many fleets also do extensive testing of existing products to determine which products will work best in their fleet: Do collision-warning systems improve safe driving habits? Will single-wide tires save money? Which tires work best in my fleet?
"We have tested some things for companies, like the new 2007 EPA engines," says P.A.M.'s Tapp. "But a lot of the tests are just curiosity to see how we might maintain a competitive edge.
"For example, we put PSI (tire inflation systems) on several hundred trailers and ran them for an entire year, and ... we found out we had a definite advantage in tire costs. We tried automated shifting transmission products, like the Meritor Freedomline and the Eaton AutoShift and UltraShift, and we looked at what that did for saving brake wear and driveline damage and all those things, and now we're standard with automated shifting transmissions in our fleet."
What Makes a Good Test Fleet?
So, you think you'd like to try testing some products? First, you have to keep good records.
"You have to be able to stay on top of it," says Maverick's Newell. "You can't just put it on and forget about it. You have to be willing to spend the time with the project."
At P.A.M., Tapp says, they already use a detailed computer maintenance management system. "We've got a great computer system as far as mileage and fuel used and all those kind of things, so it's not a job for us to keep very accurate records. We put a product on test, and we record that in the system, and then it's easy to retrieve that and check the history."
Suppliers are going to want to know about breakdowns and malfunctions. A fleet that scrupulously documents breakdowns, for instance, is going to be valuable as a tester.
"If they're running say, four or five engines, and all four or five experience a stupid failure like a dipstick tube failure around the same time, that's really good data," Blake says. "You can say, 'What's weird about this installation — is it something the truck is causing, or is it something on the engine?'" and address the problem.
Size certainly is a factor. A single truck does not make for a scientifically valid test sample. Depending on the product and the test, you might want anywhere from five to 50 units devoted to the test.
"You need a certain size test sample," Tapp says. "Like a tire test — if you're not testing several hundred tires at a time, you're wasting your time. If you're not testing 20, 30, 40 trailers at a time, you're wasting your time. If I put one thing on one truck and had miserable results on one truck, that wouldn't be fair. But if I tested it on 20 trucks, and that looked favorable, I might want to expand the size of the test."
For instance, at CFI, they're in the process of testing aftermarket tire pressure monitoring systems to see which one might work best for their fleet.
"There were about three different manufacturers of tire pressure monitoring that we interviewed and we got them on a couple of trucks each," Stockton says. "We narrowed it down to one we think is a real candidate, and now it's time to put it on probably 50 trucks and see what it does. Probably within a year we'll be positioned to really understand and know what our return on investment could be."
Bob Wessels, retired from Caterpillar and now a consultant, notes that fleets need to be large enough to "put large samples 'in play' — i.e., 50 units to test an item or spec change idea — because of the complexity of the statistics and accuracy of results surrounding small sample sizes."
That's not to say it's impossible to do any sort of product evaluation within your fleet. For instance, at P.A.M., they decided to test a new style fifth wheel with an easy-pull feature. They put it on a single shuttle driver's truck that picked up and delivered trailers all day, every day. "We put it on her truck for a year, and it held up and she liked it," Tapp says. "We were going to pull it off (after the test) and replace it, but she said, 'If you pull that thing off I'm going to quit.'" The test was deemed a success.
Location also can be an issue when setting up a test. For instance, early testing of 2007 engines involved making sure there was a source of ultra-low sulfur diesel fuel. And during fleet testing of the engines, Detroit Diesel made sure the dealers and distributors in the areas and routes where the test trucks were running were the first to get training on the new engines.
"You try to pick the fleet so their operations are close to your field engineering office or your corporate offices so if there is some issue that takes place, you can immediately get your people out there to take a look at what went wrong," says Puradyn's Kroger.
Communication between the fleet and the supplier is key.
"I look for engaged management, engaged users, people that actually use the system," says Xata's Madsen. "I want someone with some enthusiasm, with some fire, who really wants to give me feedback."
You also need to be able to gather driver feedback, both for manufacturers and for your own evaluation. "Usually I pick guys I know will be honest with me and give me their true opinion," says Maverick's Newell. "You'd be surprised at how important they feel they are, and how much input they'll give you when they think they have a say-so on whether you're going to (buy) the product or not."
Schneider's Damman says while the ability to get instantaneous data via telematics helps in running tests, "you still need to get the driver's input. A lot of times you think you've got something the drivers are really going to like, and you find out you interpreted that wrong. That's something that telematics isn't going to give you."
Depending on the situation, suppliers or fleets or both may ask the other to sign a confidentiality agreement of some sort regarding the testing.
"Some fleets are better than others about keeping things confidential or publicizing it," Kroger says. "Those who aren't necessarily confidential with that information, they may get the product that is just about ready to hit the market. (Suppliers who) question whether the product is ready for the market, they give it to those guys who keep the information confidential."
Schneider's Damman notes that often, confidentiality agreements go both ways. "We don't want to put the effort and cost into a test program only to have our competitors get the data as soon as the evaluation is complete. You have to keep some of the information confidential so you can have at least a slight competitive advantage."
Sean Dorney, vice president of sales and marketing at GeoLogic, explains that when selecting a beta site for its products, "we look for a customer who is technology-savvy, has the resources to not only observe the product that is being tested but also has the ability to provide regular feedback and suggestions to us. Since our product goes on a truck, we also try to select a customer who has access to his vehicles every couple of days or so — we want to be able to get to the truck in a reasonable amount of time if necessary. We also prefer a customer who understands technology and is willing and able to be a testing ground for us — not all customers like the idea of testing new products."
If you enter into an agreement with a supplier to test a product, make sure both parties have a clear understanding of how the test will be conducted, and what the expected results are going to be within a set period of time.
"I'm a doubting Thomas," says P.A.M.'s Tapp. "Manufacturers have all these ROIs and stuff like that. I want to see what a product really does in real life, and in our operation."
(Jim Beach, technology editor, contributed to this article.)