Current-generation lidar can detect the tilt of a pedestrian’s head that might indicate they are about to step off a sidewalk into traffic.

Photo: Waymo

We’ll call a human a truck driver after he or she completes a two-week training program, yet it takes tens of millions of miles and billions of dollars before we turn a computer loose at the controls of that same vehicle. Is driving a much more complex task than we imagine, or are computers just slow learners?

It’s a bit of both. Human drivers have the advantage of being able to easily adapt to situations. The computers that pilot trucks, on the other hand, literally have to be instructed through written rules in lines of code.

For example, a common maneuver such as letting a car pass the truck, then changing lanes to fall in behind it, doesn’t take a lot of thought for a human driver. But the computer must have written instructions on how to make such a decision.

“Trying to write that down in rules based on how many meters per second each vehicle is traveling is really hard,” says Brad Newman, the head of Waymo’s motion planning software engineering team. “It’s about interaction and intent.”

In Waymo’s case, Newman says, his team has drawn a lot of insight about understanding intent from the team at Google Home, an app that lets you control Google Nest, lights, cameras, speakers, and more in the home. (Waymo is a subsidiary of Alphabet Inc., the parent company of Google). 

“We’ve been able to use some of those kinds of concepts to reason about intent — not just trying to predict what will happen in the next 10 seconds, but what the [other party] wants to do. And then, how will they respond if I go in front, or if I wait and go behind?” Newman says.
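To make that concrete, here is a minimal sketch of intent-aware plan selection, with made-up maneuvers, predicted reactions, and cost numbers standing in for Waymo’s actual models:

```python
# A toy illustration of choosing between "go in front" and "wait and go
# behind" by predicting the other driver's reaction. All names and numbers
# here are illustrative assumptions, not Waymo's planner.
def predicted_reaction(other_car, maneuver):
    """Toy prediction: an assertive driver won't yield if we merge ahead."""
    if maneuver == "merge_ahead" and other_car["assertive"]:
        return {"hard_braking": True}
    return {"hard_braking": False}

def plan_cost(maneuver, reaction):
    cost = 5.0 if maneuver == "merge_behind" else 0.0  # time lost by waiting
    if reaction["hard_braking"]:
        cost += 100.0  # forcing another driver to brake hard is penalized heavily
    return cost

car = {"assertive": True}
best = min(("merge_ahead", "merge_behind"),
           key=lambda m: plan_cost(m, predicted_reaction(car, m)))
print(best)  # merge_behind: waiting is cheaper than provoking hard braking
```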

How autonomous trucks see the world is critical to safety. Sensors detect distance, direction, and speed. The artificial intelligence provides the context.

Photo: AEye

Human drivers hardly think twice about such a move. Yet their reasoning is based on imperfect data. The human driver can, at best, estimate the closing speed of the overtaking vehicle and make a “best guess” about whether there’s enough space and time to make the move. The computer driver, on the other hand, knows precisely how fast the car is traveling, and how many seconds — to a couple of decimal places — it will take the car to close the gap.
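The computation itself is trivial; a minimal sketch with made-up speeds and gap:

```python
# Time for an overtaking car to close the gap to the truck. The speeds and
# gap below are illustrative.
def time_to_close(gap_m, truck_speed_mps, car_speed_mps):
    closing_speed = car_speed_mps - truck_speed_mps
    if closing_speed <= 0:
        return float("inf")  # the car isn't gaining; the gap never closes
    return gap_m / closing_speed

# A car 40 m back at 31.3 m/s (~70 mph) overtaking a truck at 26.8 m/s (~60 mph):
print(round(time_to_close(40.0, 26.8, 31.3), 2))  # 8.89 seconds
```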

One might accuse the computer of overthinking the maneuver. The irony is the software engineers are, through artificial intelligence, trying to infuse the computer with “data-based intuition” rather than pure logic so that it behaves more like a human — but without the human propensity for error. 

Newman says Waymo’s latest generation lidar and camera systems provide enough resolution to enable the autonomous driving stack, called the Waymo Driver, to make educated guesses about the intent of a pedestrian or even another driver by analyzing the movement of limbs or the direction the face is pointing.

“Using what we call ‘key points’ on pedestrians, we can see which direction a person is facing, or whether they’re leaning toward the path of the vehicle or kind of leaning back and relaxed,” he says. “We can see signals that give us hints as to their intent, and we can use that information to get a much more nuanced driving experience around pedestrians.”

According to Waymo, the use of key points (which typically correspond to body joints) and pose estimation has been advancing in many industries for more than a decade, from animating cartoons and creating realistic video game characters to augmenting reality on popular social media apps. In the driving scenario, key points help the Waymo Driver understand and recognize partially occluded objects, such as just a leg or arm of a person stepping out of a vehicle or a person hidden between two vehicles, and reason about their next move.
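As a rough illustration of how key points can signal intent, consider estimating lean from 2-D joint positions. The joint names and the simple geometry here are illustrative assumptions, not Waymo’s method:

```python
import numpy as np

def lean_toward_road(keypoints, road_direction):
    """Signed lean score: positive means the torso tilts toward the roadway.

    keypoints maps joint names to (x, y) image coordinates; road_direction
    is a unit vector in image space pointing from the sidewalk into the lane.
    """
    shoulders = (np.array(keypoints["left_shoulder"]) +
                 np.array(keypoints["right_shoulder"])) / 2.0
    ankles = (np.array(keypoints["left_ankle"]) +
              np.array(keypoints["right_ankle"])) / 2.0
    lean = shoulders - ankles  # torso position relative to the feet
    return float(np.dot(lean, road_direction))

pose = {"left_shoulder": (102, 80), "right_shoulder": (118, 82),
        "left_ankle": (100, 200), "right_ankle": (112, 201)}
print(lean_toward_road(pose, np.array([1.0, 0.0])))  # 4.0: tilting toward the lane
```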

Visual Perception Technology

Nothing Silicon Valley has so far been able to assemble even comes close to the perceptual capability and situational-analytics potential of human eyes and brains. Our eyes can instantly focus on and discern objects at distances from a few inches to many miles. And our brains, through evolution and life-long learning, can interpret the visual input and determine pretty accurately what’s going on around us. Decisions are made based on interpretations of what the eyes see, and we don’t give it a second thought.

In lieu of eyes and brains, automated vehicles use an array of sensors to gather information on their surroundings, including cameras, radar, lidar (light detection and ranging), and thermal imaging or infrared sensors.

In some ways, these arrays are better than our eyes, because the combination can see in the dark and through fog or smoke, can accurately measure distance, and can even sense the body heat from an unseen animal or person standing along the road.

Recent advances in all these technologies can now provide accurate data to the autonomous driver software, from over 500 yards away to as close as a few feet, in bright sunshine or in total darkness. By fusing the input of cameras, radar, and lidar sensors, engineers can create an accurate representation of where the truck is on the road. The technology can also see far enough down the road, with sufficient resolution, to begin making decisions that could become critical a few seconds later — for example, in determining exactly what an object detected on the road ahead actually is, from a plastic bag to a crate or a shredded tire.
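One classic way to fuse overlapping measurements is to weight each sensor by its confidence. A minimal sketch of inverse-variance weighting for a single range estimate, with illustrative noise figures rather than vendor specifications:

```python
# Fuse two noisy range readings by inverse-variance weighting; the variances
# below are assumed values for illustration.
def fuse_range(lidar_m, lidar_var, radar_m, radar_var):
    w_lidar = radar_var / (lidar_var + radar_var)  # trust the tighter sensor more
    fused = w_lidar * lidar_m + (1.0 - w_lidar) * radar_m
    fused_var = (lidar_var * radar_var) / (lidar_var + radar_var)
    return fused, fused_var

# Lidar is tighter at this range (0.05 m^2 variance) than radar (0.5 m^2):
fused, var = fuse_range(212.4, 0.05, 213.1, 0.5)
print(round(fused, 2), round(var, 3))  # 212.46 0.045: the estimate leans on lidar
```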

Lidar works by sending pulses of laser light to determine the presence, shape and distance of objects, often in great detail.
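The arithmetic behind each return is straightforward time-of-flight:

```python
# Range from a lidar pulse: half the round-trip time multiplied by the speed
# of light.
C = 299_792_458.0  # speed of light, m/s

def range_from_pulse(round_trip_s):
    return C * round_trip_s / 2.0

# A return arriving 1 microsecond after the pulse leaves the emitter:
print(round(range_from_pulse(1e-6), 1))  # 149.9 m
```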

“Lidar really shines at distances of 150 meters and beyond,” says Andrew Nelson, vice president of trucking platforms for lidar supplier AEye. “That distance provides some margin for making the right decision about whether to slow down or make a lane change when an object is detected.”

Continental's HRL131 high-performance solid-state lidar mounts on the roof of the tractor facing forward. Additional sensing devices, including radar, a camera, and short-range lidar, provide the rest of the picture.

Photo: AEye

Nelson says AEye uses a software-defined architecture for its lidar that can vary its field of focus depending on the speed of the truck and the conditions. At slower speeds, the field of view is 120 degrees horizontal and 25 degrees vertical, so everything close to the truck is visible. At higher speeds, the lidar is focused down to 15 degrees horizontal and 5 degrees vertical. At 300 meters, that’s about a 20-meter height, so the system can still measure the height of an overhead structure, for example, or recognize a vehicle stopped on the shoulder or in a traveled lane, even on a curved road.
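In code, that speed-dependent switch could look something like the sketch below; the threshold is an assumed value and the presets simply mirror the figures above, not AEye’s actual parameters:

```python
# Software-defined field of view keyed to truck speed. The 15 m/s threshold
# is an assumption for illustration.
def select_fov(speed_mps):
    """Return (horizontal_deg, vertical_deg) for the lidar scan pattern."""
    if speed_mps < 15.0:   # city and yard speeds: wide view, everything up close
        return (120.0, 25.0)
    return (15.0, 5.0)     # highway speeds: narrow view, long range

print(select_fov(8.0))   # (120.0, 25.0)
print(select_fov(28.0))  # (15.0, 5.0)
```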

“In our case, the same lidar can shift from one use case to the next,” Nelson says. “You can think of it as a camera auto-focusing and changing from a wide to a narrow field of view.

“This is not technology for technology’s sake,” he adds. “It’s actually a functional use case that trucking platforms currently use, and they don’t want to pay for any more lidar than what’s absolutely needed.”

AEye’s 4Sight Lidar is solid state, meaning there are no revolving components as found on traditional lidar units. It also transmits and receives on separate channels, which the company says helps reduce interference from other light sources, such as the sun when it’s low in the sky.

Another claimed advantage to solid-state lidar is the possibility of increasing the frame rate from 10 Hz (frames per second) to 15 or 20 Hz. 

“With no rotating mechanism, we don’t have to worry about centripetal forces or vibration when increasing the speed of the rotating mass,” Nelson says.
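The practical payoff is the distance traveled between frames; a quick calculation at highway speed:

```python
# Distance the truck covers between successive lidar frames at ~65 mph.
def meters_per_frame(speed_mps, frame_rate_hz):
    return speed_mps / frame_rate_hz

speed = 29.1  # about 65 mph, in m/s
for hz in (10, 15, 20):
    print(f"{hz} Hz: {meters_per_frame(speed, hz):.2f} m between frames")
# 10 Hz: 2.91 m, 15 Hz: 1.94 m, 20 Hz: about 1.5 m
```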

Other companies are also producing solid-state lidar, but rotating lidar sensors are still widely used, especially in close-range applications such as the sides and rear of the truck.

Cameras and Radar

With the resolution of lidar improving almost monthly and costs coming down dramatically, it’s becoming the primary forward vision source on many autonomous truck platforms. But cameras and radar are also improving. Smaller and more sensitive camera sensors are emerging, allowing for better imaging and comparisons with the lidar images “seen” by the computer. Radar, of course, isn’t really optical, and it does a poor job of resolving objects as images. But it’s unsurpassed at measuring distance and speed, which is a necessary part of the fused suite of “vision” technologies used in on-highway autonomous truck demonstrations.

But what about low-speed operations on essentially closed courses or in controlled spaces, such as container ports?

Cameras, radar, and lidar work together to provide a 360-degree view of the truck’s surroundings. Data can be filtered to prioritize certain areas of the truck from a wide-angle view at low speeds and long, narrow view at highway speeds.

Photo: Waymo

Lidar isn’t needed in these applications, says Ognen Stojanovski, co-founder and chief operating officer at Pronto.ai.

“We don’t use lidar because of the problems and the cost of maintaining accurate high-definition maps,” he says. “Our system is very software-heavy, artificial-intelligence-heavy and hardware-light.”

Pronto’s main sensor is a camera. Radar is used in some cases, but you won’t find lidar on a Pronto truck. The company isn’t targeting the Level 4, driverless, long-haul market that is the focus of many other autonomous-truck developers, but instead other more limited use cases that are commercially viable, scalable, and can be deployed today.

Stojanovski sees immediate deployment opportunities in places such as ports, where drivers’ time is wasted in long lines at the gates, or in drayage operations on short routes, and possibly on private roads, moving containers from the cranes to the railhead, for example. That sort of use case may not be as sexy as long-haul, but he says there’s incredible inefficiency in those operations that could be at least partially resolved by taking the driver out of the truck.

Port-Queuing Demo

The Virginia Tech Transportation Institute partnered with Pronto to demonstrate how automated driving systems, or ADS, could be safely deployed in port queuing at the Port of Oakland, California. Following a four-month period of testing, evaluating and tweaking, the demonstration project went live with a week of live-streaming video from the Pronto truck delivering containers around the port.

The Department of Transportation and the Federal Motor Carrier Safety Administration, which were involved in the project, concluded in a research brief that “the Pronto system operated flawlessly negotiating heavy traffic and intersections.”

Pronto’s camera/GPS equipped truck has a light load on the roof. The camera is positioned in the center of the array between two precision GPS receivers.

Photo: Jim Park

However, the demonstration was not without hitches.

“Initially, the ADS system was already proficient at traversing the routes of the different queues but was unable to handle the speed and the hostility of other drivers,” the report notes.

The system was a bit more conservative than human drivers and was slow to resume moving as the line progressed. This opened gaps in the lines where other drivers would try to cut in. 

“If the ADS-equipped vehicle was driving too slow at restarting motion when the queue started moving, it would be a target of aggressive honking and yelling by human drivers,” notes the brief.

The base algorithms were adjusted to reduce the transition time from stopped to moving, and the adaptive cruise control following distance was adjusted to maintain a tighter gap behind the vehicle ahead. After that, everyone got along just fine.
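A minimal sketch of the two tuning knobs the report describes; the parameter names and values are illustrative assumptions, not Pronto’s configuration:

```python
# Queue-behavior tuning: restart promptly when the leader moves, and hold a
# tighter headway so other drivers can't cut in. Values are assumptions.
QUEUE_TUNING = {
    "restart_delay_s": 0.8,  # reduced lag before moving once the leader moves
    "acc_time_gap_s": 1.2,   # tighter following gap, in seconds of headway
}

def following_distance_m(speed_mps, time_gap_s):
    """Convert a time-based headway into meters at the current speed."""
    return speed_mps * time_gap_s

# Creeping through the gate queue at 3 m/s:
print(following_distance_m(3.0, QUEUE_TUNING["acc_time_gap_s"]))  # 3.6 m
```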

The truck navigated the port roads and lineups using just cameras, radar and GPS, following predetermined routes on a breadcrumb trail.
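Breadcrumb following itself reduces to waypoint bookkeeping plus a path-tracking controller. A minimal sketch of the waypoint-selection half, under assumed coordinates:

```python
import math

# Pick the next breadcrumb to steer toward: the first recorded point at least
# a lookahead distance away from the truck. Real systems feed this into a
# path-tracking controller such as pure pursuit.
def next_waypoint(trail, position, lookahead_m=10.0):
    for wp in trail:
        if math.dist(wp, position) >= lookahead_m:
            return wp
    return trail[-1]  # end of the recorded path

trail = [(0, 0), (5, 0.2), (10, 1.0), (15, 2.5), (20, 4.5)]  # meters
print(next_waypoint(trail, (3.0, 0.1)))  # (15, 2.5)
```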

The Pronto truck navigates a work zone following a GPS breadcrumb trail laid out by the construction crew. The truck will follow exactly the same path each time.

Photo: Jim Park

“We call it computer vision, meaning we use the camera as the main and often only sensor to feed our neural networks and run our machine-learning models,” says Stojanovski. “The other sensors include communications technology to be able to get instructions, and also very accurate GPS with centimeter-level precision.”
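A skeletal version of that camera-first loop might look like the following; the model stub, names, and gain are illustrative, not Pronto’s software:

```python
# Camera frame in, steering correction out, with GPS supplying pose. The
# fake_model stand-in is an assumption for illustration.
def fake_model(frame):
    """Stand-in for the neural network; returns a lane-centering error in meters."""
    return {"lane_offset_m": 0.12}

def drive_step(frame, gps_fix, model=fake_model):
    detections = model(frame)  # the camera is the main, often only, sensor
    steer = -0.4 * detections["lane_offset_m"]  # simple proportional correction
    return {"steer": steer, "position": gps_fix}

print(drive_step(frame=None, gps_fix=(37.7955, -122.2797)))
```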

Automated Driving Systems in the Real World

Most of the automated trucks we read about are operating in the sunny Southwest, where weather is seldom a concern. That might easily lead one to think the lidar-radar-camera vision suite used in those applications will be the leading candidate going forward. However, this past winter, Pronto successfully tested its lidar-less vision system on a fleet of trucks operating between Calgary and Edmonton in Alberta, Canada. The weather was far from perfect, yet the trucks ran every day.

Pronto has tested trucks in 49 of the 50 states (not Hawaii) and seven Canadian provinces. It has even run a trip totally under ADS control from San Francisco to Prudhoe Bay, Alaska, using just cameras and GPS — without the benefit of high-definition maps.

Systems using cameras rather than lidar see the world much as humans do. Direction comes from GPS and object detection, and if equipped, from radar.

Photo: Pronto

“We wanted to test our technology and know that we had a pathway to deploying it in those environments,” Stojanovski says. “You might build something perfect for Arizona, but if it doesn’t work elsewhere, you’re pretty limited.”

He sees great potential for commercializing Pronto’s product in remote locations — mining, logging, oil and gas exploration and more — where traditional roads are nonexistent and drivers are very difficult to find.

But history shows us that real innovation usually follows in the steps of competing technologies. Take two-stroke versus four-stroke engines, for example, or gasoline versus diesel fuel. Sometimes clear winners emerge, but often a melding of two or more technologies ultimately serves us best. And of course, any technology is constantly evolving.

Waymo’s Newman points to improvements in radar and camera technology. These aren’t the spinning radars that show you a little blip on a screen. He describes them as very long-range imaging radars that provide a ton of dense, very precise information about the world.

Radar doesn’t provide an accurate image of the road ahead, but it’s very good at object detection, including identifying the speed of the object and the distance from the truck.

Photo: Waymo

“We also have added some new sensors,” he says. “The fifth-generation system now has thermal cameras, which can see not just in color, but can also pick up heat signatures in an image. We have also seen big jumps in the lens-cleaning systems. We’ve been able to kind of improve the quality and consistency of those systems overall.”

AEye’s Nelson has been working with lidar and cameras in the ADS space for 18 years. He’s familiar with the technology and confident in its capability. When it came to applying what he knows to a seemingly basic task for a truck driver — keeping a truck within the lane in a curve — he was surprised how challenging that was for an automated driving system.

“Keeping a 53-foot trailer centered in a lane in a 10- to 15-mph crosswind is, well… just let me say drivers are very well-skilled,” he says. “Everybody worries about trucks drifting out of their lane, but we have to keep it plus or minus eight centimeters (about 3 inches) from center. With tolerances like that, accounting for wind and all that is absolutely critical.”

Lane centering that precise can also be accomplished using high-precision GPS.
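As a simplified illustration of that control problem, here is a proportional-derivative correction on the lane-center error; the gains and units are assumptions for illustration:

```python
# Steering correction from lane-center error: the basic shape of a lateral
# controller fighting a crosswind. Gains are illustrative, not tuned values.
def pd_steer(lane_error_m, error_rate_mps, kp=0.8, kd=0.4):
    return -(kp * lane_error_m + kd * error_rate_mps)

# 6 cm right of center and drifting right at 1 cm/s in a gusting crosswind:
print(round(pd_steer(0.06, 0.01), 3))  # -0.052: steer gently back toward center
```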

As with the zero-emissions technologies currently bursting into the market, lidar-equipped and lidar-less trucks will likely both find a place in the market. And these are still early days for ADS technology, especially with the rapid advances we’ve seen in the past few years.

What Will The Next Five Years Bring in Autonomous Tech?

“I’d be surprised if in our lifetime we ever get to fully autonomous trucks — no steering wheel, no clutch, no brake,” Nelson says. “I don’t see a path for that to happen. But we can get highly automated trucks. And within five years, we’ll be able to use those on interstate or uncontrolled environments.”

He also stressed the need for top-of-the-line equipment that is well-designed and well-built. There’s a lot of margin in selling lidar today, and lots of companies are doing it, he says, “but the path to getting it on trucks and being robust is going through an experienced Tier 1 automotive supplier that understands the commercial market.”

Nelson says AEye plans to begin production on a million-mile lidar system for trucks in 2024 through a licensing arrangement with Continental in Germany.

Trevor Milton, the founder and former CEO of Nikola, once told HDT that the trucking industry expects a windshield wiper motor to last at least a million miles. While he had other missteps, Milton got that much right — trucking will never settle for technology that won’t survive the rigors of the road.

How Can AI in Autonomous Trucks Mimic Human Interactions?

You can bring all the technology in the world to bear on the problem of steering and gearing driverless trucks, but they don’t operate in a driverless world. Interactions with human drivers occur constantly — and if the examples from the Port of Oakland are any indicator, there’s still some work to be done.

Human drivers are conditioned to expect other drivers to perform in “predictable” ways, though the exact reactions to certain situations aren’t always predictable or pleasant. Lane cutters, for example, usually incur the wrath of the cuttee. Drivers who are too slow or over-cautious get their share of dirty looks. That means ADS-controlled trucks are going to have to blend in pretty seamlessly, lest they be subject to a chorus of blaring horns and single-finger salutes.

“That’s probably one of the top issues that my team is working on right now,” says Andrew Nelson, vice president of trucking platforms for lidar supplier AEye. “It’s about how to manage those nuanced, contextual traffic interactions.”

It’s not just about waiting for there to be a big enough spot and moving over, but actually signaling intent and creating gaps. “That’s where some of our latest machine learning approaches and planning have really helped,” he says.

Because ADS trucks function with a high degree of predictive analysis, they will be “expecting” human drivers to perform in more or less predictable ways. It’s probably fair to say the machines have their work cut out for them.

About the author
Jim Park

Former HDT Equipment Editor

Jim Park served as Heavy Duty Trucking's equipment editor from 2006-2024. Specializing in technical and equipment content, Park is an award-winning journalist who has been covering the trucking industry since 1998.
