ZF Advancing ADAS, Autonomy with Next-Gen Sensors

ZF's OnTraX assist offers further capabilities with the addition of short-range radar, including lane change and city drive assist. (ZF)

Together with its many partners, ZF supplies camera and radar technology and advanced components for both the passenger car and truck markets, the latter being especially suited to the move toward more complex driver-assistance systems, according to Dan Williams, director of ADAS & Autonomy at ZF. “The business case in commercial vehicles—reduced driver hours of service, fuel cost reduction and safety—provides strong economic incentives to adopt ADAS/automated-driving technology. Additionally, the regulations placed on the industry will require our customers to utilize certain solutions,” he said.

Dan Williams, director of ADAS & Autonomy at ZF. (ZF)

ZF is working on both highly automated “revolutionary” systems and on “evolutionary” driver-assistance systems that are increasingly complex, he said, citing the supplier’s OnTraX lane keep assist that will launch in 2020 with its first major OEM customer. Williams spoke with TOHE at the recent NACV Show in Atlanta, and he’s scheduled to participate in a Commercial Vehicle Safety technical session at the SAE Government/Industry Meeting taking place January 22-24, 2020, in Washington, DC.

Which industry will lead with the integration of automation systems?

One very reasonable prospect might be passenger cars, which have a lot of scale and a lot of money to invest in the R&D that’s definitely required to develop these very expensive systems. But passenger cars have their own problems—they’ve got very diverse and sometimes very complicated duty cycles, or what we’d call operational design domains…The opposite extreme is off-highway, like with automated mining trucks and other [machines] in remote areas. All of these off-highway examples are very low volume, very particular to a given site—they require a lot of engineering without much volume. We would say that commercial vehicles are kind of the Goldilocks scenario for automation, where things are just right. There’s more concentrated commercial-vehicle activity in fewer specific use cases that are more simply automated. Two-thirds of our vehicles spend more than 95% of their time going straight down the highway at the speed limit, maintaining the lane. I don’t want to undersell that—that’s still a very difficult thing to automate, but it’s far easier than some of the very strange urban-environment scenarios that passenger cars can get themselves into.

ZF has partnered with Ibeo and ams to develop solid-state lidar sensors that provide complete 3D imaging of the vehicle's environment and a precise perception of complex traffic situations. (ZF and Ibeo)

What can we expect from ZF in the next year or two?

In 2020, we’ll be launching our next generation of sensors that will support increasingly complex ADAS functions. By that I mean these new camera and radar sensors will have longer range, a wider field of view and higher resolution. Taken together, those improvements will allow them to do any number of things—most significantly, to let us do a better job of detecting pedestrians and other stationary and semi-stationary objects. Apart from that, we’re working with NVIDIA and Ibeo on components that power even higher levels of technology and full automation. [In May 2019, ZF announced a partnership with ams and Ibeo Automotive Systems to develop solid-state lidar sensor technology.]

Can you elaborate on these next-gen sensors?

The next-generation radar is going to operate at a higher frequency, at 77 GHz [vs. the current 24-GHz sensor], and that will do a better job of detecting slow-moving, sort of stationary objects—it’s not really the ‘soft tissue’ as much as the ‘slow moving’ that causes problems. And the camera has kind of a dual mode of operation: a narrower field of view that extends longer for on-highway operation at high speed, and then the camera and the radar have a wider field of view that they can go into at slower speeds. The trade-off is shorter range, but at slower speeds you really don’t care. And then a new sensor we’ll be adding in this next generation is a short-range radar that can be placed on the side of the vehicle to detect bicyclists and pedestrians. That, in concert with the forward-looking sensors, gives us a more complete view not only in front of the vehicle but all the way around the sides.

Where do these next-gen sensors position you on the SAE levels of automation?

They’re evolutionary, really. I think we’re going to be very well positioned in 2020 for the L2 market, and we see that as a real sweet spot; we think that’s going to be around for a while. Drivers are going to be driving these vehicles for some time, and if we can use some of this technology to improve their productivity, to increase the safety, that’s going to have value for quite a while.

What’s the timeline for L4 autonomy?

That’s a tough question. Anything that I’d say would be further out than what you hear from Silicon Valley. [Laughter] There’s a lot of work to get this going in Arizona, New Mexico, Texas, and there’s going to be even more to get it beyond that into more challenging duty cycles, in snowstorms and rough weather up north. But it’ll happen, there’s no doubt it’s going to happen. It’s just a matter of when.

What are the challenges to get to L4?

A lot of it is just having the ASIL-qualified sensors, and that brings in redundancy. You can pile together a bunch of sensors that have a lower ASIL (Automotive Safety Integrity Level) to get to what you need for safety-critical functions. But that adds expense, and it’s not really an elegant solution. The sensor that really satisfies the functional safety requirements is basically lidar, and the old joke is that it’s two years out—and it’s been that way for the last five. There need to be some breakthroughs in highly reliable sensing technology to be able to do that.

What’s the role of powertrain in ADAS and autonomy?

Powertrain will be increasingly integrated into more complex ADAS functions on the way to autonomy. An easy example to describe is platooning. Everybody suspects that as we shorten the following distances with platooning, we can increase the fuel savings. When you shorten the following distances, you need to more tightly control the powertrain. You need knowledge of grade and the like fed into the powertrain. It’s a fairly difficult problem to smoothly start and stop these heavily loaded vehicles on a grade—that challenges powertrain control—and to have them, in an automated way, back up and very gently kiss the loading dock. That requires a lot of control.

What’s ZF’s position on cameras replacing mirrors on trucks?

We’d like to do it, for sure. We are working with our passenger car people who are involved in the regulatory affairs on this. We think it’d be a good step for the industry to be able to replace the mirrors with rearward-looking cameras. It’s maybe a little bit easier to do in Europe right now based on the regulation. We’ve got a demo going on with a platooning project over there that actually does exactly this. We’ve got what we call ‘wings’ that come out of the vehicle at the top of the cab, and that’s where we put the V2V (vehicle-to-vehicle) communication between the platooning vehicles; we’ve also got rearward-looking cameras in there. It’s obviously something where we’ve got to see regulatory change over here before that can happen.