Vehicle Sensors Go Longer Range

New sensors of all types look farther out and provide higher resolutions for engineers pushing ADAS capabilities and higher-level automation.

Automotive sensors generally are designed to identify objects at varying distances. (ZF)

Sensors are the frontline technology for advanced driver-assistance systems (ADAS) and future vehicles with high-level (SAE Levels 4-5) automation. Designers at all levels are working to find the optimal type and number of sensors and their ideal performance levels, capturing data from farther away while increasing field of view and resolution.

Tier 1s and OEMs have long leveraged different sensors – radar, cameras and lidar – to provide redundancy and gain insight into what’s around the vehicle. The first high-level automated vehicles for public use will be heavily laden with sensors as engineers and developers strive to prove that driverless vehicles can navigate safely. Automated taxis and shuttles are being loaded with input devices.

Radar can help cameras monitor front, sides and rear. (Texas Instruments)

“When we look at Level 4-5 robotaxis, ultimately they’ll be at 30 or more,” said Andy Whydell, VP of systems product planning for ZF. “Some of the challenging situations are slow movement when you’re maneuvering the vehicle.”

The latest generation of ADAS, mostly considered to be at the high end of the SAE’s Level 2 classification, generally uses several radars and cameras. Both the number of sensors and the technologies being leveraged will grow as automakers move to higher automation. Even suppliers of cameras and radar generally agree that most vehicles will move from a single sensor modality to multiple units of varied sensor types.

“For [the so-called] Level 2-plus ADAS, we are seeing five radar sensors being required to achieve the ADAS functions – two in the rear corner, two in the front corner and one in the front,” said Prajakta Desai, marketing manager for Texas Instruments mmWave automotive radar. “For Level 3, additional sensors on the side would be needed for 360-degree coverage. For Level 4 and beyond, we believe that all the sensing modalities (vision, radar and lidar) might be required to achieve fully autonomous driving.”

Sensors galore

Sensor developers are working to stretch sensing systems’ distance capabilities. (Continental)
Some sensors are using a new MIPI standard for data transfer. (STMicroelectronics)

Most front-facing sensing systems rely on a combination of sensor types. Though it’s possible to provide full autonomy with a single technology, combining cameras, radar and lidar provides some redundancy while also adding complementary sensing capabilities. “None of the available sensor technologies – be it camera, lidar, radar or ultrasonic – will be able to realize automated driving functionalities on their own,” asserted Arnaud Lagandré, VP of the ADAS Business Unit, Continental North America.

“Furthermore, we need to understand redundancy as sensors use different physical principles. Adding more forward-looking cameras will not help when you are driving directly towards the sunset, they will all be equally blinded. Considering the top three sensor technologies currently in use (camera, radar, and lidar), you would always need to have at least two different physical principles operational to ensure safe sensing of the environment in any complex driving scenarios.”
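Lagandré’s rule of thumb can be expressed as a simple availability check. The sketch below is a hypothetical illustration of that logic – the modality names and health flags are assumptions for the example, not any supplier’s actual software: automated operation continues only while at least two distinct physical sensing principles remain healthy.

```python
# Hypothetical sketch of the "two physical principles" rule described above.
# Modality names and health flags are illustrative assumptions, not a supplier API.

MODALITIES = {"camera", "radar", "lidar", "ultrasonic"}

def sensing_is_safe(health: dict) -> bool:
    """True only if at least two distinct physical sensing principles are still operational."""
    operational = {m for m, ok in health.items() if m in MODALITIES and ok}
    return len(operational) >= 2

# Example: forward cameras blinded by low sun, but radar and lidar still valid.
print(sensing_is_safe({"camera": False, "radar": True, "lidar": True}))   # True
print(sensing_is_safe({"camera": False, "radar": True, "lidar": False}))  # False -> degrade or hand back control
```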

Sensor suppliers are continuing to expand their capabilities, extending distances and improving resolution. Automakers utilize CMOS imagers like those in phones, but most automotive-grade cameras remain in the 1-2 megapixel range. However, higher-resolution automotive cameras are beginning to ship.

“The most widely discussed challenge within CMOS imaging sensors is to make the sensor with smaller pixel and higher performance targets, such as higher resolution, and to make sure they operate smoothly in extreme temperatures,” said Andy Hanvey, director of automotive marketing, OmniVision Technologies Inc. “As the level increases, the resolution increases to 8 megapixels in order to see farther distances.”
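Hanvey’s link between resolution and distance follows from simple geometry: for a fixed field of view, more pixels means each pixel covers a smaller angle, so an object of a given size remains detectable at a longer range. The sketch below works through that arithmetic; the field of view, object width and minimum pixel count are illustrative assumptions, not OmniVision figures.

```python
import math

def max_detection_range_m(h_pixels: int, hfov_deg: float,
                          object_width_m: float, min_pixels: int) -> float:
    """Range at which an object of the given width still spans `min_pixels` columns
    (small-angle approximation, ideal optics)."""
    rad_per_pixel = math.radians(hfov_deg) / h_pixels
    return object_width_m / (min_pixels * rad_per_pixel)

# Illustrative comparison: ~2 MP (1920 columns) vs ~8 MP (3840 columns) front camera,
# 60-degree horizontal FOV, 1.8 m-wide vehicle, 20 pixels assumed for classification.
for cols in (1920, 3840):
    print(cols, "columns ->", round(max_detection_range_m(cols, 60.0, 1.8, 20)), "m")
```

Under those assumptions, the 8-megapixel imager roughly doubles the distance at which the same vehicle can be classified, which is the effect Hanvey describes.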

Changing channels

Camera makers are increasing resolution for automotive-grade imagers. (Omnivision)
Sensors combine with satellite positioning inputs for the specific guidance requirements of most agricultural vehicles. (Deere)

Radar developers are adding more channels to boost resolution and increase sensing distance. When channel counts double, the number of virtual channels soars, giving systems greater ability to determine distance and identify objects. “Radars have gone from a 2x2 channel format to 4x4, so there’s four times higher resolution,” said Martin Duncan, ADAS division general manager at STMicroelectronics. “The four transmitters and four receivers operate independently and can communicate at different times, creating massive numbers of virtual channels.”
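The “massive numbers of virtual channels” Duncan mentions come from MIMO processing: when each transmitter’s waveform can be separated at the receivers, every transmitter-receiver pair acts as an independent virtual element, so the virtual array grows as the product of the two counts. A minimal sketch of that arithmetic, with the largest channel count included only as an illustrative imaging-radar example:

```python
def virtual_channels(n_tx: int, n_rx: int) -> int:
    """In a MIMO radar, each separable TX/RX pair forms one virtual receive element."""
    return n_tx * n_rx

# Moving from a 2x2 to a 4x4 front end quadruples the virtual aperture: 4 -> 16 elements.
for tx, rx in [(2, 2), (4, 4), (12, 16)]:  # 12x16 is an illustrative imaging-radar count, not a quoted part
    print(f"{tx} TX x {rx} RX -> {virtual_channels(tx, rx)} virtual channels")
```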

These higher-resolution sensors are critical for advanced ADAS and high-level automated vehicles, experts note. At highway speeds, they offer more time to determine what’s ahead, while at city speeds, the same radar devices can provide wider fields of view. “There’s a lot of growth for long-range, 4D high-resolution imaging radar – high-performance radars with more signal channels than you find in most mass-market passenger cars today,” Whydell said. “Our long-range radars have a distance of up to 350 meters. At high speeds, the beam is narrow; at slower speeds, energy shifts to a ‘flood’ mode with a wider field of view.”
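The value of that 350-meter reach is easiest to see as a time budget. The short calculation below converts detection range into available reaction time at a constant closing speed; the speeds are illustrative, not ZF test figures.

```python
def reaction_window_s(detection_range_m: float, closing_speed_kph: float) -> float:
    """Seconds between first detection and reaching the object at a constant closing speed."""
    return detection_range_m / (closing_speed_kph / 3.6)

# 350 m of radar range at typical highway closing speeds (illustrative values).
for v_kph in (110, 130, 160):
    print(f"{v_kph} km/h: {reaction_window_s(350, v_kph):.1f} s to react")
```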

Tiny is terrific

Engineers want the benefits that come from using multiple sensors, but design stylists don’t want sensors to mar their sleek exterior lines. Those competing concerns put pressure on sensor designers, making package size even more critical than it’s been in the past. That concern ripples out to wiring harnesses that connect sensors to controllers.

Radar module size has shrunk dramatically over the past several years, driven in part by declining prices and a shift to higher frequencies. A few years ago, 24-GHz modules that cost well over $100 were fairly common. Now, 77-GHz devices that cost in the realm of $50 are mainstream. That helps increase distance performance while trimming both size and cost. “Moving to 77 GHz gives you smaller packages so more can be squeezed in – people are also adding the antenna in the basic package,” Duncan said. “The circuit boards used for radar are expensive, so making them small reduces the price significantly.”
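Part of the size reduction is simple wave physics: antenna elements are typically spaced about half a wavelength apart, and wavelength scales inversely with frequency (λ = c/f). The sketch below shows the roughly three-fold shrink in element spacing when moving from 24 GHz to 77 GHz; the half-wavelength spacing is a common design rule of thumb, not a specific supplier’s layout.

```python
C = 299_792_458.0  # speed of light, m/s

def half_wavelength_mm(freq_ghz: float) -> float:
    """Typical antenna element spacing (lambda / 2) for a given radar carrier frequency."""
    wavelength_m = C / (freq_ghz * 1e9)
    return wavelength_m / 2 * 1000

# 24 GHz vs 77 GHz: element spacing drops from about 6.2 mm to 1.9 mm,
# so the same antenna array fits in a far smaller package.
for f_ghz in (24.0, 77.0):
    print(f"{f_ghz} GHz: {half_wavelength_mm(f_ghz):.2f} mm element spacing")
```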

Certified sensors with longer sensing distances will hasten autonomous-vehicle development for on- and off-road environments. (Volvo)
Andy Whydell, ZF’s VP of systems product planning. (ZF)

When packaging engineers devise ways to trim size and lower cost, system designers often clamor for more sensors. Small packages are easier for stylists to hide. Tiny packages also can be housed in areas like headlights and near rearview mirrors, where they’re less likely to get dirty or be covered by snow.

“We see a trend to integrate sensors in small spaces on vehicles, such as door handles or headlights, for better coverage,” Desai said. “Antenna-on-package technology enables an extremely small form factor that removes space constraints and allows sensor integration into new places, enabling newer applications and functions inside and outside the vehicle.”

Wiring is another critical issue that arises with the proliferation of sensors that generate large data streams. CAN doesn’t have the necessary bandwidth, while Ethernet requires bulkier and more costly cables. That’s prompted many engineering teams to turn to the MIPI Alliance, whose interfaces are used in many phone cameras: its MIPI A-PHY standard is gaining acceptance as an automotive sensor link.

“Most sensor transmissions are largely mono-directional, so you don’t need to go to Ethernet,” Duncan said. “A new protocol, MIPI A-PHY, can be quite efficient. It uses cheap cables; they’re about a third the cost of Ethernet cables.”
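The bandwidth gap is easy to quantify. A raw high-resolution camera stream runs into gigabits per second – orders of magnitude beyond CAN FD and beyond common automotive Ethernet grades – which is the space serializer links such as MIPI A-PHY target. The camera parameters below are back-of-the-envelope assumptions, not figures from the companies quoted; the link rates are nominal published values.

```python
def raw_camera_gbps(megapixels: float, bits_per_pixel: int, fps: int) -> float:
    """Uncompressed sensor data rate in Gbit/s."""
    return megapixels * 1e6 * bits_per_pixel * fps / 1e9

# Illustrative 8.3 MP front camera, 12-bit RAW output, 30 frames per second.
print(f"camera stream: {raw_camera_gbps(8.3, 12, 30):.1f} Gbit/s")   # ~3.0 Gbit/s

# Nominal link capacities for comparison.
links_gbps = {"CAN FD": 0.005, "100BASE-T1 Ethernet": 0.1, "1000BASE-T1 Ethernet": 1.0}
for name, rate in links_gbps.items():
    print(f"{name}: {rate} Gbit/s")
# MIPI A-PHY gears span multiple Gbit/s per link, which is the class of rate this traffic needs.
```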

Commercial vehicle challenges

Commercial vehicles share many of the same challenges as passenger vehicles, but they also have additional requirements. Reliability and longevity demands increase, and operating conditions in agriculture and construction are vastly different. Autonomous vehicles have operated in mining and some other closed environments for some time. In other off-highway fields, automated systems aid operators – but don’t replace them. In agriculture and construction, sensing ranges often are relatively short.

“We expect most of our challenges to come in close range and involve signal-blocking structures like trees, silos and storage sheds,” said Nancy Post, director, Intelligent Solutions Group at John Deere. “As a result, our next-generation receivers will have technology that mitigates most, if not all, of the scintillation and interference issues that are common today.”

Driving-related sensors on these vehicles often will be relegated to secondary status, because companies like Caterpillar and Deere have their own satellite-positioning technologies. “Our primary sensor is the satellite navigation system, which typically doesn’t have the same issues with inclement weather,” Post said. “During short periods where it may be affected, we can fuse the positioning information from additional sensors like inertial navigation systems, camera and imaging radar to bridge these gaps.”
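Post’s description of bridging short GNSS outages is, in its simplest form, dead reckoning: while satellite fixes are unavailable, the last known position is propagated forward from speed and heading supplied by the inertial sensors. The sketch below is a deliberately simplified illustration of that idea, not Deere’s fusion algorithm.

```python
import math

def dead_reckon(position, heading_rad, speed_mps, dt_s):
    """Propagate a 2-D position forward from speed and heading while GNSS is unavailable."""
    x, y = position
    return (x + speed_mps * math.cos(heading_rad) * dt_s,
            y + speed_mps * math.sin(heading_rad) * dt_s)

# Bridge a hypothetical 5-second outage at 3 m/s field speed, heading due east.
pos, heading, speed = (0.0, 0.0), 0.0, 3.0
for _ in range(5):
    pos = dead_reckon(pos, heading, speed, 1.0)
print(pos)  # (15.0, 0.0) -> resume correcting against GNSS once fixes return
```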

On highways, reliability over the long haul is an essential factor. Commercial vehicles have demanding environmental requirements that challenge suppliers’ capabilities. Getting sensors that can “see” far enough to safeguard vehicles hauling heavy cargo payloads is not easy.

“Sensors’ range is quite important for some applications, and we are always looking for sensors that are certified for safety purposes,” said Luca Delgrossi, head of technology, Volvo Autonomous Solutions. “Today, there are a few certified sensors and the distance they can cover is still relatively short. Lidar’s reliability is one of the factors that determines how fast we can drive under safe conditions. Different sensors can handle adverse weather conditions in different ways. It is important to understand specific sensors’ limitations and avoid running operations when the system cannot operate safely.”