New Roles for Lidar in Autonomy’s Lower Levels
Once considered a sensor necessity only in fully automated vehicles, lidar is infiltrating ADAS and safety suites as the driverless future continues to recede.
Laser-based light detection and ranging (lidar) sensors were until recently thought to be applicable only within the realm of fully automated vehicles (AVs). The spinning lidar “flowerpots” that grace the tops of AV prototypes, providing 360° point-cloud views and unmatched resolution, are considered a vital sensing component of the driverless future. As that future continues to recede in the face of technical and regulatory hurdles, lidar’s unique properties are migrating to lower-level advanced driver-assist systems (ADAS), with ADAS and other features projected to benefit from lidar’s acuity.
Typically part of a sensor suite that also leverages radar and cameras, lidar has unique characteristics that provide improved capability and redundancy. Unlike camera-based systems, lidar makes its own light and therefore requires no external illumination to operate. Compared to radar, its main advantage is improved precision. Most manufacturers consider 360° lidar a necessity for fully automated systems, helping provide complete real-time environmental awareness to safely operate a vehicle. But targeted lidar sensors already are being exploited to provide additional safety and conveniences apart from high-level autonomy.
Committing to ADAS
As the timetable for high-level autonomy stretches, many lidar developers have been caught in non-production scenarios. “There are about – and I lost count a long time ago – maybe 57 lidar companies out there, 20 of which are startups, and we're one of these,” explained Dr. Jun Pei, CEO of Cepton Technologies, a supplier of lidar solutions founded in 2016 and based in San Jose.
“Over the past years, tons of these companies were focused on autonomous vehicles, which never really happened,” Pei noted. “And now people are just looking for an alternative application. One of them is coming down to a lower level because the OEMs started to realize that lidar from the get-go is just an extra safety device to enable very low levels of automation, not necessarily a fully autonomous vehicle enabler.”
“We basically aimed for this ADAS project from the get-go, and all our products are designed for it,” Pei said of Cepton’s early commitment to targeted lidar applications. “It’s become clear to us: Lidar will be introduced in the mass automobile industry, but first as a safety device. It will bring the safety level to a higher standard, whether it's for moving, slow-going or fast-moving scenarios. There is a solid trend – people have realized that lidar as a sensor will not first be proliferated through a Level-4 vehicle. It will be proliferated through a Level-3 or even a Level-2.5-ish type of application.”
All about redundancy
According to Pei, as the industry accepted lidar simply as another safety device, it followed that it would be integrated into sensor portfolios. “It's the redundancy in the technology that propelled the OEMs to think of other things we can add into the sensor suite to make this big moving object, i.e. the car, safer than before,” Pei said. “If you have a camera, in order to defeat the camera, you cannot just have two cameras. That's not redundant because you can just have a heavy fog and none of the cameras will work.”
The complementary strengths of each sensor type widen the safety envelope in aggregate. “Then you put in radar,” Pei said. “It won't be affected by the fog and it will give you reliable information, with some limitations. I am actually a true believer that these three sensors, camera, radar and lidar – redundant sensors of complementary technologies – are going to be coexisting in the near future in all cars.”
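Pei’s point about redundancy can be made concrete with a back-of-the-envelope calculation. The sketch below uses purely hypothetical probabilities (the fog rate and per-modality failure rates are illustrative assumptions, not figures from Cepton or any OEM) to show why two cameras are not truly redundant when one condition disables both, while a camera plus radar is.

```python
# Illustrative sketch with ASSUMED numbers: duplicating one sensor type
# is not true redundancy when a shared condition (dense fog) disables
# every sensor of that type, but mixing complementary modalities is.

P_FOG = 0.05  # hypothetical chance of dense fog on a given trip

# Hypothetical probability that each modality fails *given* fog:
# fog blinds cameras entirely; radar is unaffected.
fail_probs = {"camera": 1.0, "radar": 0.0}

def p_system_blind(sensors, p_condition, p_fail_given_condition):
    """Probability that the condition occurs AND every listed sensor
    fails under it (failures are fully correlated per modality)."""
    p_all_fail = 1.0
    for sensor in sensors:
        p_all_fail *= p_fail_given_condition[sensor]
    return p_condition * p_all_fail

two_cameras = p_system_blind(["camera", "camera"], P_FOG, fail_probs)
camera_radar = p_system_blind(["camera", "radar"], P_FOG, fail_probs)

print(two_cameras)   # 0.05 -- fog blinds both cameras at once
print(camera_radar)  # 0.0  -- radar keeps working in fog
```

With a second camera, the system is blind exactly as often as fog occurs; adding a complementary modality instead drives the joint failure probability toward zero, which is the "diversity" argument in numerical form.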
Looking vs. touching
Rather than develop 360° lidar systems, Cepton focused on its forward-looking sensors. “What we have is a frontal-view lidar with a limited field of view. In layman's terms it works like a camera,” Pei explained. “These are the modalities that we adopted to design our sensor. Kind of like the headlamp, we put most of the sensitivity at the very front and it slowly tapers off to the side.”
According to Pei, imaging and radar sensors can ably serve driver-assist duties most of the time. When those two sensing types fail, however, the outcome can be dire. “If you have cameras and radars, you will actually be able to take care of 99% of the scenarios. Lidar is only for that 1%, or maybe only 0.1%,” Pei figured. “Unfortunately, that fractional percent, as you hear from many of the autonomous-vehicle companies or software developers, are the corner cases that actually kill people.”
The additional but essential capabilities derive from lidar’s light-based format. “Lidar has self-illumination, it doesn't worry about whether there's sunlight or that it's completely dark,” Pei said. “It measures the physical response from the object instead of using AI to get what it is. It sends a pulse and listens to the return pulse, so there's physical interaction between the object and your own sensor. It's the difference between looking and touching.”
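The “send a pulse and listen to the return” measurement Pei describes is time-of-flight ranging: the target’s distance is half the round-trip distance the light travels. A minimal sketch of that conversion (the example pulse timing is an illustrative value, not a Cepton specification):

```python
# Minimal sketch of lidar time-of-flight ranging: the sensor emits a
# laser pulse, times the echo, and converts round-trip time to range.

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_tof(round_trip_seconds):
    """Target range in meters from a measured round-trip pulse time.
    The light travels out and back, so range is half the path length."""
    return C * round_trip_seconds / 2.0

# An echo arriving ~1.334 microseconds after emission corresponds to a
# target roughly 200 m away.
print(round(range_from_tof(1.334e-6), 1))  # ~200.0
```

Because the measurement is a direct physical interaction rather than an inference from an image, it needs no ambient light and no learned model to yield a distance, which is the “touching” half of Pei’s looking-versus-touching distinction.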
The other crucial feature of lidar is its precision. “The angular resolution between lidar and radar is about one order of magnitude or higher for lidar. Both radar and lidar can see there's an obstacle in front, but lidar can see it much clearer,” Pei said. “Radar will tell you there is a car stopped 200 meters away from you, but it does not know in which lane. A lidar will tell you there's this car stopped, and it will tell you whether it's in your lane or on the shoulder.”
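Why an order of magnitude in angular resolution decides the lane question can be seen with simple trigonometry. The sketch below uses assumed resolutions (1° for radar, 0.1° for lidar; illustrative only, not sensor specifications) to convert angular resolution into cross-range position uncertainty at Pei’s 200-m example:

```python
import math

# Hedged sketch with ASSUMED angular resolutions: a ~10x finer angular
# resolution shrinks lateral position uncertainty at range by ~10x,
# which is what lets lidar place a stopped car in a specific lane.

RANGE_M = 200.0       # Pei's example: a car stopped 200 m ahead
RADAR_RES_DEG = 1.0   # illustrative radar angular resolution
LIDAR_RES_DEG = 0.1   # illustrative lidar resolution, one order finer

def lateral_uncertainty(range_m, res_deg):
    """Cross-range uncertainty (m) implied by angular resolution alone."""
    return range_m * math.tan(math.radians(res_deg))

radar_m = lateral_uncertainty(RANGE_M, RADAR_RES_DEG)  # ~3.5 m
lidar_m = lateral_uncertainty(RANGE_M, LIDAR_RES_DEG)  # ~0.35 m

print(round(radar_m, 2), round(lidar_m, 2))
```

Roughly 3.5 m of lateral uncertainty is wider than a typical highway lane, so radar alone cannot say which lane the stopped car occupies; at ~0.35 m, lidar can.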
The obvious next step for lidar is for OEMs to begin adopting the technology for near-term uses. Cepton has partnered with Japan’s Tier-1 headlight supplier Koito Manufacturing, one of the world’s largest OEM headlight providers, which led a $50 million Series C funding round for Cepton in February 2020. “We have a deep partnership with the biggest headlamp maker in the world,” Pei said. “Naturally, the intention is for our lidars to be buried inside the headlamps so you have a very nice front view with plenty of redundancy for safety reasons.”
According to Pei, working with its partners, Cepton intends to see its products integrated by OEMs looking for the additional capabilities and redundancies inherent in lidar sensing. He noted that in the near term, lidar-equipped vehicles will be readily available. “A few years from now, you'll see a car from a dealership that you can buy with our lidar inside. That's a huge deal for us. It's not some demonstration vehicle from an autonomous fleet run by a startup. It's actually a car in a dealership for everyday people.”
For the OEMs, as form factors shrink and costs decline, lidar becomes another viable sensor option. According to a spokesperson at FCA, it’s the rapid pace of lidar development that has most automakers considering the implementations: “The technology is advancing very quickly. FCA is studying lidar technology within the L3 parameters and beyond. Lidar delivers data that can help the automated driving system build a more detailed understanding of the vehicle's surroundings.”
Craig Stephens, director of controls & automated systems at Ford, said he sees similar opportunity for integration. “We remain interested in all sensor modalities. What the vehicle can perceive about its surroundings defines what it can do,” he explained. “Redundancy – having multiple sensors providing the same information; and diversity – exploiting measurement capabilities that provide different capabilities – are both factors in deciding which sensors we pick for a given application.”
“Advances in radar and particularly computer vision are exciting and there is a tremendous amount of ongoing innovation in lidar development, both from established companies and start-ups,” Stephens said. “That work is increasing performance for a given price point and so lidar remains under consideration for driver-assistance technologies.”
Where lidar will likely make swift inroads is in low-speed driver-assist features such as automatic parking and valet-like vehicle retrieval. With vehicle speeds greatly reduced in parking-lot environments, sensor-fusion demands, and therefore compute-power requirements and costs, drop dramatically. Speaking recently with AE on automated-parking systems, Alexander van Laack, vice president of sales, Faurecia Clarion Electronics, described the scenario that is likely to be consumers’ first brush with full autonomy.
“This is a low-speed level of autonomous driving because the car would drive across the parking lot with nobody inside. We will use cameras and ultrasound, but we also use the lidar sensors to detect moving objects, pedestrians, and so on,” van Laack explained. “With the integration of the lidars, the cameras and other sensors like ultrasound sensors, you have a good mix that you can leverage for understanding your surroundings, react very quickly and basically give you the eyes and ears.”