Steering Toward Full Vehicle Autonomy
Radar scene emulation helps bridge the gap between simulation and real-world testing of autonomous systems for SAE Level 4 capability.
The vision of fully autonomous vehicles (SAE Levels 4 and 5) is fast approaching. Making this vision a reality requires automotive OEMs to move beyond the current levels of vehicle autonomy to deliver on the promise of highly efficient transportation systems, more driver freedom and improved passenger safety.
While road testing is a vital part of the development process, the cost, time, and challenge of repeatability make relying on real-world road testing alone unrealistic. Using this approach, it would take hundreds of years of driving for vehicles to become reliable enough to navigate urban and rural roadways safely 100% of the time.
Additionally, current in-lab test solutions deliver an incomplete view of driving scenarios and mask the complexity of the real world because of their limited horizontal and vertical field of view (FoV), minimum simulated distance, and number of echoes per simulated target.
Keysight Technologies recognized this problem in 2021 when it introduced its Radar Scene Emulator, enabling OEMs to lab-test complex, real-world driving scenarios while accelerating the overall speed of test. The company has continued to evolve the technology to close the remaining gaps. The full traffic-scene emulator combines hundreds of miniature radio frequency (RF) front ends into a scalable emulation screen representing up to 512 objects at distances as close as 1.5 meters (4.9 ft).
On the path to Level 4
Shifting testing of complex driving scenarios from the road to the lab accelerates the speed of testing. For OEMs that need to realize new advanced driver assistance system (ADAS) functionality, and for ADAS/autonomous driving (AD) developers who value safety first, the Radar Scene Emulator and Keysight's Autonomous Drive Emulation platform provide a high level of resolution.
Road testing of the fully integrated system within a prototype or road-legal vehicle enables OEMs to validate the final product before bringing it to market. This system is helping to advance the development of better-trained algorithms, with the goal of greater overall safety on the road.
The most recent data suggests self-driving cars could reduce traffic deaths by as much as 90% (Fig. 1). ADAS in current production vehicles has reached SAE L2 and L3, which in most traffic situations still require the driver to control the vehicle. Many OEMs and industry experts believe pushing further toward SAE L4 and L5 autonomy, where Level 5 represents vehicles requiring no human interaction at all, will make our roadways safer.
To achieve the next level in vehicle autonomy, many advancements are required. There will be massive investments in sensor technologies, such as radar, lidar, and cameras, which will continue to improve environmental scanning. Because each sensor type has its own advantages and disadvantages, they will need to complement each other to ensure the object detection process has the required built-in redundancy.
Huge investments in computationally powerful software algorithms are also necessary to fuse and process the large amounts of high-resolution sensor data, including vehicle-to-everything (V2X) communication inputs. Machine learning (ML) is the established method for training self-improving algorithms and artificial intelligence (AI). Those algorithms then make decisions to ensure safety in complex traffic situations. Training these algorithms with the most realistic stimuli available, in a repeatable and controlled fashion in the lab, is crucial to their accuracy and their safe deployment.
The testing/simulation gap
Today, a large amount of testing time is spent on sensors and their electronic control units (ECUs) by simulating environments in software, known as software-in-the-loop (SIL) testing, with road testing of the fully integrated vehicle reserved for final validation. Recreating a virtual world in the lab, with accurate rendering of the scenes plus real radar sensors and signals, will bridge the gap between simulation and road testing.
The challenge today is the emulation of full radar scenes, especially when the scenes are complex and have many variables. The goal is to thoroughly test all driving scenarios in the lab, even the corner cases, before bringing the vehicle to the test track or open roadways. Software simulation is used early in the development cycle, but it is ultimately an abstract view of the world, with imperfections and limitations.
Relying only on real-world road testing is also unrealistic because it would take millions of miles of driving for vehicles to become reliable enough to navigate urban and rural roadways safely 100% of the time. To truly test AV/ADAS functionality, all relevant parameters must be controllable.
To close the gap between real-world testing and simulation, real, physical sensors are needed in the test setup. This complexity must be added to the test to predict how AVs will behave on the road.
The vision is for technology to fully replace the human behind the wheel to enable reliable, accurate, and safe decisions on the road. Software simulation cannot fully test the real sensor response and testing on the track is not repeatable.
When emulating radar targets, several technology gaps currently exist:
- Limited number of targets and FoV: A common approach ties each simulated target to a delay line. Even when additional targets are added, only one radar echo is processed at a time. If an antenna array is used, it is not possible to simultaneously emulate targets at the extreme ends of the radar module's field of view. In addition, each movement of the antennas changes the echo's angle of arrival (AoA), which can introduce errors and loss of accuracy in rendering targets if not recalculated.
- Inability to generate objects at distances of <4 m: Many test cases, such as the New Car Assessment Program's (NCAP) Vulnerable Road User Protection — AEB Pedestrian, require object emulation very close to the radar unit. Most target-simulation solutions on the market today are designed for long distances.
- Lower resolution between objects: Until recently, target simulators could generate only one radar signature per object, which leaves gaps in scene detail. For example, on a crowded multi-lane boulevard, test equipment must accurately distinguish between all the traffic participants. With only one echo per object, the algorithm might not be able to tell the difference between a bicycle and a lamp post.
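To see why very short distances are hard for delay-line architectures, consider the round-trip delay a simulator must synthesize: t = 2d/c. The sketch below uses illustrative distances only (it is not tied to any specific product); it shows that a 1.5-m target corresponds to a delay of only about 10 ns, far shorter than what long-range delay lines are built for.

```python
# Round-trip delay a delay-line target simulator must reproduce
# for a given target distance: t = 2d / c.
C = 299_792_458.0  # speed of light in m/s

def round_trip_delay_ns(distance_m: float) -> float:
    """Return the radar round-trip delay in nanoseconds."""
    return 2.0 * distance_m / C * 1e9

# Illustrative distances: a close pedestrian, the ~4 m limit,
# and two long-range targets.
for d in (1.5, 4.0, 50.0, 150.0):
    print(f"{d:6.1f} m -> {round_trip_delay_ns(d):7.2f} ns")
```

A 4-m object already requires a delay of roughly 27 ns, which is why emulating sub-4-m objects demands a fundamentally different approach than a long delay line.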
New technology needed
Full-scene emulation in the lab is key to developing the robust radar sensors and algorithms needed to realize ADAS capabilities on the path to full vehicle autonomy. One method is to shift from an approach centered on object detection via target simulation to traffic-scene emulation. This enables the emulation of complex scenarios, including coexisting high-resolution objects, with a wide field of view and a reduced minimum object distance.
Real-life vehicles are in fact extended objects whose dimensions span multiple radar sensor resolution cells. As a result, radar sensors report multiple detections of these objects in a single scan. Object tracking algorithms are employed to identify and cluster these detections into objects. Realistic emulation of such objects by means of multiple radar echoes pushes the boundaries of those algorithms and enables correct identification of complex objects on the road.
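The clustering step described above can be pictured with a toy single-link grouping: detections whose spacing falls within a gate distance are merged into one object. This is an illustrative stand-in, not the algorithm used by any production tracker, and the coordinates and `gate` value are invented for the example.

```python
# Toy illustration (not a production tracker) of clustering multiple
# radar detections from one scan into extended objects, using a simple
# distance-threshold (single-link) grouping in Cartesian coordinates.
from math import hypot

def cluster_detections(points, gate=1.0):
    """Group (x, y) detections whose chain-wise spacing is within `gate` meters."""
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if any(hypot(p[0] - q[0], p[1] - q[1]) <= gate for q in c):
                if merged is None:
                    c.append(p)       # join the first nearby cluster
                    merged = c
                else:
                    merged.extend(c)  # p bridges two clusters: merge them
                    c.clear()
        clusters = [c for c in clusters if c]  # drop emptied clusters
        if merged is None:
            clusters.append([p])
    return clusters

# Five echoes from an extended object (a car ahead) plus one from a lamp post:
scan = [(0.0, 10.0), (0.5, 10.2), (1.0, 10.1), (1.5, 10.3), (0.7, 10.8),
        (6.0, 9.5)]
objects = cluster_detections(scan, gate=1.0)
print(len(objects))  # the six detections collapse into two objects
```

With a single echo per object, this clustering stage has nothing to work with, which is why multi-echo emulation matters for exercising it in the lab.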
The sensor's entire FoV must be covered to achieve high test coverage and run comprehensive test scenarios. A wide FoV is needed, ideally with RF front ends that are static in space, to enable reproducible and accurate AoA validation. Similarly, it is important to test the radar sensor's ability to detect objects' heights and traffic on multi-layered roads. As a practical example, an autonomous vehicle must not apply the brakes when another vehicle is crossing on an overpass; it must correctly detect the overpass and drive underneath it.
Realistic traffic scenes require the emulation of objects very close to the radar unit, for example at a stoplight where cars are no more than two meters apart, bikes might move into the lane, or pedestrians might suddenly cross the road. Passing such tests is critical for the safety features of an ADAS/AD system.
Object separation, the ability to distinguish between obstacles on the road, is another test area for a smoother and faster transition to L4 and L5 vehicles. For example, a radar detection algorithm will need to differentiate between a guard rail and a pedestrian while the car is driving on a highway.
Greater confidence in ADAS functionality
More targets, shorter minimum distance, higher resolution, and a continuous field of view are essential to realistic testing. In the lab, these capabilities increase test coverage, saving time while allowing test scenarios to be run and repeated safely.
A traditional radar target simulator (RTS) returns one reflection regardless of distance, whereas a radar scene emulator increases the number of reflections as the vehicle gets closer, a behavior known as dynamic resolution. In other words, the number of echoes per object varies with the object's distance.
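One way to picture dynamic resolution is to derive the echo count from the object's angular extent relative to the sensor's azimuth resolution: a nearby car subtends more resolution cells than a distant one. The model below is a simplified illustration, not Keysight's implementation; the vehicle width, `az_resolution_deg`, and `max_echoes` values are assumptions chosen for the example.

```python
# Illustrative model (not Keysight's implementation) of dynamic resolution:
# a 1.8 m-wide vehicle spans more azimuth resolution cells as it approaches,
# so the emulator should return more echoes at close range.
from math import atan2, radians

def echoes_for_object(width_m, distance_m, az_resolution_deg=2.0, max_echoes=16):
    """Echo count = object's angular extent divided by the azimuth resolution."""
    angular_extent = 2.0 * atan2(width_m / 2.0, distance_m)  # radians
    cells = angular_extent / radians(az_resolution_deg)
    return max(1, min(max_echoes, round(cells)))

# Echo count grows as the assumed 1.8 m-wide vehicle closes in:
for d in (100, 50, 20, 10, 5):
    print(f"{d:4d} m -> {echoes_for_object(1.8, d)} echo(es)")
```

Under these assumed parameters, the vehicle produces a single echo beyond about 50 m but roughly ten echoes at 5 m, which is the behavior the article describes: the scene gains detail exactly when the decision algorithms need it most.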
AD and ADAS software decisions must be based on the complete picture, not only on what the test equipment allows. New radar emulation technology is one more way to shift testing of complex driving scenarios from the road to the lab.
Silviu Tuca is the radar-based autonomous vehicle product line manager for Keysight Technologies. He is an expert in RF electronics, biophysics, and calibration technology.