Panasonic Augments Driver Safety with New HUD Tech
The 2.0 system’s simulated 3D provides an intuitive awareness of the situation and surrounding environment ahead of the vehicle.
At CES 2022 Panasonic Automotive Systems Company of America unveiled AR HUD 2.0 (Augmented Reality Head-Up Display 2.0), the first system to include a new, patented eye-tracking system (ETS). If you’ve ever thought about what exists beyond the limits of a HUD and the small rectangular box it displays on the windshield, welcome to the world of AR.
But don't confuse AR with virtual reality. VR is a space in which headsets or special glasses immerse the wearer in a 3D world that exists only inside the technology; it's increasingly used in automotive interior design. Panasonic's AR HUD, by contrast, expands that small viewing rectangle far beyond the windshield and out into the real world, placing information into your surroundings and enhancing your view of the world rather than replacing it.
For example, if it's raining and you can't see a traffic light, AR can add that stop light to your field of view. "2.0 is the patented ETS technology that identifies the individual driver's height and head movement behind the wheel and dynamically adjusts and compensates the images in the 'eye box,'" explained Andrew Poliak, CTO of Panasonic Automotive Systems Company of America.
Drivers constantly shift their heads and change their line of sight, but with parallax alignment and dynamic autofocus working together in the Panasonic HUD system, drivers see only accurately positioned, crisp, high-resolution overlays and icons. The driver's experience is powered by AI (Augmented Intelligence, a sub-category of artificial intelligence) navigation software, which matches the changing environment with 3D AR overlays, icons and mapping, providing an intuitive awareness of the situation and surrounding environment.
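The parallax compensation Poliak describes can be understood with simple geometry: when the driver's eyes move, an overlay drawn on the HUD's virtual image plane must shift along the new sight line or it will appear to slide off the real-world object it labels. The sketch below is purely illustrative, it is not Panasonic's implementation; the function name, coordinate convention and distances are assumptions.

```python
def overlay_x(eye_x, obj_x, obj_dist, img_dist):
    """Where to draw an overlay on the virtual image plane so it lines
    up with a real-world object along the driver's line of sight.
    Similar triangles: lateral positions (metres) are relative to the
    HUD centreline; distances are measured forward from the eyes."""
    return eye_x + (obj_x - eye_x) * (img_dist / obj_dist)

# A pedestrian icon 2 m to the right, 40 m ahead, drawn on a virtual
# image plane 10 m out. If the head shifts 5 cm left and the overlay
# is not redrawn, the icon visibly detaches from the pedestrian.
centered = overlay_x(0.0, 2.0, 40.0, 10.0)    # eyes on centreline
shifted  = overlay_x(-0.05, 2.0, 40.0, 10.0)  # head moved 5 cm left
```

An eye-tracking system feeds `eye_x` continuously, which is why per-driver head position matters: without it, the overlay is only correct for one fixed eye location.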
The 3D is simulated using a tilted, dual-image plane, providing 3D at what Panasonic claims is the cost of 2D. As Poliak explained: "Today we are just mapping the virtual world over the real world." Even in its latest iteration, AR HUD 2.0 is a benign technology. It is not connected to a vehicle's ADAS (advanced driver-assistance system). Instead, it enhances the driving experience and helps drivers make more intelligent decisions regarding navigation and safety related issues, the company said.
However, as the auto industry moves to more automated driving, Panasonic's technology will evolve to integrate with a vehicle's ADAS. What makes Panasonic's AR HUD 2.0 technology different? According to Hans Troemel Jr., the company's advanced engineering vision & sensing group manager, "We are unique in having experience right down to the component level with every technology required to design these systems. Our breadth and our depth are a big advantage over the competition."
Panasonic employs a user-experience group that surveys customers for answers to such questions as how much and what types of information customers want displayed, and when. "For example," Troemel continued, "driver distraction is a major safety issue today. Based on feedback we've received we are starting to enable machine learning functions in the vehicle so that the system learns what info the driver wants displayed and when. Reducing the cognitive load is critical to the driver. We can customize the experience for each driver. We've gotten a lot of good feedback from our user-experience groups telling us that once you experience this technology, you really don't want to go back."
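The personalization Troemel describes, learning which overlays each driver actually wants, could be as simple as scoring driver feedback per overlay type. The toy policy below is a hypothetical stand-in (class name, API and scoring rule are all assumptions; Panasonic has not published its approach), showing only the general idea of suppressing content a driver repeatedly dismisses.

```python
from collections import defaultdict

class DisplayPolicy:
    """Toy per-driver learner: hide an overlay type once the driver
    has dismissed it more often than accepted it. Unseen overlay
    types default to visible."""

    def __init__(self):
        self.score = defaultdict(int)  # overlay type -> net acceptance

    def feedback(self, overlay, accepted):
        """Record one accept (+1) or dismiss (-1) for an overlay type."""
        self.score[overlay] += 1 if accepted else -1

    def should_show(self, overlay):
        """Show while net feedback is non-negative."""
        return self.score[overlay] >= 0

# A driver who keeps dismissing point-of-interest icons but accepts
# lane guidance ends up seeing only the latter.
policy = DisplayPolicy()
policy.feedback("poi_icons", accepted=False)
policy.feedback("poi_icons", accepted=False)
policy.feedback("lane_guidance", accepted=True)
```

A production system would weigh context (speed, route phase, weather) rather than a flat count, but the principle, reducing cognitive load by pruning unwanted overlays, is the same.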