Veoneer Shows Its Latest Safety and AV Tech
Collaborating with partners, the Swedish Tier 1 is innovating in key areas of sensing and systems integration.
Veoneer CEO Jacob Svanberg gazes at a large flat-screen display from a demonstrator van’s second-row seat. The screen shows hundreds of colored dots, including a dozen purple pixels pinpointing a black rubber tire lying flat on the pavement about 100 meters (328 ft) from the vehicle. “This is the first time I’m seeing this demonstration, and it’s super-impressive. Imagine the possibilities if camera and radar data were added to this high-resolution lidar data,” Svanberg said.
As advanced driver assistance systems (ADAS) morph into SAE Level 3 and higher levels of automation, sensing technology becomes ever more important. “Next-generation vehicles will have more, higher-quality sensors, more compute power, and more advanced systems to provide more features,” Svanberg said. That will require the smart fusion of cameras, radar and lidar. For Veoneer, a core focus is system integration and thinking beyond individual components, he explained.
Veoneer, spun off from Autoliv in 2018, is reportedly being prepped for sale by private equity firm SSW Partners, which acquired the company in 2022. Analysts said SSW could sell the entire business to one buyer or consider divesting the active-safety unit (radars, cameras and lidar) and the restraint-control systems unit (electronics for airbags and seatbelts) separately.
SWIR innovation
SAE Media accompanied Svanberg during a recent ride-and-drive program originating from the Sweden-based company’s Southfield, Michigan, offices. The experience involved several demonstrator vehicles, each equipped with current and/or future production technologies. In the lidar proof-of-concept demonstrator vehicle, detection of low-reflectivity objects (like the black tire) is a production priority.
The global safety systems Tier 1 is developing next-generation ADAS and autonomous-vehicle (AV) technologies. Partnering with Australian start-up Baraja, Veoneer looks to develop AV applications for Baraja’s patented Spectrum-Scan lidar technology. “By modulating the waveform of the laser, we can control the angle that the light comes out of the prism. This is all done in solid-state with no moving parts,” explained Matthew Landry, a Baraja field application engineer. He noted that the wavelength-tunable laser can provide up to 2,000 lines of discrete vertical resolution. The current leader in automotive high-resolution lidar, the Velodyne VLS-128, uses 128 laser beams.
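As a rough illustration of the wavelength-to-angle principle Landry describes, the Python sketch below maps a sweep of laser wavelengths to discrete vertical scan angles the way a dispersive prism would. The tuning range, field of view and linear dispersion model are assumptions for illustration, not Baraja specifications.

NUM_LINES = 2000             # discrete vertical lines cited above
WAVELENGTH_MIN_NM = 1510.0   # assumed laser tuning range (illustrative only)
WAVELENGTH_MAX_NM = 1570.0
FOV_DEG = 25.0               # assumed vertical field of view

def wavelength_to_angle_deg(wavelength_nm: float) -> float:
    """Toy linear dispersion model: longer wavelengths exit the prism at
    larger vertical angles. Real prism dispersion is not perfectly linear."""
    fraction = (wavelength_nm - WAVELENGTH_MIN_NM) / (WAVELENGTH_MAX_NM - WAVELENGTH_MIN_NM)
    return fraction * FOV_DEG - FOV_DEG / 2.0

# One wavelength per scan line, swept electronically with no moving parts.
step_nm = (WAVELENGTH_MAX_NM - WAVELENGTH_MIN_NM) / (NUM_LINES - 1)
scan_angles = [wavelength_to_angle_deg(WAVELENGTH_MIN_NM + i * step_nm) for i in range(NUM_LINES)]
print(f"{len(scan_angles)} lines from {scan_angles[0]:.2f} to {scan_angles[-1]:.2f} degrees")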
To complement its portfolio of camera-based products, Veoneer has been collaborating with Israel’s TriEye to develop a scalable shortwave infrared (SWIR) sensor. “Today’s sensing technologies can use additional support in heavy rain, thick fog, sun glare and other adverse weather conditions,” said Tobias Aderum, Veoneer’s director of research and innovation. To date, SWIR has only been used in military and aerospace applications and has not been feasible for automotive due to the high cost of indium gallium arsenide (InGaAs) and other exotic materials, added Ziv Livne, TriEye’s chief business officer.
That situation could change with what TriEye claims is a world-first solution: a silicon-based sensor that covers the SWIR spectrum — wavelength bands (typically 900-1700 nm) not visible to the human eye. TriEye’s imaging and accurate ranging occur simultaneously in one sensor system.
Sensor fusion
As new ways to address ADAS with exterior vehicle sensing move along the R&D path, Veoneer technology specialists also are investigating different approaches to in-cabin sensing. A demonstrator vehicle with a 60-GHz radar sensor packaged within the B-pillar can detect a pet, an adult or a child inside the vehicle via biometrics, such as breathing rate.
“The radar is sensing position, motion, Doppler frequency, angles and size, which is all information that can be used to determine where and what’s in the vehicle,” said Chris Van Dan Elzen, Veoneer’s executive VP for radar products. That detection capability provides a framework for writing algorithms to trigger an alert, such as when a child remains in the back seat after the rear doors have been closed. “We hopefully can stop fatalities, like those caused by a temperature spike with a child or a pet left in a locked vehicle,” Van Dan Elzen said.
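A minimal sketch of how breathing-rate detection from such a radar could work, assuming the sensor already delivers a chest-displacement signal: find the dominant frequency in the typical respiration band. The sample rate, band limits and simulated signal below are illustrative assumptions, not Veoneer’s algorithm.

import numpy as np

SAMPLE_RATE_HZ = 20.0        # assumed radar update rate
RESP_BAND_HZ = (0.1, 0.7)    # roughly 6 to 42 breaths per minute

def breathing_rate_bpm(displacement: np.ndarray) -> float:
    """Dominant respiration frequency of a chest-displacement trace, in breaths/min."""
    spectrum = np.abs(np.fft.rfft(displacement - displacement.mean()))
    freqs = np.fft.rfftfreq(displacement.size, d=1.0 / SAMPLE_RATE_HZ)
    band = (freqs >= RESP_BAND_HZ[0]) & (freqs <= RESP_BAND_HZ[1])
    return float(freqs[band][np.argmax(spectrum[band])]) * 60.0

# Simulate 30 seconds of chest motion at 18 breaths/min plus sensor noise.
t = np.arange(0, 30, 1.0 / SAMPLE_RATE_HZ)
trace = 0.5 * np.sin(2 * np.pi * (18.0 / 60.0) * t) + 0.05 * np.random.randn(t.size)
print(f"Estimated breathing rate: {breathing_rate_bpm(trace):.1f} bpm")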
Another Veoneer demonstration vehicle featured a graphical user interface display for 11 different metrics. “We can determine how a driver is performing in each of the metrics and provide pointers for improving,” Mitchell Pleune, Veoneer machine learning engineer, said of the in-development “driver scorecard.” Still in the concept stage, the product relies on several types of sensors, including a driver-monitoring camera, two 60-GHz radar sensors, two Ethernet cameras and stereo vision.
“All of the sensors are in production today,” Van Dan Elzen noted, “but this demonstrator vehicle is showing how we can bring all of these sensors together to do even more for the driver.”
ToF camera tech
Driver distraction, one of the scorecard metrics, uses the sensing system to show where the driver is looking throughout a driving trip. “If there’s a problem that’s happening in a specific location, like not stopping at a stop sign because a tree is partially blocking the sign, the camera sees that and we can show that problem area graphically on a map,” Pleune said. A driver can view the overall score for all 11 metrics, or a specific metric score. Focus groups have noted that the driver scorecard could be especially helpful for parents and guardians of teenage drivers.
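A hypothetical sketch of how per-metric and overall scorecard values might be combined follows; the article only states that 11 metrics are tracked, so the metric names, scores and unweighted averaging are assumptions.

from typing import Dict

def overall_score(metric_scores: Dict[str, float]) -> float:
    """Unweighted average of all per-metric scores (each 0-100)."""
    return sum(metric_scores.values()) / len(metric_scores)

trip_metrics = {
    "distraction": 82.0,            # the one metric named in the article
    "stop_sign_compliance": 74.0,   # placeholder names for the remaining metrics
    "harsh_braking": 91.0,
}
print(f"Overall trip score: {overall_score(trip_metrics):.0f}")
print(f"Distraction score: {trip_metrics['distraction']:.0f}")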
In-vehicle sensors also can provide body-location data, such as the positions of the driver’s head, shoulders, knees, elbows and wrists. That data can be used to tailor airbag deployment. The driver-monitoring camera typically runs at 10 frames per second. “If automatic braking is happening because of a detected impending crash, the system could increase to 100 frames per second and give that information — the occupant’s distance from the airbag — to the airbag restraint system’s algorithm every 10 milliseconds,” Pleune said.
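The frame-rate escalation Pleune describes can be summarized in a short sketch: at 100 fps the camera yields one occupant-position estimate every 10 ms for the restraint algorithm, versus every 100 ms at the 10-fps baseline. The function and field names are hypothetical.

NORMAL_FPS = 10    # typical driver-monitoring frame rate
CRASH_FPS = 100    # escalated rate when automatic braking predicts a crash

def frame_interval_ms(impending_crash: bool) -> float:
    """Milliseconds between occupant-position updates at the active frame rate."""
    return 1000.0 / (CRASH_FPS if impending_crash else NORMAL_FPS)

def on_frame(distance_to_airbag_m: float, impending_crash: bool) -> None:
    """Pass the occupant's distance to the restraint algorithm only during an
    impending-crash event (hypothetical interface)."""
    if impending_crash:
        print(f"restraint input: occupant {distance_to_airbag_m:.2f} m from airbag")

print(frame_interval_ms(False))   # 100.0 ms at 10 fps
print(frame_interval_ms(True))    # 10.0 ms at 100 fps
on_frame(0.45, impending_crash=True)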
Knowing an occupant’s position matters. For instance, if the occupant is leaning out the window, not deploying the airbag might be the preferred option. “Having supplemental information would help to develop a more customized experience for a specific incident,” he said.
Veoneer’s in-development Time of Flight (ToF) camera would serve as one such supplemental information provider. The camera sends pulses of light into the vehicle cabin and precisely measures their reflections off cabin objects to determine how far those objects are from the camera, Van Dan Elzen explained.
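The underlying time-of-flight relationship is simple: distance is the speed of light multiplied by the measured round-trip time, divided by two. A minimal sketch, with an illustrative pulse-return time:

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a reflecting object from a light pulse's round-trip travel time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A pulse returning after about 6.7 nanoseconds corresponds to roughly 1 m.
print(f"{tof_distance_m(6.7e-9):.2f} m")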
Active-safety restraint approaches could benefit from the ToF camera. “This really opens up a completely new area to explore when it comes to deploying airbags in a smarter way,” Svanberg said. The ToF camera and other technologies underscore the company’s solutions-based approach. “It’s what we’ve always done and it’s what we’ll continue to do,” he said.