Audi Details Piloted Driving Technology
Before autonomous vehicles make drivers obsolete, electronic systems will still depend on people to make decisions when something unusual happens. Under normal driving conditions, autonomous controls can pilot the vehicle, handing control back to the human when complex decisions are required.
Audi recently provided technical insight into its piloted vehicle project, in which an Audi A7 concept car drove from San Francisco to Las Vegas earlier this year. The vehicle drove itself for most of the journey, though drivers had to remain alert and take over whenever alerts directed them to resume driving.
The concept car has a range of computers in the trunk. Audi engineers plan to reduce them to a single board over time. The mainstays of the piloted vehicle technologies are an array of cameras, radar, and ultrasonic sensors that are controlled by what’s called the zFAS board. It combines sensor inputs to give the car its view of the world.
“All raw signals from the sensors are collected in a sensor fusion box,” Matthias Rudolph, Head of Architecture Driver Assistance Systems at Audi AG, said during the recent Nvidia GPU Technology Conference. “From that input, a virtual environment is created.”
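Audi hasn’t published that fusion code, but the basic idea can be sketched: detections from several sensors are merged, by proximity, into a single object list that serves as the car’s virtual environment. Everything below — the record layouts, the gating threshold, the confidence math — is a hypothetical illustration, not Audi’s implementation.

```python
# Hypothetical sketch of a "sensor fusion box": detections from several
# sensors are merged into one list of fused objects (the car's "virtual
# environment"). All names and thresholds are illustrative.
from dataclasses import dataclass, field

@dataclass
class Detection:
    sensor: str          # "camera", "radar", or "ultrasonic"
    x: float             # longitudinal position, meters
    y: float             # lateral position, meters
    confidence: float    # 0..1

@dataclass
class FusedObject:
    x: float
    y: float
    sources: list = field(default_factory=list)
    confidence: float = 0.0

def fuse(detections, gate=1.5):
    """Greedily merge detections that fall within `gate` meters of an
    existing fused object; otherwise start a new object."""
    objects = []
    for d in detections:
        for obj in objects:
            if abs(obj.x - d.x) < gate and abs(obj.y - d.y) < gate:
                # Average positions, accumulate evidence across sensors.
                n = len(obj.sources)
                obj.x = (obj.x * n + d.x) / (n + 1)
                obj.y = (obj.y * n + d.y) / (n + 1)
                obj.sources.append(d.sensor)
                obj.confidence = min(1.0, obj.confidence + d.confidence / 2)
                break
        else:
            objects.append(FusedObject(d.x, d.y, [d.sensor], d.confidence))
    return objects

env = fuse([
    Detection("radar", 42.0, 0.2, 0.7),
    Detection("camera", 41.6, 0.1, 0.8),   # same car, seen by two sensors
    Detection("ultrasonic", 1.2, -0.9, 0.6),
])
for obj in env:
    print(f"object at ({obj.x:.1f}, {obj.y:.1f}) m from {obj.sources}")
```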
Four semiconductors are the basis of the zFAS board. An Nvidia Tegra K1 processor collects data from four cameras and “does everything while driving at low speeds,” Rudolph said. An Infineon Aurix processor handles additional chores. Mobileye’s EyeQ3 performs vision processing, while an Altera Cyclone FPGA (field-programmable gate array) performs sensor fusion.
The software architecture is layered, with the perception sensor programs forming the first layer. Above that, there’s a fusion layer that blends data from the sensors with information from maps, road graphs, and other sources. Rudolph noted that combining inputs provides better information and increases confidence in the analysis.
“Radar is not good at determining the width of a car,” Rudolph said. “A camera does that well. If we fuse data from each of them we get good information on what’s ahead.”
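A standard way to express that complementary fusion is inverse-variance weighting: the sensor with the tighter estimate dominates, and the fused uncertainty drops below either input — the increased confidence Rudolph describes. The sketch below uses made-up numbers; Audi hasn’t disclosed its fusion math.

```python
# Illustrative inverse-variance fusion of one attribute (vehicle width).
# Radar gives a loose width estimate; the camera's is much tighter.
# Weighting each by 1/variance lets the better sensor dominate while the
# combined variance falls below both inputs. Values are made up.

def fuse_estimates(mean_a, var_a, mean_b, var_b):
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    mean = (w_a * mean_a + w_b * mean_b) / (w_a + w_b)
    var = 1.0 / (w_a + w_b)
    return mean, var

radar_width, radar_var = 2.6, 0.50    # radar: poor width observability
camera_width, camera_var = 1.9, 0.05  # camera: good width observability

width, var = fuse_estimates(radar_width, radar_var, camera_width, camera_var)
print(f"fused width: {width:.2f} m, variance {var:.3f}")
# fused width ~1.96 m: close to the camera, variance below both inputs
```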
Ensuring that the zFAS boards detect potential threats and respond to them correctly without false alerts is critical. If vehicles stop or swerve to avoid something that isn’t a true danger, drivers are likely to stop using the system.
“If the car brakes and nothing’s there, it will destroy the confidence of the driver,” Rudolph said. “We have had no false positives; that’s been proven with over 10,000 hours of driving at an average speed of 60 kph (37 mph) in situations including snow and freezing rain.”
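The article doesn’t say how Audi suppresses false positives, but a common guard in driver-assistance systems is to act on an obstacle only if it persists across several fusion cycles and is confirmed by more than one sensor modality. The sketch below illustrates that general pattern with assumed thresholds, not Audi’s disclosed logic.

```python
# Sketch of a common false-positive guard (not Audi's disclosed logic):
# brake only when an obstacle persists for several consecutive fusion
# cycles AND is confirmed by at least two sensor modalities.
from collections import deque

class BrakeGate:
    def __init__(self, frames_required=5):
        self.history = deque(maxlen=frames_required)

    def update(self, detected: bool, modalities: set) -> bool:
        """Return True only when braking is justified."""
        self.history.append(detected and len(modalities) >= 2)
        return len(self.history) == self.history.maxlen and all(self.history)

gate = BrakeGate()
for frame in range(6):
    # Radar and camera both report the obstacle from frame 1 onward.
    seen = frame >= 1
    brake = gate.update(seen, {"radar", "camera"} if seen else set())
    print(f"frame {frame}: brake={brake}")
# Braking is commanded only after 5 consecutive confirmed frames.
```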
Audi analyzes moving objects for their potential impact given the vehicle’s driving path and speed. Stationary objects, by contrast, are all viewed with a single goal.
“We look at static images as the same,” Rudolph said. “It doesn’t matter if it’s a wall or a parked car, we don’t want to hit it.”
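In planning terms, that class-agnostic policy reduces to a simple question: does anything static sit inside the driving corridor? A minimal sketch, with made-up corridor geometry:

```python
# Sketch of class-agnostic handling of static obstacles: the planner only
# asks whether the driving corridor is blocked, never what the object is.
# Corridor dimensions and scene contents are illustrative.

def corridor_blocked(static_objects, lane_half_width=1.8, horizon=60.0):
    """True if any static object lies inside the planned corridor,
    regardless of its class ("wall", "parked_car", ...)."""
    return any(
        0.0 < obj["x"] < horizon and abs(obj["y"]) < lane_half_width
        for obj in static_objects
    )

scene = [
    {"class": "wall",       "x": 35.0, "y": 0.4},
    {"class": "parked_car", "x": 20.0, "y": 3.5},  # outside the corridor
]
print(corridor_blocked(scene))  # True: the wall blocks the path
```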
Pedestrians are a major challenge for all types of autonomous systems. They’re harder to spot and categorize than vehicles, and they have more degrees of freedom. The system uses a single monocular camera to search for pedestrians. Given how erratically some people on foot behave, Audi doesn’t stop for them unless they’re truly in harm’s way.
“When we detect pedestrians, we compute the time to contact,” Rudolph said. “We’re close when the vehicle stops. We want to be close, just a few centimeters away. We do not want to stop far away.”
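The time-to-contact computation Rudolph mentions can be illustrated simply: the gap divided by the closing speed, with a braking point chosen so the car stops just short of the pedestrian. The deceleration and stop-margin values below are assumptions for illustration, not Audi’s calibration.

```python
# Minimal time-to-contact (TTC) sketch: distance over closing speed,
# with braking timed so the car stops a small margin short of the
# pedestrian. Deceleration and margin are assumed values.

def time_to_contact(distance_m, closing_speed_ms):
    if closing_speed_ms <= 0:
        return float("inf")    # not closing: no contact expected
    return distance_m / closing_speed_ms

def must_brake(distance_m, speed_ms, decel_ms2=6.0, margin_m=0.3):
    """Brake when the stopping distance plus a small margin reaches
    the remaining gap -- stopping close, not far away."""
    stopping_distance = speed_ms ** 2 / (2 * decel_ms2)
    return distance_m - margin_m <= stopping_distance

speed = 16.7   # ~60 km/h in m/s, the article's average test speed
for gap in (30.0, 25.0, 20.0):
    print(f"gap {gap:5.1f} m: TTC={time_to_contact(gap, speed):4.2f} s, "
          f"brake={must_brake(gap, speed)}")
```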
Though the piloted system aims to avoid pedestrians and most everything else, Audi realizes that collisions can’t always be prevented.
“If we can’t avoid an accident, we steer to use the structure of the car to minimize the chance of injury,” Rudolph said.
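Conceptually, that is a selection over candidate maneuvers, each scored by the expected injury risk of where the impact lands on the car’s structure. The maneuvers and risk weights below are entirely hypothetical; only the minimize-injury selection rule comes from the article.

```python
# If impact is unavoidable, pick the maneuver that presents the car's
# structure with the lowest expected injury risk. Maneuvers and risk
# weights are hypothetical, purely to show the selection rule.

maneuvers = [
    {"name": "hold lane",   "impact_zone": "head-on center",              "injury_risk": 0.9},
    {"name": "steer left",  "impact_zone": "front-right corner (crumple)", "injury_risk": 0.4},
    {"name": "steer right", "impact_zone": "front-left corner (crumple)",  "injury_risk": 0.5},
]

best = min(maneuvers, key=lambda m: m["injury_risk"])
print(f"choose: {best['name']} -> impact on {best['impact_zone']}")
```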
Such an action would occur mainly when the human driver didn’t take over in time to avoid a collision. Audi uses an LED alert system to tell drivers when they need to take charge; they can do that by hitting the brakes or making a sharp steering wheel movement. An inward-facing camera watches drivers so the system knows whether the LED alert needs to be augmented with an audible warning.
“In the piloted driving mode, we may need to get the driver back, so we need to know what he’s doing,” Rudolph said.
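The escalation logic the article describes — an LED alert first, an audible warning added when the interior camera sees a distracted driver, and takeover recognized by braking or a sharp steering input — can be sketched as a small decision function. The names and signature here are assumptions, not Audi’s software.

```python
# Sketch of the takeover-alert escalation implied in the article.
# Function name, inputs, and return shape are assumptions.

def takeover_alerts(driver_attentive: bool, braked: bool, steered: bool):
    """Return the alerts to raise and whether control has been handed back."""
    driver_took_over = braked or steered     # brake tap or sharp steering
    alerts = ["LED"]
    if not driver_attentive:
        alerts.append("audible")             # augment LED for a distracted driver
    return alerts, driver_took_over

print(takeover_alerts(driver_attentive=True,  braked=False, steered=False))
# (['LED'], False) -- attentive driver, LED alone should suffice
print(takeover_alerts(driver_attentive=False, braked=True,  steered=False))
# (['LED', 'audible'], True) -- distracted driver braked; control handed back
```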