Autonomy Testing: Simulation to the Rescue

Thomas Kil, a Mobis North America simulation and data engineer, does simulation analysis work as part of the ongoing development and testing of DDREM.

Engineers like a good challenge, and the development path for autonomous technologies is putting logical and creative thinkers to the ultimate test.

Simulated environments are helping Mobis engineers and researchers with the design and testing of DDREM. Zachary Obsniuk, a Mobis North America intern, is seated at a simulation-environment workstation.

“Use-case and endurance testing—although highly relevant—are not enough to get to the confidence level that’s needed before going into production with a highly complex automated driving system. When you’re trying to hit a metric that’s 500 million to a billion miles of accident-free driving, you just can’t drive that many miles, even in a decade. Challenges haven’t stopped engineering progress in the past, and they aren’t stopping progress now,” David Agnew, Director of Advanced Engineering, Autonomous Vehicle Research for Mobis North America, asserted in a recent interview with Automotive Engineering.

Automated mobility for Mobis

The first novel autonomy technology from Mobis North America’s advanced engineering group, formed in late 2015, will be a safety-specific system known as the Departed Driver Rescue and Exit Maneuver (DDREM). The system is being developed to a one-in-50,000-mile accident-free metric: “That’s four orders of magnitude easier than doing a full autonomous-vehicle metric. We think that it’s a small enough problem that Mobis and out-of-the-box thinking can solve it,” said Agnew.
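As a rough check of that comparison (assuming the 500-million-mile figure quoted earlier is the intended reference point):

500,000,000 miles / 50,000 miles = 10,000 = 10^4, i.e., four orders of magnitude.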

Mobis North America's David Agnew.
Metamoto's Chad Partridge.

A production-ready DDREM system is expected in the 2020-2021 timeframe. The system will detect when a driver is no longer operating the vehicle, then initiate SAE Level 4 autonomous control to guide the car to a safe stop. A driver who has fallen asleep will trigger activation; engineers are also considering other triggers, such as a heart attack or another severe medical emergency.
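Mobis has not published DDREM’s internal logic, but the activation flow Agnew describes (monitor the driver, confirm a departed-driver trigger, hand control to the Level 4 stack, bring the car to a safe stop) can be pictured as a simple state machine. The sketch below is purely illustrative; the state names, trigger signals, and transitions are assumptions made for the example, not Mobis design details.

from enum import Enum, auto

class DdremState(Enum):
    MONITORING = auto()         # driver assumed to be in control
    TRIGGER_SUSPECTED = auto()  # possible departed-driver event detected
    L4_TAKEOVER = auto()        # autonomous stack controlling the vehicle
    SAFE_STOP = auto()          # vehicle brought to a stop

def next_state(state, driver_attentive, trigger_confirmed, vehicle_stopped):
    """Advance the hypothetical DDREM state machine one step.

    driver_attentive:  True if driver-monitoring signals look normal
    trigger_confirmed: True if a departed-driver event (sleep, medical
        emergency, etc.) has persisted long enough to act on
    vehicle_stopped:   True once the car has reached a safe stop
    """
    if state is DdremState.MONITORING and not driver_attentive:
        return DdremState.TRIGGER_SUSPECTED
    if state is DdremState.TRIGGER_SUSPECTED:
        if driver_attentive:
            return DdremState.MONITORING      # false alarm, keep monitoring
        if trigger_confirmed:
            return DdremState.L4_TAKEOVER     # hand off to Level 4 control
    if state is DdremState.L4_TAKEOVER and vehicle_stopped:
        return DdremState.SAFE_STOP
    return state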

Currently, 18 Mobis engineers, roboticists, and computer science specialists working at the firm’s Plymouth, Michigan, facility are tasked with developing DDREM. In the coming months, the team will be expanded to approximately 25 technology experts.

“There is a definite overlap of work between our two advanced engineering teams. The design team is working to develop DDREM to our performance requirements, while the testing team is working to figure out how we’re going to analyze, validate, and build confidence that the design will meet the performance requirements,” said Agnew.

DDREM’s development effectively underscores the challenges associated with testing and validating a highly-complex autonomous system.

“To put a car into production right now, it’s pretty straightforward in terms of testing,” said Agnew, noting that use-case physical testing is standard practice across the industry. “You verify product designs for seatbelts, transmissions, lighting and other systems with specific tests. But all of the tests for those in-production systems have already been verified. That’s just not the case with autonomous technologies.”

Progress in developing self-driving vehicles has been substantial since the Defense Advanced Research Projects Agency (DARPA) first offered a $1 million prize in its 2004 challenge. Although no team created an autonomous vehicle capable of completing the required 150-mile trek through the Mojave Desert, that event was a touchstone for focused autonomous-system development and testing.

These Metamoto test-suite results are from batched simulations of a parameterized 'vehicle runs red light' scenario (image: Metamoto).

The scorecard of current development projects includes many programs with closed-circuit and public-road testing. In the U.S., Ford is upping the number of Fusion Hybrid sedans in its autonomous test fleet and plans to start producing SAE Level 4 self-driving vehicles in 2021, while General Motors will increase its autonomous-driving test fleet of Chevrolet Bolt electric cars from 50 to 180.

Honda recently announced plans for highly automated, SAE Level 3 expressway-driving capability by 2020—with SAE Level 4 capability targeted for 2025. Toyota’s Collaborative Safety Research Center recently launched 11 research projects for autonomous and connected-vehicle technologies. Meanwhile, BMW, FCA, Intel and machine-vision specialist Mobileye earlier this year announced a cooperative program to develop a scalable SAE Level 3 to Level 4/5 platform. Other automakers and suppliers are also in advanced autonomous-development programs.

Simulation and brainpower

Consider just one type of testing conducted on advanced driver-assistance systems (ADAS), the forerunner technologies now being integrated for autonomous operation. Current use-case tests for ADAS verify specific actions, such as the driver receiving a lane-departure warning when the vehicle drifts out of its traffic lane.
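In practice, a use-case test of this kind reduces to a pass/fail check of a specific stimulus and response. The snippet below is a generic, minimal sketch of such a check, not Mobis or industry test code; the function name, the signal names, and the 0.3-m offset threshold are all assumptions.

def lane_departure_warning(lateral_offset_m, turn_signal_on, warn_threshold_m=0.3):
    """Return True if a lane-departure warning should be issued.

    A warning fires when the vehicle drifts farther than the threshold
    from the lane center without the turn signal active.
    """
    return abs(lateral_offset_m) > warn_threshold_m and not turn_signal_on

def test_warning_on_unsignaled_drift():
    # Use case: unintended drift out of the lane must produce a warning.
    assert lane_departure_warning(lateral_offset_m=0.5, turn_signal_on=False)

def test_no_warning_on_signaled_lane_change():
    # Use case: an intentional, signaled lane change must not warn.
    assert not lane_departure_warning(lateral_offset_m=0.5, turn_signal_on=True)

# Run the checks directly (or collect them with pytest).
test_warning_on_unsignaled_drift()
test_no_warning_on_signaled_lane_change()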

Jay Gromaski, test and analysis manager for Mobis Autonomous Vehicle Research Group, evaluates the base platform of the A646 simulator being built by Cruden in the Netherlands to Mobis specifications.

“With artificial intelligence (AI) or systems with a high degree of complexity, it’s a completely different game,” said Agnew. Autonomous-driving scenarios can be affected by seemingly small variations. For instance, an autonomously driven vehicle’s sensor system can detect visibly moving legs on a pedestrian dummy. But the scenario changes if the dummy’s legs are covered by an overcoat and that clothing confounds sensor detection. The bottom line: developers of an autonomous system must strive to design it to recognize and respond correctly to every possible driving scenario.

“The auto industry works in a use-case environment, but we’re in new territory with autonomous technologies. While use-case testing still has a vital role, we also need to adapt how we test,” said Agnew, stressing the need “to predict as much as we can about how well the autonomous system is going to work.”

An ability to predict outcomes amid hundreds of thousands of variables is the primary reason simulation environments are assuming a major role in autonomous-technology development and testing.

This A646 motion simulator currently is being constructed in Amsterdam, Netherlands, by Cruden for delivery to Mobis North America in Plymouth, MI.

“Simulation environments will be absolutely crucial in validating the performance and consistency of automated driving systems, especially the AI components of the software stack, such as the neural nets,” according to Sam Abuelsamid, Senior Analyst-Energy for Navigant Research in Detroit.

Help with 'corner cases'

Like many other companies developing automated-driving technology, Mobis views collaboration as essential. In the near term, Agnew expects to ink a contract with Metamoto, a provider of purpose-built, scalable simulation. Metamoto’s simulation offering will enable engineers to conduct millions of daily tests on autonomous systems, “while intelligently exploring parameter spaces and performance boundaries across relevant edge cases,” according to Chad Partridge, CEO of the Silicon Valley startup firm.
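Partridge’s description of “intelligently exploring parameter spaces” boils down to running one scenario template many times with varied parameters and scoring each run, as in the batched ‘vehicle runs red light’ results pictured above. The sketch below illustrates only the general pattern; the parameter names, the run_scenario stub, and the pass criterion are assumptions, not Metamoto’s API.

import itertools
import random

def run_scenario(ego_speed_mps, crossing_speed_mps, trigger_distance_m):
    """Stand-in for a simulator call: returns the minimum separation (m)
    between the ego vehicle and a car that runs a red light.

    A real simulation engine would go here; this stub just injects noise
    so the batch loop below has something to score.
    """
    time_to_conflict = trigger_distance_m / max(ego_speed_mps, 0.1)
    return max(0.0, 20.0 - crossing_speed_mps * time_to_conflict
               + random.uniform(-2.0, 2.0))

# Parameter grid for the 'vehicle runs red light' scenario template.
ego_speeds = [10.0, 15.0, 20.0]          # m/s
crossing_speeds = [5.0, 10.0, 15.0]      # m/s
trigger_distances = [20.0, 40.0, 60.0]   # m from the intersection

results = []
for ego, crossing, dist in itertools.product(ego_speeds, crossing_speeds,
                                             trigger_distances):
    clearance = run_scenario(ego, crossing, dist)
    results.append({"ego": ego, "crossing": crossing, "dist": dist,
                    "clearance_m": clearance,
                    "passed": clearance > 1.0})  # assumed pass criterion

failures = [r for r in results if not r["passed"]]
print(f"{len(failures)} of {len(results)} parameter combinations failed")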

Simulation software is vital for ironing out the "corner cases"—situations that are nearly impossible to plan for but could happen. If engineers want to alter the autonomous system, such as the AI design or the sensor architecture, the simulation environment is a risk-free way to evaluate the effect of a change prior to targeted physical testing.

“Simulation software also can be used to validate numerous different aspects of the technology much faster and less expensively than real-world testing,” according to Michael Ramsey, Research Director of Gartner, Inc., a Stamford, Connecticut-headquartered research and advisory company.

Simulation on the move

Workstation simulation environments provide a two-dimensional interface and are especially helpful in determining worst-case scenarios. Mobis engineers and research volunteers, however, will experience an immersive environment of replicated roadways and vehicle movements inside a new motion simulator being custom-built by Cruden of the Netherlands. The 6-degrees-of-freedom motion simulator is expected to be operational at Mobis’ Michigan facility in April 2018. “We didn’t feel that our static simulation workstations could provide us with comprehensive human-machine interface (HMI) data,” Agnew explained.

Cameras and sensors inside the motion simulator will enable engineers to observe and collect data on a driver falling asleep at the wheel. This information will complement in-vehicle video recordings from university research programs as well as other studies on drowsy drivers. “We’re interested in what happens to the accelerator pedal and other in-vehicle controls while the driver is asleep, as well as what can happen if the driver wakes up and attempts to regain control of the vehicle after DDREM activation,” said Agnew.
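The kind of question Agnew raises (what the accelerator pedal does while the driver is asleep) becomes a signal-slicing task once the simulator logs exist. A minimal sketch is below, assuming hypothetical channel names, a 10-Hz logging rate, and a 3-second eye-closure criterion; none of these details come from Mobis.

SAMPLE_RATE_HZ = 10  # assumed logging rate for the simulator channels

def sleep_onset_index(eye_closed, min_duration_s=3.0):
    """Return the sample index where a sustained eye-closure episode starts,
    or None if no episode lasts at least min_duration_s."""
    needed = int(min_duration_s * SAMPLE_RATE_HZ)
    run_start, run_len = None, 0
    for i, closed in enumerate(eye_closed):
        if closed:
            if run_len == 0:
                run_start = i
            run_len += 1
            if run_len >= needed:
                return run_start
        else:
            run_len = 0
    return None

def pedal_trace_around(pedal_pct, onset, window_s=5.0):
    """Slice the accelerator-pedal channel around the sleep onset."""
    w = int(window_s * SAMPLE_RATE_HZ)
    return pedal_pct[max(0, onset - w):onset + w]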

While workstation simulation environments will provide a framework for Mobis engineers to develop the DDREM system and its operational algorithms, the motion simulator will generate data to be looped back to simulation as well as provide information that will be applied to endurance and use-case tests.

Mobis North America also will conduct tests on a demonstrator autonomous vehicle built in-house by Mobis technicians and fabricators. This vehicle is a duplicate of a prototype autonomous vehicle built by Mobis specialists in South Korea. “We expect to have our first prototype autonomous vehicle in the U.S. for DDREM development by the end of this year [2017],” said Agnew.

Phase-one track testing with the prototype demonstrator will involve autonomous lane shifts until the car detects the edge of the roadway and performs a safe stop. “Developing the autonomous capability will be easy in comparison to understanding fully how often the self-driving maneuvers will work and how often those maneuvers will fail,” Agnew said.
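Agnew’s closing point, knowing how often the maneuvers will work and how often they will fail, is ultimately a statistics problem: even a long run of flawless tests only bounds the failure rate. One standard way to express that bound (a general method, not one attributed to Mobis) is the exact binomial upper limit for the zero-failure case, sketched below.

def failure_rate_upper_bound(trials, failures=0, confidence=0.95):
    """One-sided upper confidence bound on the per-trial failure rate
    for the special case of zero observed failures (Clopper-Pearson).

    With 0 failures in n trials, the exact bound is 1 - (1 - confidence)**(1/n),
    roughly 3/n at 95% confidence (the 'rule of three').
    """
    if failures != 0:
        raise NotImplementedError("sketch covers only the zero-failure case")
    return 1.0 - (1.0 - confidence) ** (1.0 / trials)

# Example: 1,000 flawless safe-stop maneuvers still only show that the
# per-maneuver failure rate is below about 0.3% with 95% confidence.
print(f"{failure_rate_upper_bound(1000):.4%}")

Under those assumptions, 1,000 consecutive successful safe-stop maneuvers demonstrate only that the per-maneuver failure rate is below roughly 0.3%, which is why building confidence in the system, rather than building the capability itself, is the harder half of the problem Agnew describes.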