The ‘Framework’ for AV Development
The U.S. Department of Transportation establishes structure to advance collaborative road/AV testing.
Over the last 100 years, the transportation industry has developed primarily around human operators. For example, roadway signs were designed to be easily seen, read and interpreted by humans. As vehicle-automation technologies evolve, however, driving responsibilities are beginning to shift from the human operator to the automobile. The industry is rapidly moving toward vehicles equipped with Automated Driving System (ADS) technologies, but a key concern for the eventual mass deployment of ADS-equipped vehicles is earning public confidence that ADS technology can safely and reliably operate in a mixed environment of automated and human drivers.
There is also the expectation that ADS-equipped vehicles will perform as well as or better than their human counterparts.
In this vein, ADS developers and the Infrastructure Owner-Operators (IOOs) that manage the roadways share an interest in the safe and efficient operation of automated vehicles. The U.S. Department of Transportation's (DOT's) Federal Highway Administration (FHWA) launched its Testing and Pilot Design, Development and Evaluation Framework ("the Framework") project to foster collaboration between ADS developers and IOOs. The goal was to develop a collaborative framework for use in ADS test and pilot programs, the outcomes of which can benefit both ADS and infrastructure entities.
Collaboration, not regulation
The successful operation of ADSs on U.S. roadways is heavily dependent upon the ADS being able to reliably interpret and navigate its surrounding environment. However, many challenges remain: navigating complex traffic scenarios, adapting to adverse weather and identifying the infrastructure features that facilitate ADS navigation, to name a few. Given that ADS developers and roadway stakeholders hold diverse but complementary visions of the testing and evaluation needs, safe and efficient deployment is contingent on productive industry-wide collaboration.
The FHWA Framework for successful testing and evaluation of ADS and roadway features emphasizes collaboration among the ADS and infrastructure organizations (see https://ops.fhwa.dot.gov/publications/fhwahop21012/fhwahop21012.pdf). The Framework is technology-independent and consists of a broad set of processes and examples that advise without prescribing regulations or policy. Instead, the Framework adopts the notion that validation and testing of ADS technologies and various infrastructure features is essential to paving the way for safe deployment of ADS-equipped vehicles into the road network.
Collaborative testing promises to resolve these challenges, resulting in safe and efficient deployment of ADS. Overall, these efforts will further advance the deployment of ADS-equipped vehicles onto roadways throughout the U.S., with desired outcomes benefiting both ADS and infrastructure entities.
Consider a typical state department of transportation (state DOT) operations scenario in which a busy highway is being resurfaced or expanded. Presently, the procedures and standards outlined by the state DOT are designed to maximize the safety of the crew within the work zone while still ensuring mobility for drivers around the work zone. For instance, one direction of travel is shifted from the left lane over the centerline into a lane typically used for the opposite direction of travel (see Figure 1).
Can anyone be certain that the automated vehicles of tomorrow will be able to operate safely through a work zone? The answer lies in the quality and scope of testing. Both the developers of the ADS and the workers within the DOT have a keen interest in how this testing is administered and evaluated.
FRAMEWORK OVERVIEW
The Framework was developed with extensive engagement and input from both ADS and roadway stakeholders, including automotive OEMs, suppliers, technology companies and state, federal and regional government entities. Their combined perspectives are far more likely to achieve safe and efficient testing and deployment, more rapidly and with fewer errors, than any of these entities working alone.
As shown in Figure 2, the Framework addresses nine overarching themes (described below) that support the four key test phases. The Framework provides contextual examples, real-world lessons learned and other considerations, which will vary across different tests and disciplines (e.g., ADS developers, IOOs, first responders, fleet operators). Each party has a responsibility to engage in the testing and evaluation process for automated driving systems, and the Framework enables collaborative support toward these shared goals.
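To make the structure concrete, the sketch below is a hypothetical illustration in Python (not drawn from the Framework document itself). It enumerates the nine themes described below and the four test phases, and shows one way a collaborative test program might track which themes have been addressed in each phase; the TestProgram class and its methods are illustrative assumptions.

```python
from dataclasses import dataclass, field

# The four test phases named in the Framework.
TEST_PHASES = ["Pre-Test", "Test Definition", "Test Execution", "Post-Test"]

# The nine overarching themes described in this article.
THEMES = [
    "Collaboration",
    "Common Ground",
    "Test Logistics",
    "Institutional/Organizational Issues",
    "Roles and Responsibilities",
    "Plans",
    "Sharing Opportunities",
    "A New Driver",
    "Success Factors",
]

@dataclass
class TestProgram:
    """Tracks which themes a collaborative test program has addressed in each phase."""
    name: str
    coverage: dict = field(default_factory=lambda: {p: set() for p in TEST_PHASES})

    def record(self, phase: str, theme: str) -> None:
        """Note that a theme has been discussed by the stakeholders in a given phase."""
        if phase not in TEST_PHASES or theme not in THEMES:
            raise ValueError(f"Unknown phase or theme: {phase!r}, {theme!r}")
        self.coverage[phase].add(theme)

    def gaps(self, phase: str) -> set:
        """Themes not yet considered in a given phase."""
        return set(THEMES) - self.coverage[phase]

# Hypothetical usage for a work-zone pilot.
program = TestProgram("Work-zone lane-shift pilot")
program.record("Pre-Test", "Collaboration")
print(sorted(program.gaps("Pre-Test")))
```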
Collaboration
Collaboration between ADS and IOO stakeholders is critical for successful testing and evaluation. Stakeholder collaboration allows for early detection and resolution of ADS issues related to technical, organizational, and strategic test implementations. Open and frequent interactions lead to improved test outcomes.
As an example, the Pennsylvania DOT (PennDOT) recognized the need to prepare for the mass deployment of ADS across the state and assembled nine partners (Blazina, 2019; Paez, 2019) to work on changing infrastructure to support ADS in work zones, using distinctive coatings on lane markings and barrels to aid detection by the ADS. In addition, the project developed advanced mapping and communications systems for safe ADS navigation at and around 17 different work zones. Prior to testing in active work zones, the partnership conducted validation in virtual environments, then at a test track. PennDOT will have a better understanding of how ADSs operate in certain conditions, and thus will be better equipped to fulfill its mandate of keeping drivers safe and informed.
Common Ground
Common Ground refers to creating a common or shared working environment so that ADS and IOO stakeholders fully understand each other. When executing ADS/roadway tests, all parties will have clearly defined expectations, outcomes and success criteria, including common goals and benefits, common terminology, and common metrics and measures.
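As a hypothetical sketch of what common metrics and measures might look like in practice, a shared metric could be captured in a small, jointly agreed record. The metric name, target value and fields below are assumptions for illustration, not Framework requirements.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SharedMetric:
    """A metric both ADS and IOO stakeholders agree to define and measure the same way."""
    name: str
    unit: str
    definition: str
    target: float            # agreed success threshold
    higher_is_better: bool

    def meets_target(self, observed: float) -> bool:
        """Apply the agreed success criterion to an observed value."""
        return observed >= self.target if self.higher_is_better else observed <= self.target

# Hypothetical example; actual metrics would be negotiated by the stakeholders.
lane_marking_detection = SharedMetric(
    name="work_zone_lane_marking_detection_rate",
    unit="percent",
    definition="Share of temporary lane markings correctly detected by the ADS",
    target=99.0,
    higher_is_better=True,
)
print(lane_marking_detection.meets_target(99.4))  # True
```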
Test Logistics
Test Logistics refers to what to test, how to test, and where to test. This includes development of test scenarios, testing methodologies and the test environment.
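One possible way to record such scenarios, methodologies and environments is sketched below purely as an illustration; the scenario ID, fields and environment values are assumptions, not prescribed by the Framework.

```python
from dataclasses import dataclass
from enum import Enum

class TestEnvironment(Enum):
    SIMULATION = "virtual environment"
    TEST_TRACK = "closed test track"
    PUBLIC_ROAD = "public roadway"

@dataclass
class TestScenario:
    """One entry in a collaboratively defined test matrix."""
    scenario_id: str
    description: str
    methodology: str                 # e.g., scripted runs, naturalistic observation
    environment: TestEnvironment
    infrastructure_features: list    # roadway features the scenario exercises

# Hypothetical work-zone scenario, echoing the lane-shift example in Figure 1.
lane_shift = TestScenario(
    scenario_id="WZ-001",
    description="Traffic shifted across the centerline around an active work zone",
    methodology="Scripted runs at three approach speeds, day and night",
    environment=TestEnvironment.TEST_TRACK,
    infrastructure_features=["temporary lane markings", "channelizing barrels"],
)
print(lane_shift.environment.value)
```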
Institutional/Organizational Issues
Having organizational experts from both the ADS and IOO organizations participate early and throughout the test phases will greatly aid in navigating any difficulties. The Framework adopts the premise that, for ADS/roadway testing and evaluation, the safety of all road users is the greatest priority. Under this premise, state and local regulatory policies must be developed, which requires that policymakers be well-informed on relevant topics and kept up to date on developments.
Roles and Responsibilities
In the process of ADS/roadway testing and evaluation, it is important to identify who from the various organizations needs to participate, which roles within the organizations are needed and when (i.e., in which test phases) they need to participate.
Plans
For a collaborative environment to exist, stakeholder participation in the test-design plan, data-collection plan and evaluation plan is beneficial.
One example is Waymo’s First Responder Engagement Plan. The objective of this plan is to provide first responders with the knowledge they need to safely identify, approach and interact with an ADS-equipped vehicle in an emergency scenario.
Sharing Opportunities
Data is a key issue that requires thorough discussion between IOO and ADS stakeholders to avoid challenges (e.g., proprietary data/information, use of data). Resource sharing includes sharing of skills and expertise in addition to sharing of information and existing data.
An example is the Arizona IAM Consortium Collaborative Data Sharing. Arizona’s IAM consortium is leveraging existing infrastructure to collect performance data on public roads.
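A minimal sketch of how a data-sharing agreement might classify datasets up front is shown below; the dataset names and categories are hypothetical, not taken from the Framework or the Arizona consortium.

```python
from enum import Enum, auto

class DataClass(Enum):
    PROPRIETARY = auto()       # stays with the originating organization
    SHARED_RAW = auto()        # exchanged under the data-sharing agreement
    SHARED_AGGREGATE = auto()  # only summary statistics are exchanged

# Hypothetical classification agreed before testing begins, so that data-use
# questions are settled up front rather than after collection.
SHARING_POLICY = {
    "ads_perception_logs": DataClass.PROPRIETARY,
    "work_zone_geometry": DataClass.SHARED_RAW,
    "detection_rate_by_scenario": DataClass.SHARED_AGGREGATE,
}

def can_share_raw(dataset: str) -> bool:
    """Check whether raw records for a dataset may be exchanged between partners."""
    return SHARING_POLICY.get(dataset) == DataClass.SHARED_RAW

print(can_share_raw("work_zone_geometry"))  # True
```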
A New Driver
The new driver of tomorrow will be the vehicle. The Framework provides examples and scenarios to those conducting tests and pilots, helping them prepare for a safe and functional road network in which, at first, human and ADS drivers share the road, and then aiding the transition of the network into the ‘new normal.’
Success Factors
The most critical success factors include enhancing technical maturity; understanding the ADS, the roadway test elements and the process; stakeholder engagement and collaboration; and ongoing public communications. The Framework assists the ADS and IOO participants in defining test success factors within each test phase.
FRAMEWORK APPLICATION
The activities for successful collaborative testing and evaluation can be categorized into four test phases, shown in Figure 3.
Pre-Test Phase
The output of the Pre-Test phase is a clearly identified problem statement based on the internal needs of both the ADS developers and the industry. Program risks, including operational, technical, data, legal and financial risks, should be identified during this phase.
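A simple, hypothetical way to capture such a risk register, using the five risk categories named above, is sketched below; the scoring scheme and example entries are assumptions for illustration only.

```python
from dataclasses import dataclass
from enum import Enum

class RiskCategory(Enum):
    OPERATIONAL = "operational"
    TECHNICAL = "technical"
    DATA = "data"
    LEGAL = "legal"
    FINANCIAL = "financial"

@dataclass
class Risk:
    """One entry in a jointly maintained program risk register."""
    category: RiskCategory
    description: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (minor) .. 5 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

# Hypothetical entries; a real register would be filled in jointly by ADS and IOO staff.
register = [
    Risk(RiskCategory.TECHNICAL, "Temporary markings not detected at night", 3, 4),
    Risk(RiskCategory.DATA, "Unclear ownership of shared sensor logs", 2, 3),
]
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(risk.category.value, risk.score, risk.description)
```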
Test Definition Phase
The objective of the Test Definition phase is to conduct activities which help define the technical and data facets of a collaborative test program. Having clearly-defined success criteria ensures an overall higher test quality.
Test Execution Phase
In the Test Execution phase, both the technical and data facets of the collaborative ADS and roadway testing proceed as defined in the Test Plan. The focus is on efficient collection of performance data. After completion, ADS performance data is gathered and reviewed to determine whether the system is ready to advance to the next phase. For example, if a performance failure at night is considered critical by the stakeholders, new tests will likely be scheduled after updates are made.
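The kind of gate such a review implies can be sketched minimally as follows, with stakeholders flagging which failures are critical and advancement waiting until they are resolved; the class names and example results are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TestResult:
    """Outcome of one executed scenario, as reviewed by the stakeholders."""
    scenario_id: str
    passed: bool
    critical: bool   # stakeholders flag which failures block advancement

def ready_for_next_phase(results: list) -> bool:
    """Advance only if no stakeholder-designated critical failure remains open."""
    return not any(r.critical and not r.passed for r in results)

# Hypothetical review: the night-time work-zone run failed and was deemed critical,
# so new tests would be scheduled after the ADS is updated.
results = [
    TestResult("WZ-001-day", passed=True, critical=True),
    TestResult("WZ-001-night", passed=False, critical=True),
]
print(ready_for_next_phase(results))  # False -> schedule retests after updates
```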
Post-Test Phase
In the Post-Test phase, stakeholders review data insights, store data, discuss lessons learned and evaluate the processes used, which informs future programs. This collaborative effort directly leads to a higher level of confidence among the stakeholders, and by extension the public, that ADS-equipped vehicles can be safely deployed among existing human drivers.
ADVANCING SUCCESSFUL OUTCOMES
ADS technologies traditionally have been developed and tested with little or no discussion with roadway stakeholders.
However, the Collaborative Research Framework for Automated Driving System Developers and Infrastructure Owners and Operators addresses collaboration from multiple perspectives throughout the test lifespan. It provides examples of where collaboration has yielded successful outcomes, the benefits of collaboration and information on how and when to collaborate.
Successful collaboration among stakeholders will allow for early detection and resolution of ADS and infrastructure issues related to technical, organizational and strategic test implementations. Ultimately, this helps focus the efforts of both the ADS development teams and the IOO stakeholders working toward the common goal of safe and efficient deployment of autonomous vehicles.
This article was written by the U.S. Dept. of Transportation’s Federal Highway Administration (FHWA).