When Autonomy Underperforms: The Evolving Liability Model

As autonomy-related accidents mount, expect legal liability to shift from people to products.

Scrutiny of public-road testing of autonomous vehicles reached a new intensity after a March 2018 incident in which an Uber-owned test vehicle struck and killed a pedestrian in Tempe, Arizona. (Image: ABC15 via YouTube)

When humans drive a vehicle, liability rules are well established: outcomes depend largely on whether the state has adopted no-fault liability and on whether a product defect was at the root of the incident. When a human driver is at fault, either no-fault liability or traditional negligence usually controls.

But what happens when the vehicle is the “driver”?

The answer: product liability.

As fatalities and injuries related to automated driving mount, plaintiffs’ lawyers likely will shift from traditional negligence suits to product-liability suits, pursuing one or more of the following theories:

Negligence-based product liability: The plaintiff must establish that a manufacturer breached its duty to use reasonable or ordinary care under the circumstances in the planning and/or designing of the product so that the product was reasonably safe for its intended purpose.

Design defect: The plaintiff must prove that the manufacturer’s process or decisions did not properly weigh alternatives and evaluate tradeoffs to develop a reasonably safe product.

Implied warranty: The plaintiff must prove injury caused by a defect in the product, attributable to its manufacture, that made it not reasonably fit for its intended, anticipated or reasonably foreseeable uses.

Manufacturing defect: The plaintiff must show that a faulty manufacturing process delivered a product that does not meet the specifications or standards set for that component or vehicle.

Duty to warn: Manufacturers that know or ought to know of a danger inherent in a product, or in the use for which the product is intended, have a duty to give adequate warnings about that product.

So, when the vehicle is the driver, how does the analysis change?

Here come the plaintiffs’ attorneys

May 2016 Tesla Model S Fatality

This fatal accident occurred while the vehicle’s driver had Tesla’s “Autopilot” driver-assist functions engaged. Root-cause analysis showed that the Autopilot sensors failed to distinguish a white tractor-trailer crossing a divided highway against the backdrop of a bright sky. The Tesla struck the tractor-trailer as the truck crossed an intersection not controlled by a traffic signal.

Following the accident, the National Highway Traffic Safety Administration (NHTSA) initiated a Preliminary Evaluation (PE 16-007) to determine whether the system acted according to its design parameters and expectations, or whether the system experienced a defect. On January 19, 2017, NHTSA closed the file, finding that a “safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted.”

December 7, 2017 Cruise Automation Chevrolet Bolt

In this incident, a motorcyclist claimed the Bolt initiated a lane change, then abruptly returned to its previous lane, now occupied by the motorcyclist. In contrast, GM’s report to the California DMV stated that the Bolt returned to its lane when a nearby vehicle decelerated, leaving an insufficient gap for the originally intended lane change. As the Bolt returned to its initial lane, the motorcyclist, who was lane-splitting (riding between two lanes, which is legal in California), “wobbled and fell over.”

On January 22, 2018, the motorcyclist filed a personal-injury lawsuit against GM alleging traditional negligence and suggesting that GM owes other drivers on the road a duty of care “in having its Self-Driving Vehicle operate in a manner in which it obeys the traffic laws and regulations.” The plaintiff contends GM “breached that duty in that its Self-Driving Vehicle drove in such a negligent manner that it veered into an adjacent lane of traffic without regard for a passing motorist, striking Mr. Nilsson and knocking him to the ground.”

March 18, 2018 Uber/Volvo XC90 Fatality

In March, an Uber-owned Volvo XC90 involved in testing struck and killed a woman in Tempe, Arizona. The vehicle was in fully autonomous mode when it struck the woman as she walked her bike across the street. As of late April, the root cause had yet to be established, but reporting suggested that Uber “disabled the standard collision-avoidance technology in the Volvo SUV.” Commentators questioned whether a sensor may have contributed to the failed detection or whether the driving algorithms were implicated. The matter was settled less than a week after the woman’s family retained an attorney.

March 23, 2018 Tesla Model X Fatality

Also in March, a Tesla Model X operating in Autopilot mode was involved in a fatal crash and vehicle fire. Tesla, since removed from the NTSB investigation, placed the blame on the driver and a road divider. The driver’s family retained an attorney.

Design decisions that bolster product safety

One of the largest emerging risks is potential exposure from a design defect. Future litigants likely will scrutinize the test protocols, algorithms and engineering decisions used to develop systems that make substantial safety decisions for users. Key to mitigating potential claims is adherence to a design and test plan that addresses these alternatives and tradeoffs while incorporating the legislative, executive and judicial constraints found in rules, regulations, laws and case law.

One way to weigh these alternatives is for AV developers to perform a more rigorous assessment of potential risks and failure modes. To incorporate the principles of product liability into a robust design failure mode and effects analysis (DFMEA), the following points should be considered (a minimal scoring sketch follows the list):

  • Identify alternative designs that could achieve the same function.
  • Assess how differences in conditions (weather, lighting, gender, age, etc.) might affect use of the product.
  • Review potentially relevant standards.
  • Reflect on newsworthy cases that provide lessons learned, particularly those that resulted in legal challenges.
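
To make that worksheet concrete, here is a minimal sketch of how such a DFMEA pass might be scored in code. The 1-10 rating scales and the classic severity x occurrence x detection risk priority number (RPN) follow standard FMEA convention, but every entry, name and value below is hypothetical and for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class FailureMode:
    """One hypothetical row of a DFMEA worksheet (illustrative values only)."""
    item: str          # component or function under analysis
    mode: str          # how it could fail
    severity: int      # 1 (negligible) to 10 (hazardous, no warning)
    occurrence: int    # 1 (remote) to 10 (very likely)
    detection: int     # 1 (almost certainly detected) to 10 (undetectable)
    alternatives: list = field(default_factory=list)  # designs considered

    @property
    def rpn(self) -> int:
        # Classic FMEA risk priority number: severity x occurrence x detection
        return self.severity * self.occurrence * self.detection

# Hypothetical entries loosely inspired by the incidents discussed above
worksheet = [
    FailureMode("camera perception", "white object lost against bright sky",
                severity=10, occurrence=4, detection=7,
                alternatives=["radar cross-check", "add lidar"]),
    FailureMode("lane-change planner", "abort returns into an occupied lane",
                severity=8, occurrence=3, detection=5,
                alternatives=["track lane-splitting traffic", "wider abort margin"]),
]

# Rank failure modes so the riskiest ones drive design reviews and testing
for fm in sorted(worksheet, key=lambda f: f.rpn, reverse=True):
    print(f"RPN {fm.rpn:4d}  {fm.item}: {fm.mode}")
```

Recording the considered alternatives alongside each failure mode matters for the design-defect theory above: it documents that tradeoffs were actually weighed.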

In sum, these concepts should be cascaded into a robust test plan that validates the system against the full range of demographic and environmental variables, as sketched below.
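
One way to build such a plan is to enumerate the variables and generate their full combinatorial test matrix, so that no combination of conditions is silently skipped. The sketch below assumes the variables can be listed as discrete values; the specific variables and values are illustrative, not drawn from any published protocol.

```python
from itertools import product

# Illustrative test variables; a real program would derive these from
# regulatory requirements, standards reviews and prior-incident data.
variables = {
    "weather":   ["clear", "rain", "fog", "snow"],
    "lighting":  ["day", "dusk", "night"],
    "road_user": ["adult pedestrian", "child", "cyclist", "motorcyclist"],
    "crossing":  ["signalized", "unsignalized", "mid-block"],
}

# Full cross-product: every combination becomes one test scenario,
# making coverage gaps explicit rather than accidental.
scenarios = [dict(zip(variables, combo)) for combo in product(*variables.values())]

print(f"{len(scenarios)} scenarios to validate")  # 4 * 3 * 4 * 3 = 144
print(scenarios[0])  # {'weather': 'clear', 'lighting': 'day', ...}
```

A documented matrix like this also creates a paper trail of which conditions were tested and which were consciously excluded, evidence that bears directly on the liability theories above.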




This article first appeared in the May 2018 issue of Autonomous Vehicle Engineering Magazine (Vol. 5 No. 5).
