Tesla’s FSD Recall Impacts AV Industry

The automaker’s recall of its Full Self-Driving Beta leaves a significant dent in automated driving’s credibility.

Apart from announcing Tesla’s voluntary recall of its Full Self-Driving Beta software, the NHTSA also is investigating a spate of crashes involving Teslas using the company’s standard Autopilot driver-assistance system. (YouTube)

On February 16, 2023, the National Highway Traffic Safety Administration announced that Tesla had voluntarily agreed to recall 362,758 Model S, Model X, Model 3 and Model Y vehicles – the entire parc of Tesla models fitted with the company’s Full Self-Driving (FSD) Beta software. The NHTSA cited FSD’s failure to safely operate Tesla vehicles in a variety of common driving situations; many industry sources, meanwhile, contended the recall was proof that Tesla could no longer stay one step ahead of the sheriff regarding its insinuations about FSD’s capabilities.

Robotaxi developer Cruise Automation, which does not deploy its automated-driving software on vehicles sold to the public, said in late 2022 that its driverless vehicles had logged 500,000 miles “without any major incidents.” (Cruise Automation)

The tension over Tesla’s automated-driving features had been building. In the summer of 2021, the NHTSA began investigating several crashes in which Teslas operating with the Autopilot ADAS system (standard on all models) struck parked emergency vehicles. A subsequent NHTSA report tallied 273 crashes involving Autopilot. Full Self-Driving, with its “Autosteer on city streets” function, was positioned as an even more advanced system, an inference that FSD’s $15,000 price seemingly supported.

In the wake of Tesla’s agreement with NHTSA to initiate a recall, the company created a support web page that specifically called out FSD as a system with SAE Level 2 driver-assist capability. “FSD Beta is an SAE Level 2 driver support feature that can provide steering and braking/acceleration support to the driver under certain operating limitations,” according to the support page. “With FSD Beta, as with all SAE Level 2 driver support features, the driver is responsible for operation of the vehicle whenever the feature is engaged and must constantly supervise the feature and intervene (e.g., steer, brake or accelerate) as needed to maintain safe operation of the vehicle.”

This despite the fact that many who paid for FSD Beta apparently believed the system offered a higher level of automation. Copious internet videos of blatant misuse of the system demonstrate that, if nothing else, Tesla’s communications about FSD to customers and prospective buyers have been muddled.

Results released in March 2023 from a AAA survey of 1,000 adults showed respondents markedly more distrustful of AVs, and more confused about them, than in previous years. In late February, a Tesla shareholder group initiated a class-action lawsuit alleging the company deliberately misled investors about FSD’s capabilities and safety risks. In an automated-driving development environment that is consolidating and transitioning, Tesla’s mixed messages about automated driving have the potential to further destabilize public perception.

Lessons from Silicon Valley

“Tesla has been playing the classic ‘fake-it-’til-you-make-it’ card, and reality is catching up to them,” Missy Cummings, a professor of mechanical engineering, electrical and computer engineering and computer science at George Mason University and a former senior safety advisor at the NHTSA, wrote in an email to SAE Media. Cummings, who Tesla CEO Elon Musk has claimed is biased against his company, doesn’t believe the NHTSA strong-armed Tesla into the FSD recall. Instead, she contends Tesla was “politely nudged” to take the action.

There’s blame to go around, according to Sam Abuelsamid, e-mobility principal analyst at Guidehouse Insights and a columnist for SAE Media Group. “NHTSA and the FTC are to blame for allowing Tesla to get this far,” he insisted. “As soon as Tesla started selling something called FSD in 2016, [the NHTSA] should have immediately put a stop to it until Tesla could validate something that matched the branding. While slightly less egregious, Autopilot shouldn’t have been allowed either,” Abuelsamid added.

The past year’s spate of driving-automation developers consolidating, downsizing or closing shop entirely, combined with Tesla’s FSD recall, proves that most AV developers simply have been on the wrong path from the start, said Michael DeKort, a former U.S. Dept. of Defense and aerospace engineer and the founder and CTO of AV-simulation developer Dactle. Almost all AV development has come with a degree of over-promise; Tesla is “just the most egregious offender,” DeKort asserted.

But Abuelsamid noted a fundamental distinction between Tesla and all other AV developers.

“We should separate Tesla from all the other companies developing automated driving systems,” he said, because Tesla is the only one “currently marketing anything to consumers on vehicles they can buy and that they control. They are also the only ones shipping a beta (more accurately an alpha at this stage of maturity) of a safety-critical system to consumers. Everyone else — Waymo, Cruise, Motional, Baidu, etc. — is keeping a tight control on their systems, logging all of the data and doing full-time monitoring of even driverless vehicles. Yes, those vehicles have issues,” Abuelsamid maintained, “but they are being tracked at all times.”

OTA as a ‘fix’

Tesla CEO Elon Musk onstage at the electric automaker’s 2023 Investor Day on March 1. (Tesla)

Tesla intends to address the NHTSA’s concerns about FSD’s functionality with an over-the-air (OTA) software update. But experts argue that a mere software “repair” of the system’s responses to a handful of specific driving situations won’t, and perhaps can’t, address foundational issues with FSD’s deeper, artificial intelligence (AI)-guided algorithms.

“Can [FSD’s recall-related problems] be fixed? Certainly,” said Abuelsamid. “Will they be adequately fixed? That remains to be seen.

“This is one of the challenges of an AI-first (and only) approach as opposed to something more deterministic,” he continued. “AI systems are probabilistic and make judgement calls. And like humans, the AI judgements are sometimes wrong. As we’ve seen with the recent wave of news around AI chatbots like ChatGPT, AI is often confidently wrong, providing a plausible sounding answer that is factually incorrect.”

The same is true of AI vision systems. Tesla’s goal of a generalized automated driving system that can go anywhere “is much more difficult to do if you start imposing guardrails — and runs counter to the entire business model. Nonetheless, they need those guardrails,” he said.

Might an OTA update be mere window dressing to satisfy regulators’ potentially simplistic understanding of the problem’s scope? Cummings isn’t convinced any substantive correction of FSD can be expeditious. DeKort agrees that addressing NHTSA’s issues with FSD is a heavy lift when considering Tesla’s wide scope of consumer-use operating domains.

“The problems can be fixed through an OTA,” Cummings said, “but I think realistically for the specific recall, it would take a couple of years of dedicated effort to make that happen. And Tesla said they fixed the problem of [crashing into stationary] first responders more than a year ago and clearly that is not the case.”

DeKort believes Tesla’s issues with FSD run much deeper than correcting the situations cited by NHTSA. “I'm sure [Tesla will] make some change,” DeKort said. “But like everyone else, it's not remotely possible to build an autonomous [SAE] Level 4 vehicle in the public domain. And it's even worse for Tesla because their sensor suite is so incompetent.”

Robust answers

Alphabet’s Waymo deploys its Waymo Driver automated-driving system in a variety of test vehicles, including Class 8 commercial vehicles. Though it boasts of millions of miles of accident-free driverless operation, the company has experienced sporadic bad outcomes in public-road testing. (Waymo)

Some experts believe Tesla’s situation with FSD can, however, inform better practices from both developers and regulators. DeKort laments that, to date, there is no universal standard for measuring AV behavior and performance, although Europe has a nascent one.

And “regulators need to grow some backbone where Tesla and any other automaker that wants to follow the same path are concerned,” said Abuelsamid. “They need to immediately step in and stop this sort of misbranding [of AV functionalities] and they should also institute a ban on public beta testing of safety-critical systems. Tesla has been reckless in its deployment and development of FSD, but regulators have been asleep at the wheel for more than six years.”

Over-promise of AV capabilities “is a problem,” said Cummings. “The state of California has made this illegal, which should be a national policy. People are dying because they put too much faith in these systems and such false advertising is a major contributor.”

DeKort, meanwhile, has for years insisted that nearly all AV developers are on the wrong strategic path. No developer “will ever get close” to SAE Level 4 or Level 5 operation using their current strategies, he argued. He said the answer lies in three areas: embracing and leveraging artificial general intelligence (AGI), shifting the emphasis almost entirely to simulation testing rather than real-world testing on public roads, and adopting U.S. DoD/FAA-level simulation practices rather than gaming-based simulation engines.