Dial ‘M’ for Mobility: Smartphone as Steering Wheel

Tesla’s controversial new app-based Smart Summon feature raises plenty of legal-liability and regulatory questions.

Like its remote start function, Tesla’s new Smart Summon feature uses a smartphone app to provide valet-like ability to owners. (Tesla)

For those with a passion for automated-vehicle innovations, late September and early October meant a trip to Twitter and YouTube, which were awash with videos from owners, detractors and supporters showcasing Tesla’s newest driver-assistance feature: Smart Summon. In September, Tesla issued its latest over-the-air software update, Version 10.0. The most controversial aspect of the new software release was Smart Summon, a feature that allows the owner to remotely pilot a parked vehicle to their location with the power of their finger.

On its website, Tesla describes Smart Summon as allowing “your car to drive to you or a location of your choosing, maneuvering around and stopping for objects as necessary. Smart Summon works with your Tesla app and your phone’s GPS to operate. You must be within approximately 200 feet (61 meters) of your car. Like Summon, Smart Summon is only intended for use in private parking lots and driveways. You are still responsible for your car and must monitor it and its surroundings at all times and be within your line of sight because it may not detect all obstacles. Be especially careful around quick-moving people, bicycles and cars.”

In essence, an owner sets a destination that is within 200 feet of the vehicle. They launch the smartphone app and hold their finger on the screen as their vehicle drives toward them or the destination. The vehicle will stop if, at any point in the transit, the owner removes his or her finger from the screen.
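Stated as logic, the behavior Tesla describes is a dead-man's switch: the car may move only while every condition holds, and any single failed check stops it. The sketch below is a minimal, purely illustrative rendering of that idea in Python; the function and the command strings are hypothetical, while the 61-meter radius comes from Tesla's own description. It is not Tesla's actual software.

```python
"""Minimal sketch of a hold-to-operate ("dead-man's switch") loop, as the
article describes Smart Summon's behavior: the car moves toward the target
only while the user's finger is on the screen, and it stops the instant the
finger lifts, an obstacle appears, or the ~200-ft (61 m) radius is exceeded.
Every name here is a hypothetical illustration, not Tesla's actual code."""

MAX_RANGE_M = 61.0  # approximate Smart Summon operating radius


def step(finger_down: bool, distance_to_phone_m: float,
         obstacle_detected: bool) -> str:
    """Return the drive command for one control tick."""
    if not finger_down:
        return "STOP"             # releasing the screen halts the vehicle
    if distance_to_phone_m > MAX_RANGE_M:
        return "STOP"             # outside the permitted operating radius
    if obstacle_detected:
        return "STOP"             # sensors report something in the path
    return "CREEP_TOWARD_TARGET"  # low-speed maneuvering continues


if __name__ == "__main__":
    # Finger held, in range, path clear -> vehicle keeps creeping.
    print(step(finger_down=True, distance_to_phone_m=25.0,
               obstacle_detected=False))   # CREEP_TOWARD_TARGET
    # Finger lifted -> immediate stop, regardless of anything else.
    print(step(finger_down=False, distance_to_phone_m=25.0,
               obstacle_detected=False))   # STOP
```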

Making a viral masterpiece

It took only minutes for the first Smart Summon videos to make their way to social media. This isn't surprising, considering that Elon Musk reported more than 550,000 Smart Summon uses in the first few days of launch. Over the coming weeks, drivers posted videos showing the full benefits of the feature. At the same time, the internet was ready to catch as many less-than-optimal moments as possible. These examples included videos titled “front bumper damage,” “ran into the side of [a] garage” and “didn’t go well.”

Many of these videos show owners operating the system in a manner contrary to the directions and instructions given by the company. Others show use in significantly congested areas, including a Walmart parking lot, where the reactions of startled pedestrians may seem humorous to the videographer but underscore the public impact of this unprecedented application of sensor technology.

And after the events unfold, the lawyers can clean up the details. As the number of videos increases and the property-damage claims mount, the hunt for the “person responsible” begins. The inevitable question of who is liable arises alongside the questions below surrounding regulation and agency action:

Am I permitted to use Smart Summon in my state?

As of early October, no jurisdiction had moved to bar use of the feature. In fact, the California Department of Motor Vehicles has determined that the Smart Summon system does not constitute autonomous technology, because the car is being operated by a person. The agency instead compares it to the Autopilot feature and therefore does not require Tesla to seek special permits.

What if I’m not following the instructions?

There clearly is a risk of misuse with this technology. In a Forbes article, David Zuby, chief research officer for the Insurance Institute for Highway Safety, stated, “One must suspect that the system is not reliably safe or the need for human supervision wouldn’t be necessary. The implied unreliability is the ‘troubling’ aspect of this feature because there’s already evidence that some people will not monitor the vehicle’s progress.” A failure to follow the instructions will clearly have implications for any claim arising from use of the Smart Summon feature.

What does NHTSA have to say about this?

The National Highway Traffic Safety Administration (NHTSA) issued a statement acknowledging that it is aware that Smart Summon doesn’t always function according to user expectations. In the statement, the agency noted that it is in contact with Tesla but has not opened a formal investigation. It also assured the public that the “agency will not hesitate to act if it finds evidence of a safety-related defect.”

Have other product safety groups weighed in?

As with any new feature, consumer-product safety organizations and other concerned parties are ready to react. Unsurprisingly, Consumer Reports has been highly critical of the release of Smart Summon. Senior policy analyst Ethan Douglas stated, “Tesla once again is promising ‘full self-driving’ but delivering far less, and now we’re seeing collisions. Tesla should stop beta-testing its cars on the general public by pushing out experimental features before they’re ready.”

Toward reasonable takeaways

All this raises a singular question: Should we be able to drive a vehicle with a single finger on a smart device? From the legal and design perspective, that question should prompt a few actions and thoughts as we continue to break new ground in the field of autonomy:

Recognize foreseeable misuse. After watching a few “Smart Summon Goes Wild” videos, you could likely sketch a Pareto chart of potential misuses (a simple tally of the kind illustrated after this list). In development, however, the more important task is to identify these misuses before launch. Once identified, an honest assessment of their severity and likelihood of occurrence should drive, or at least influence, design decisions.

Customer education is critical. One way to address misuses, once they’re identified and analyzed, is to enhance customer education. A simple read of the Tesla update delivers the message plainly: “You are still responsible for your car and must monitor it.” Yet consumers continue to ignore the warnings and operate the technology outside its intended scope. As we draft the language of warnings and educational components, we must consider different learning styles (auditory, visual, tactile and kinesthetic) and how reinforcement plays into the safer operation of products.

Definitions matter. This product provides an excellent opportunity for those in the legal and design communities to confer about what a truly “autonomous” system means. It also gives us a chance to educate the community while ensuring we stay within the bounds of state and federal regulations.
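As a purely illustrative companion to the first takeaway above, the short sketch below shows the mechanics of a Pareto tally: count each observed or anticipated misuse mode, sort by frequency and track the cumulative share, so design attention flows to the few modes that account for most incidents. The category names and counts are invented for illustration and do not come from any real dataset.

```python
"""Toy Pareto tally of hypothetical misuse categories, illustrating the
'recognize foreseeable misuse' takeaway. Categories and counts are invented."""

from collections import Counter

# Hypothetical misuse reports gathered during development or field monitoring.
reports = [
    "used on public road", "operator not watching", "operator not watching",
    "used beyond line of sight", "operator not watching", "used on public road",
    "congested lot with pedestrians", "operator not watching",
]

counts = Counter(reports).most_common()   # misuse modes sorted by frequency
total = sum(n for _, n in counts)

cumulative = 0
for mode, n in counts:
    cumulative += n
    print(f"{mode:32s} {n:3d}  {100 * cumulative / total:5.1f}% cumulative")
```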

Features such as Smart Summon will continue to elevate the discussion on what it means to be truly autonomous and how we will ultimately launch these features. With these thoughts in mind, we will be ready to meet the public challenge.

A self-described “recovering engineer” with 15 years of experience in automotive design and quality, Jennifer Dukarski is a Shareholder at Butzel Long, where she focuses her legal practice at the intersection of technology and communications, with an emphasis on emerging and disruptive issues that include cybersecurity and privacy, infotainment, vehicle safety and connected and autonomous vehicles.