GM Announces Door-To-Door Ultra Cruise ADAS

GM’s next advanced driver-assist system will leverage radar, cameras and lidar to permit hands-free operation in “95% of driving scenarios.”

At launch in 2023, GM expects Ultra Cruise to function on more than 2 million miles of U.S. and Canadian roads. (GM)

GM has announced the next generation of its hands-free Super Cruise advanced driver-assist system (ADAS), branding it “Ultra Cruise.” GM claims the system, expected to appear first on Cadillac models in 2023, will ultimately enable hands-free driving in “95% of driving scenarios” on all paved public roads in the U.S. and Canada. At launch in 2023, the system is expected to cover more than 2 million miles of roads, with the capacity to grow to more than 3.4 million miles.

Describing the system as a “door-to-door hands-free driving experience,” GM claims owners of Ultra Cruise-equipped vehicles will be able to travel hands-free on nearly every paved road, including highways, city and subdivision streets, and rural routes. GM noted that the system has been developed completely in-house (by collaborating teams based in Israel, the U.S., Canada, and Ireland) and that Ultra Cruise will co-exist with Super Cruise in the company’s lineup: Super Cruise will become more widely available on mainstream vehicles, while Ultra Cruise will be reserved for more premium entries.

GM detailed that Ultra Cruise is powered by a 5-nanometer, scalable compute architecture built on the company’s new Ultifi software platform and Vehicle Intelligence Platform. Ultra Cruise will be able to add features over time through over-the-air (OTA) updates and adds significant capabilities beyond Super Cruise, including parking in residential driveways, reacting to permanent traffic-control devices, following internal navigation routes, and supporting close-object avoidance, automatic and on-demand lane changes, and left- and right-hand turns.

The system’s sensor suite will leverage a combination of cameras and radars, with a lidar module integrated behind the windshield, to provide a 3D, 360° statistical representation of the environment surrounding the vehicle and “redundancies in critical areas.” A learning diagnostic system will automatically identify scenarios that need improvement, triggering data recordings in vehicles equipped with the service. These recordings will be processed through GM’s data ecosystem to continuously improve the system.
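GM did not describe how that learning diagnostic works internally; the sketch below is a hypothetical illustration of an event-triggered recorder of the kind implied, with every signal name, threshold, and data structure assumed rather than drawn from GM.

```python
# Hypothetical sketch of an event-triggered recording service, loosely modeled
# on the "learning diagnostic system" described above. All signal names,
# thresholds, and the upload queue are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class FrameSignals:
    perception_confidence: float  # 0.0-1.0, assumed fused confidence of the scene model
    driver_takeover: bool         # driver intervened while the system was active
    map_mismatch: bool            # perceived road geometry disagrees with the onboard map


class ScenarioRecorder:
    def __init__(self, confidence_floor: float = 0.6, pre_roll_s: float = 10.0):
        self.confidence_floor = confidence_floor
        self.pre_roll_s = pre_roll_s
        self.pending_uploads: list[dict] = []

    def on_frame(self, t: float, signals: FrameSignals) -> None:
        """Flag scenarios the system handled poorly and queue a recording window."""
        if (signals.driver_takeover
                or signals.map_mismatch
                or signals.perception_confidence < self.confidence_floor):
            # Capture a window around the event for later processing in the
            # back-end data ecosystem.
            self.pending_uploads.append({
                "start": t - self.pre_roll_s,
                "end": t + self.pre_roll_s,
                "reason": "takeover" if signals.driver_takeover else "low_confidence",
            })
```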

Building on the human-machine interface (HMI) currently used in Super Cruise (first introduced in 2017 on the Cadillac CT6), Ultra Cruise will carry over the illuminated steering wheel and driver-attention camera system. GM did not announce pricing or whether the system will be offered as a one-time option or a subscription, but the media introduction included a Q&A session with Jason Ditman, chief engineer in GM’s automated driving group responsible for Ultra Cruise. An edited version of that exchange follows:

Do you consider Ultra Cruise to be an SAE Level 2 system like Super Cruise?

Ultra Cruise will be an [SAE] Level 2 automated driving system, so it does require the driver to pay attention.

Does Ultra Cruise use the same sensor suite as Super Cruise?

The sensing architecture is all new. There are additional cameras and radars, and we are adding lidar to the vehicle. They’re not carryover sensors. They’re the second and third generation of cameras and radars, and of course the lidar is all new.

How will Ultra Cruise function as a door-to-door system?

I live in Michigan, and when we were commuting, not during COVID, but when we were commuting, I had a 64-mile trip to the [GM] Tech Center [in Warren, Michigan]. As I get to the edge of my driveway, once conditions are met, Ultra Cruise will engage automatically, so the steering wheel light bar will turn green. It will take me about a hundred yards to my first stop sign and it will stop at that stop sign. It will make the left-hand turn onto Grand River Avenue. It will then go to M52, take a right-hand turn for me at the stoplight, go over the railroad tracks automatically, merge onto 96, take 96 all the way to the Mound Road exit. Take me off the Mound Road exit through all those stoplights to the main gate at the proving ground. That is what an Ultra Cruise drive will be.

Can you provide any details on the Ultra Cruise compute platform?

We have a new state-of-the-art compute. Our prototype compute that we’re using in our vehicles that we’re doing software development on now takes up the entire third row in the cargo area of a [GMC] Yukon. For production, we’ve gotten that compute down to the size of two laptop computers stacked on top of one another, and we’ve maintained really good latency by scaling that.

How will Ultra Cruise interface with mapping data compared to Super Cruise?

We do have a map for Ultra Cruise, and it is different from Super Cruise. The Super Cruise map is generated from high-definition lidar scans of interstate roads in the U.S. and Canada. Expanding to what is ultimately 3.4 million miles is financially impractical with HD lidar scans. So we’ve got our own in-house technology that we’ve developed, where we are generating the map, and we have various sources of input data to that map generation technology. That map will then be loaded on every car.

You mention Ultra Cruise covering 95% of driving scenarios. What are the 5% not covered, and how will the driver be notified?

One example of the type of scenario [not covered] is a roundabout. A roundabout is a really complex maneuver. Sometimes they have single lanes and they’re not very complicated. Sometimes there are multiple lanes. Sometimes there are up to five or six different roads coming into a roundabout. So that is an example of a situation where we’d ask the driver to come back in, complete the maneuver, and then we’ll do an automatic re-engage.

We will know when these scenarios are coming because of what we’re able to see with the perception system and the information we have in the map. We’ve developed a new customer notification method using the steering-wheel light bar, similar to what we use for the escalations we have in Super Cruise. We call it a ‘non-urgent escalation,’ and it will ask the driver to take control of the wheel. Once the touch sensors on the wheel confirm the driver is in control, the light bar will turn off. The driver will complete the maneuver, and then Ultra Cruise will automatically re-engage once we’re beyond that maneuver and all the conditions are met.

When we start, this non-urgent escalation is going to be a function of the speed of the car coming into the maneuver. We want to ensure that whether we’re slowing down or whether the driver is slowing down, there’s adequate time to do a nice, gradual deceleration. The slower you’re going coming into the maneuver, the later the escalation will come up. Confirmation of control is grabbing the steering wheel, and we are carrying over the steering wheel from Super Cruise that has the touch sensors.
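Ditman’s two answers above outline a small hand-off flow: prompt the driver ahead of an unsupported maneuver, confirm hands on the wheel via the touch sensors, and re-engage automatically afterward, with the prompt timed by approach speed so any deceleration stays gradual. The following is a simplified sketch of that logic under assumed numbers (comfortable deceleration, driver reaction margin, maneuver entry speed); none of the constants or names are GM’s.

```python
# Simplified, hypothetical model of the "non-urgent escalation" flow and its
# speed-dependent timing. All constants are illustrative assumptions.
from enum import Enum, auto

COMFORT_DECEL = 2.0   # m/s^2, assumed gentle braking rate
REACTION_TIME = 2.0   # s, assumed margin for the driver to grip the wheel


class CruiseState(Enum):
    ENGAGED = auto()                 # hands-free, light bar green
    NON_URGENT_ESCALATION = auto()   # light bar prompts the driver to take the wheel
    DRIVER_CONTROL = auto()          # touch sensors confirm control, light bar off


def escalation_distance(speed_mps: float, maneuver_speed_mps: float) -> float:
    """Distance before the maneuver at which to prompt the driver.

    Slower approach speeds need less braking distance, so the prompt can come
    later (closer to the maneuver), matching the behavior described above.
    """
    braking = max(speed_mps**2 - maneuver_speed_mps**2, 0.0) / (2 * COMFORT_DECEL)
    return braking + speed_mps * REACTION_TIME


def step(state: CruiseState, dist_to_maneuver_m: float, speed_mps: float,
         hands_on_wheel: bool, maneuver_complete: bool,
         conditions_met: bool) -> CruiseState:
    """One tick of the hand-off state machine."""
    if state is CruiseState.ENGAGED:
        # 8 m/s (~30 km/h) is an assumed roundabout entry speed.
        if dist_to_maneuver_m <= escalation_distance(speed_mps, 8.0):
            return CruiseState.NON_URGENT_ESCALATION
    elif state is CruiseState.NON_URGENT_ESCALATION:
        if hands_on_wheel:
            return CruiseState.DRIVER_CONTROL
    elif state is CruiseState.DRIVER_CONTROL:
        if maneuver_complete and conditions_met:
            return CruiseState.ENGAGED  # automatic re-engagement
    return state
```

With these assumed numbers, a car approaching at highway speed would be prompted a couple of hundred meters out, while one already crawling toward the roundabout would be prompted much closer in, which is the speed-dependent timing Ditman describes.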

Is the accuracy of the maps that you’re making comparable to the maps that you use for Super Cruise?

We do rely on similar map data. However, we have a larger number of sensors that also observe the road. So when we combine the map accuracy with what our sensors see of the road geometry and the road markings, we’re still able to accurately place ourselves and drive the right nominal path.
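GM does not detail the localization math, but the general idea of weighting a coarser map against sharper live sensing can be illustrated with a textbook variance-weighted average; the function and numbers below are assumptions for illustration only, not GM’s method.

```python
# Minimal sketch of combining a map-based lane-center estimate with what the
# cameras see of the lane markings, as a variance-weighted average.

def fuse_lateral_offset(map_offset_m: float, map_var: float,
                        camera_offset_m: float, camera_var: float) -> float:
    """Weight each lateral-offset estimate by the inverse of its variance."""
    w_map = 1.0 / map_var
    w_cam = 1.0 / camera_var
    return (w_map * map_offset_m + w_cam * camera_offset_m) / (w_map + w_cam)


# Example: a coarser map (0.5 m standard deviation) plus a sharper camera
# measurement (0.1 m standard deviation) still yields a usable estimate that
# leans on the more accurate source.
fused = fuse_lateral_offset(map_offset_m=0.4, map_var=0.25,
                            camera_offset_m=0.1, camera_var=0.01)
```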

What adaptations were made to maintain Level 2 automation with the additional functions compared to Super Cruise?

This still requires the driver to maintain attention, which is why we’ve carried over the driver monitoring system. However, we do need to modify the driver monitoring system slightly because we’re going to be taking turns. The steering wheel spokes will block the camera, so we have to change the system to account for that. The other thing is that, as a normal driver, when you’re going to take a right- or a left-hand turn, you will naturally look right or left. So we’ve had to adapt that system to also handle a gaze that’s not directly down the front of the road but is looking off to the side, because you’re trying to see where you’re going.
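As an illustration of the kind of adaptation Ditman describes, the sketch below widens an assumed acceptable gaze window toward the side of an upcoming turn; the angles, names, and logic are invented for the example, not GM values.

```python
# Hypothetical illustration of adapting an attention check for upcoming turns:
# when the route calls for a left or right turn, the acceptable gaze window
# widens toward that side. All values are assumptions.
from typing import Optional

FORWARD_WINDOW_DEG = (-20.0, 20.0)  # assumed acceptable gaze yaw range, straight driving
TURN_EXTENSION_DEG = 60.0           # assumed extra allowance toward the turn side


def gaze_is_attentive(gaze_yaw_deg: float, upcoming_turn: Optional[str]) -> bool:
    """Return True if the driver's gaze counts as attentive.

    gaze_yaw_deg: estimated gaze yaw, negative = left, positive = right.
    upcoming_turn: "left", "right", or None when driving straight.
    """
    lo, hi = FORWARD_WINDOW_DEG
    if upcoming_turn == "left":
        lo -= TURN_EXTENSION_DEG   # looking left through the turn is expected
    elif upcoming_turn == "right":
        hi += TURN_EXTENSION_DEG   # looking right through the turn is expected
    return lo <= gaze_yaw_deg <= hi
```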

Will the system account for various road conditions?

Apart from the radar, these are optical sensors. Any visual system will struggle in heavy rain or heavy snow, but that’s why we have the driver monitoring system and a means of bringing the customer back in if we need to. If all of a sudden it starts raining and we can’t see, we’ll bring the customer back in. We’ve also built in a nice feature: if a sensor gets blocked in inclement weather, snow or ice, we’ll notify the driver that the sensor is blocked.

We will have mechanisms in place to clean the cameras that are mounted on the car for situations where, let’s say, you’re following a car on a wet road and there’s some road grime getting thrown up by that other car. Some of those sensors are in locations where there can be buildup on the lens, and we’d have cleaning that we’d either prompt the driver to trigger with a button or that would run automatically.
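The behavior described here, detecting a blocked or dirty sensor, notifying the driver, and either prompting a manual clean or cleaning automatically, can be summarized in a short decision routine; the sketch below is an assumption-laden illustration, not GM’s software.

```python
# Hypothetical sketch of the blocked-sensor handling described above: detect a
# blockage, tell the driver which sensor is affected, and either prompt for a
# manual clean or trigger an automatic wash. Enum values and thresholds are
# illustrative only.
from enum import Enum, auto


class CleaningAction(Enum):
    NONE = auto()
    PROMPT_DRIVER = auto()  # ask the driver to press the cleaning button
    AUTO_CLEAN = auto()     # fire the washer automatically


def handle_sensor_status(sensor_name: str, blockage_level: float,
                         has_auto_cleaner: bool, notify) -> CleaningAction:
    """Decide how to respond to lens buildup or weather blockage on a sensor.

    blockage_level: 0.0 (clear) to 1.0 (fully obscured), an assumed metric.
    notify: callback used to surface a message to the driver.
    """
    if blockage_level < 0.3:
        return CleaningAction.NONE
    notify(f"{sensor_name} is partially blocked")
    if has_auto_cleaner:
        return CleaningAction.AUTO_CLEAN
    return CleaningAction.PROMPT_DRIVER
```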