New Mobility’s Mega-Mappers

Here Technologies’ maps contain detailed roadway information crucial to precise automated navigation. (Here Technologies)

A yellow sign on a mountain highway shows an S-shaped curve. This is a primitive map, and hardly a faithful representation of the road. Instead it delivers a simple signal to the driver: Get ready for turns.

Road cartography has evolved over centuries with a unifying purpose: to guide human beings from point A to point B. Complexity often gets in the way. “You don’t want too much detail,” says Wei Luo, formerly a product manager at Google Maps and now chief operating officer at DeepMap, a Palo Alto, California-based startup. “That can confuse people.”

At the same time, though, the cartographer counts on the map’s user to fill in many of the missing pieces—and respond to changes. After all, the user is a fellow human being. Maps, like language, are symbols that bridge human minds.

DeepMap’s data-point “view” of the world. (DeepMap)

New-age cartography for autonomy

But this is changing. The newest field of cartography—creating maps for autonomous vehicles—is designed for a different user: a software program. Unlike a person, the navigation program demands specifics—every squiggle, every raised curb, every passing lane, all of them calibrated by the centimeter. At the same time, and far more challenging, automated navigation must adapt to immediate unknowns. How should it provide guidance to the destination if a fallen tree lies in its path? While a human driver might swear under her breath and improvise, most software programs will require detailed guidance.

An entire industry is rising up to create this new breed of map, a fundamental technology for the nascent autonomous industry. After all, the purpose of the vehicle is to reach a destination. The map tells the vehicle where it is and how to get there; it is the AV’s connection to the physical world.

Creating these maps requires precise three-dimensional recording of every street and byway—itself no mean feat. But it also requires muscular layers of artificial intelligence (AI) to interpret what the vehicle encounters along the way and then to respond appropriately, often within a fraction of a second.

Map-developer Here Technologies envisions a variety of useful near-term services, such as guidance about parking availability, springing from its rich map data. (Here Technologies)

It’s a massive undertaking that feeds this growing field of research. Google’s Waymo, the industry’s AI behemoth, is developing maps for its autonomous fleets. It’s joined by a host of start-ups, including venture-funded DeepMap and Carmera in the U.S. and European-led Here Technologies, which is backed by Daimler, Volkswagen and other automakers. The winners in this market will be positioned to run the world’s geo-platforms, tracking and guiding much of the movement on our planet. “It’s a very hot field for research,” says John Dolan, a robotics professor at Carnegie Mellon University.

Dealing with change

A central challenge for autonomy-centric mapping is adapting to change. “The system actually has to be 4D,” says DeepMap’s Luo. “That’s 3D plus time.” To incorporate time into the map, each system must devise a method for harvesting reliable, up-to-the-minute data. Some, like Waymo, use the sensors on their own fleets of AVs. Others look to crowdsourced data or piggyback on vehicles’ onboard LIDAR and other sensors.
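To make Luo’s “3D plus time” idea concrete, a 4D map can be thought of as 3D geometry in which every element carries the time window over which it is believed valid. The Python sketch below is a minimal illustration of that idea only; the class names and fields are hypothetical, not DeepMap’s actual data model.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class MapFeature:
    """One map element (e.g. a stop sign or lane boundary): 3D geometry
    plus the time window during which the observation is believed valid."""
    feature_id: str
    kind: str                                      # e.g. "stop_sign", "lane_boundary"
    points_xyz: List[Tuple[float, float, float]]   # centimeter-level 3D geometry
    valid_from: float                              # timestamp of first confirmation
    valid_until: Optional[float] = None            # None means still believed present


def features_at(features: List[MapFeature], query_time: float) -> List[MapFeature]:
    """Return only the features believed valid at query_time: 3D plus time."""
    return [
        f for f in features
        if f.valid_from <= query_time
        and (f.valid_until is None or query_time < f.valid_until)
    ]
```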

City streetscape represented by Here’s HD mapping software. (Here Technologies)

Once the sensors are in place and sending back streams of reports, the data-gathering part of the job is straightforward. “You start with a very rich base map,” says Ro Gupta, founder and CEO of the New York start-up Carmera. “That’s not trivial,” he says, “but it’s somewhat a solved problem.”

It’s the flood of data itself that creates immense challenges. Each AV, says Luo, generates about one petabyte per hour of navigational data. Software must sift through this avalanche of data to find the fragments that are meaningful and then “decide” whether to take action. This is an enormous cognitive enterprise—and requires strong doses of AI.
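Back-of-the-envelope arithmetic on the figure Luo cites shows why most of that sifting has to happen on the vehicle itself. The 100 Mb/s uplink below is an assumed value for illustration, not a measured one.

```python
# Rough arithmetic on the scale Luo describes: one petabyte of sensor data
# per vehicle-hour. All figures are back-of-the-envelope, not measurements.
PETABYTE = 10 ** 15          # bytes, decimal convention
SECONDS_PER_HOUR = 3600

bytes_per_second = PETABYTE / SECONDS_PER_HOUR
print(f"{bytes_per_second / 1e9:.0f} GB per second, sustained")   # ~278 GB/s

# Even a fast cellular uplink (assume 100 Mb/s, i.e. 12.5 MB/s) covers only a
# sliver of that stream -- hence the need to filter and summarize on board.
fraction_uploadable = (100e6 / 8) / bytes_per_second
print(f"{fraction_uploadable:.4%} of the raw stream")              # ~0.0045%
```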

The initial challenge is simply to spot a change. As the data pours in, the software checks it against the base map, confirming that everything matches. Stop sign? Check. Left-turn lane? Check.

Then it encounters something new: A white space at a street corner where there used to be a pine tree. The system notes a change. But is it more significant than other changes, like falling leaves or fresh puddles? A human being might immediately recognize the white space as a parked truck, and not give it a second thought. The software, however, lacking human experience and intuition, must probe for clues. Is there more data to corroborate the observation? How many times have objects, like a tree, gone missing before? Is there any correlation in such cases to accidents or other troubles? Is traffic continuing unimpeded?
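One common way to handle that ambiguity is to wait for corroboration before editing the map: a discrepancy becomes an update only after enough independent vehicles have reported the same thing. The sketch below illustrates the idea; the threshold, names, and structure are assumptions for illustration, not any mapmaker’s actual pipeline.

```python
from collections import defaultdict

# Hypothetical corroboration logic: a candidate change to the base map is
# promoted to a map update only after enough independent vehicles agree.
CONFIRMATIONS_REQUIRED = 3            # distinct sightings before the map is edited

_pending = defaultdict(set)           # change_key -> set of reporting vehicle IDs


def report_discrepancy(change_key: str, vehicle_id: str) -> bool:
    """Record one vehicle's observation that the world differs from the base map.

    Returns True once the change has been corroborated by enough distinct
    vehicles, signaling that the base map should be updated.
    """
    _pending[change_key].add(vehicle_id)
    return len(_pending[change_key]) >= CONFIRMATIONS_REQUIRED


# Example: three cars independently notice the pine tree at a corner is gone.
for car in ("av-001", "av-002", "av-003"):
    confirmed = report_discrepancy("elm_and_3rd/tree_missing", car)
print(confirmed)   # True -- enough corroboration to edit the map
```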

In responding to changes, time is of the essence. One logical approach would be to reduce data flows and associated latency by programming the sensor vehicles to report only when they detect changes from the base map. If the traffic is flowing on the usual three lanes on Broad Street, why add to system “noise” by reporting it? The trouble, though, says Carmera’s Gupta, is that unperceived changes will be missed. “You lose the false negatives,” he says.
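A toy version of the trade-off Gupta describes: if vehicles upload only the differences they perceive against the base map, then anything perception misses produces no upload at all, and no raw record remains from which to catch the miss later. The sets and names below are purely illustrative.

```python
# Sketch of a "report only on change" policy. Not Carmera's pipeline.
def diff_report(observed: set, base_map: set) -> set:
    """Upload only the symmetric difference between perception and the base map."""
    return (observed - base_map) | (base_map - observed)


base_map = {"stop_sign", "crosswalk"}

# Reality: a new construction barrier has appeared, but the vehicle's
# perception fails to detect it (a false negative). The diff is empty, so
# nothing is uploaded and the stale base map is silently reaffirmed.
observed = {"stop_sign", "crosswalk"}
print(diff_report(observed, base_map))   # set() -- the change goes unreported
```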

Cloud or no cloud?

Updating this new variety of map raises all manner of issues regarding data management. How much of the geo-data, for example, should the vehicle itself interpret and what proportion should be uploaded to cloud-based AI systems?

On one hand, the cloud can harvest data from multiple sources, match it against historical patterns, and provide expanded intelligence. On the other, even with ultra-speedy 5G cellular networks expected to be widespread within three years, the back-and-forth of data transfer raises latency concerns. What’s more, since network connections are never guaranteed, autonomous vehicles must be equipped to interpret deviations from the base map for themselves and respond appropriately.
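One way to frame that split, purely as an illustration: let the vehicle interpret a deviation locally right away, and use a cloud answer only if it arrives within a strict latency budget. The budget, function names, and stubbed-out cloud call below are all assumptions, not any company’s architecture.

```python
import time

LATENCY_BUDGET_S = 0.1   # how long the planner can afford to wait for outside help


def classify_deviation(local_guess: str, ask_cloud, budget_s: float = LATENCY_BUDGET_S) -> str:
    """Prefer the cloud's richer answer, but never block past the latency budget."""
    start = time.monotonic()
    try:
        cloud_answer = ask_cloud(timeout=budget_s)   # may raise TimeoutError
    except TimeoutError:
        cloud_answer = None
    if cloud_answer is not None and time.monotonic() - start <= budget_s:
        return cloud_answer
    return local_guess                               # drive on the local interpretation


# Usage with a stubbed, slow cloud call: the vehicle falls back to its own
# on-board interpretation rather than waiting for the network.
def slow_cloud(timeout: float) -> str:
    time.sleep(timeout + 0.05)                       # arrives too late to be useful
    return "parked_truck"

print(classify_deviation("unknown_obstacle", slow_cloud))   # "unknown_obstacle"
```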

In these early days, most of the mapping companies are focusing on small samples of the earth’s roadways. Naturally, many concentrate on the areas where autonomous driving tests and services are underway. Waymo and DeepMap, for example, are busy in parts of Arizona and California. Carmera, which has agreements with companies that operate delivery fleets, is modeling New York City, San Francisco and retirement villages in Florida, where its partner, Voyage, is operating autonomous shuttle services. The exception is Here Technologies, which is harvesting anonymized data throughout much of Europe and North America from sensors on hundreds of thousands of vehicles manufactured by European automakers.

Monetization matters

One problem, particularly for the venture-backed startups, involves timing. While they’re making large investments now, the widespread use of fully automated vehicles (SAE Levels 4 and 5) may be a decade away, or perhaps longer. In the meantime, they’re searching for intermediate markets for their next-generation maps. “With this transition taking place, how can we use this data to help the driver [now]?” asks Matthew Preyss, a product marketing manager at Here Technologies.

Preyss suggests the new maps will enhance current navigation services, like Waze, Google Maps and TomTom, with more up-to-date road status and course corrections. But the maps could also feed new services, such as augmented reality and parking availability, providing detailed information on the route in both audio and video. The challenge, as always when it comes to maps and human beings, will be to provide helpful data while culling distracting detail.

However, keeping humans in the loop during this period of development also has an advantage: the maps themselves can learn from the drivers’ responses to the data—and focus the AI on significant changes along the route—the ones that demand a response. In this way, we human drivers, over the next decade, will be “educating” the navigation engines poised to replace us.