Nodar's GridDetect: Look, Ma, No Lidar!
Nodar GridDetect skips lidar but can still spot an action figure in the road 500 feet ahead.
The streets of Munich look different when seen through a Nodar point cloud created by a pair of stereo cameras. Nodar’s Hammerhead technology uses two standard, automotive-grade CMOS cameras paired like human eyes, but the output is much more than a high-tech View-Master.
During IAA 2023, Nodar provided test rides through the city’s crowded streets to showcase a prototype Hammerhead system displaying live images of the world in front of the vehicle, measured by distance. Building a live, 3D point cloud like this is not new. Doing it without an onboard lidar sensor, using two off-the-shelf cameras that can be mounted almost anywhere on the vehicle and algorithms that accurately measure distance, is unusual.
Traditional two-camera systems use stiff metal beams to hold the cameras exactly in place, and the lenses are positioned close together. Sensing range and accuracy are proportional to the distance between the cameras, so the farther apart you place the cameras on the vehicle, the farther the system can see. Nodar CEO Leif Jiang said the company is talking to OEMs and Tier Ones about installing the cameras – currently five-megapixel Sony IMX490s, though any automotive-grade camera could be used, including RGB, infrared or long-wave IR – in the side mirrors, the headliner or the roof, behind the windscreen or in the headlights. Similarly, while Nodar currently works primarily with Nvidia on the compute side, it could just as easily work with Qualcomm Snapdragon or a system from Ambarella or others.
By untethering the cameras and relying on software to keep them aligned, Nodar claims its system can now spot a 100 mm (3.9-inch) obstacle at 150 m (492 ft) or an overturned motorcycle at 350 m (1,148 ft) using GridDetect, a new capability for Hammerhead that the company introduced at IAA. Longer detection ranges mean more time for automated driving systems to react.
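The range-versus-baseline relationship follows from stereo triangulation: depth is Z = f·B/d, where f is the focal length in pixels, B is the camera separation (baseline), and d is the disparity between the two images. A minimal sketch with illustrative numbers (not Nodar's actual optics or mounting positions):

```python
# Stereo triangulation: Z = f * B / d, where f is focal length in
# pixels, B is the baseline (camera separation in meters), and d is
# the disparity in pixels. All numbers are illustrative assumptions,
# not Nodar's specifications.

def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters corresponding to a measured disparity."""
    return f_px * baseline_m / disparity_px

def max_range(f_px: float, baseline_m: float, min_disparity_px: float = 1.0) -> float:
    """Farthest resolvable distance if the smallest measurable
    disparity is min_disparity_px (often sub-pixel in practice)."""
    return f_px * baseline_m / min_disparity_px

f_px = 2000.0  # assumed focal length in pixels

# Widening the baseline extends range proportionally at the same disparity floor.
print(max_range(f_px, baseline_m=0.2))  # narrow mount -> 400.0 m
print(max_range(f_px, baseline_m=1.2))  # wide mount   -> 2400.0 m
```

This is why mounting the cameras in the side mirrors or headlights, rather than on a short rigid beam behind the windshield, pays off: the baseline, and with it the usable range, grows.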
Nodar – an acronym for “native optical distance and ranging” – was founded in 2018 and has so far raised $14.5 million from groups like New Enterprise Associates and Rhapsody Venture Partners. The company currently has around 20 employees, with more than its fair share of engineers.
“Even the business development folks, they all have engineering degrees,” Jiang said.
Measuring for real
A background in AI is not required to work at Nodar since the Hammerhead system eschews the AI-based approach used by other object detection systems in favor of a signal processing approach that is deterministic, testable, repeatable and, Jiang said, “super fast.” Those are some of the reasons people love lidar, he said, because lidar sends out a pulse of light and measures the physical return time to calculate the distance to every pixel.
“That’s hard to fake, right?” Jiang said. “It’s like a tape measure, almost. What we’re trying to do is repeat a physical process using triangulation. From two cameras, we’re physically measuring the angle and we’re physically determining the distance. There’s no interpretation. The important differentiation is the learning aspect. [Others] have imperfect information and they’re trying to somehow teach a machine to guess what’s there. That’s not what we’re doing. The neat thing is that we know how the system is going to behave, even if it sees an image it’s never seen before.”
Riding in the test vehicle in Munich, I could see dark spots in Nodar’s point cloud called stereo shadows. These occur where part of the scene is visible to one camera but occluded from the other. Instead of trying to guess the distance there, the system simply leaves those pixels blank.
“We don’t want to report pixels that are not seen by both cameras,” Jiang said. “That’s core to safety. We don’t want to make up data. We don’t want to do what is called inferencing in the AI world, creating depth when you really don’t have a measurement for it.”
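A standard way stereo pipelines enforce this "both cameras must see it" rule is a left-right consistency check: compute disparity from each view independently and invalidate any pixel where the two disagree. A rough sketch of the generic technique (not Nodar's implementation), using one image row for simplicity:

```python
# Left-right consistency check: a generic stereo-vision technique for
# invalidating pixels not seen by both cameras. Occluded regions come
# back invalid - the "stereo shadows" left blank rather than filled
# with guessed depth. This is a sketch, not Nodar's implementation.

def consistency_mask(disp_left, disp_right, tol=1.0):
    """disp_left[x] is the disparity at column x of the left image;
    disp_right is the same for the right image (one row, for clarity).
    Left pixel x should match right pixel x - disp_left[x]; the match
    is valid only if the right image's disparity there agrees."""
    width = len(disp_left)
    mask = [False] * width
    for x in range(width):
        xr = int(round(x - disp_left[x]))  # corresponding right-image column
        if 0 <= xr < width and abs(disp_left[x] - disp_right[xr]) <= tol:
            mask[x] = True  # consistently seen by both cameras
    return mask

# Columns 3-4 are occluded in the right view, so the two disparity
# maps disagree there; those pixels (and the left border, which the
# right camera cannot see) come back invalid.
left = [2.0, 2.0, 2.0, 6.0, 6.0, 2.0]
right = [2.0, 2.0, 2.0, 2.0, 2.0, 2.0]
print(consistency_mask(left, right))
```

The invalid pixels become gaps in the point cloud instead of fabricated depth, which is the safety property Jiang describes.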
As capable as Nodar claims Hammerhead is, it is not good enough to be used alone. Radar, which Nodar doesn’t do, should be part of a vehicle’s sensing package, Nodar COO Brad Rosen told SAE Media. How Hammerhead might fit into an ADAS or automated driving system on a commercial vehicle that includes lidar is easier to envision than having both Hammerhead and lidar on a passenger vehicle, Rosen said. Nodar’s system is also much cheaper, Rosen said, and requires significantly less compute than the approach of running neural networks all over the car that, for example, Tesla takes.
GridDetect to the rescue
Nodar is not interested in building a complete AV system. Instead, it provides point clouds and data to ADAS or automated driving systems built by other companies; those systems then decide how to react. After hearing from potential customers that Hammerhead produced more data than they could readily process – it measures 5 million points in every frame at 15 frames per second, for 75 million points per second – Nodar created GridDetect to help those systems.
GridDetect is an object detection layer that leverages the information density Hammerhead produces by picking out and putting bounding boxes around all the objects in a scene, along with their position and velocity. GridDetect is what allows Hammerhead to understand what’s in the road ahead better than it could before.
“It’s really hard to do road plane estimation and find a little blip above it, that’s actually a very difficult problem because most software assumes that the ground is flat,” Jiang said. “Out at 250 meters (820 ft), I guarantee you it’s not flat. You’ve got roll, pitch, yaw of the road. And so we have a very accurate way of modeling the ground and finding little things sticking above it.”
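One simple way to illustrate the ground-modeling problem Jiang describes is a least-squares plane fit: fit z = a·x + b·y + c to the point cloud, then flag points sitting more than some height above the fitted surface. This toy version captures only the idea; Nodar's actual ground model handles the road's roll, pitch and yaw far more carefully:

```python
# Toy ground-plane estimation: fit z = a*x + b*y + c to the point
# cloud by least squares, then flag points more than a threshold
# above the fitted surface. An illustration of the concept only,
# not Nodar's method.
import numpy as np

def fit_plane(points):
    """Least-squares plane z = a*x + b*y + c through Nx3 points."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs  # (a, b, c)

def obstacles_above(points, coeffs, height_m=0.1):
    """Indices of points more than height_m above the fitted ground."""
    a, b, c = coeffs
    predicted = a * points[:, 0] + b * points[:, 1] + c
    return np.where(points[:, 2] - predicted > height_m)[0]

# Simulated road with a 1% grade plus sensor noise, and one small
# obstacle 120 mm above the surface at x = 50 m.
rng = np.random.default_rng(0)
xy = rng.uniform([0, -3], [100, 3], size=(500, 2))
z = 0.01 * xy[:, 0] + rng.normal(0, 0.005, 500)
road = np.c_[xy, z]
obstacle = np.array([[50.0, 0.0, 0.01 * 50 + 0.12]])
cloud = np.vstack([road, obstacle])

coeffs = fit_plane(cloud)
print(obstacles_above(cloud, coeffs))  # flags only the obstacle, index 500
```

Note that a single flat plane would fail exactly where Jiang says it does: on real roads at 250 m, curvature and grade changes demand a richer surface model than this sketch.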
Hammerhead’s detection capabilities exceed those of even 1550-nm, long-range lidar systems that “sort of peter out around 50-60 meters (164-196 ft) on the road surface,” Jiang said. “You might see that first car at about 60 meters and maybe a couple of points on the one at 75 (246 ft), but beyond that, you’re blind. I don’t know if you’ve ever tried to drive by limiting yourself to 75 meters, but it can be very scary when you’re going fast.”
Nodar at night
Hammerhead can also measure more points at night, even using just low-beam headlights, than lidar systems in the same situation, the company said. The trick lies in software that integrates information from the slightly blurred images that come with Hammerhead’s longer exposure times.
“Typically, systems in our field that run stereovision at night might integrate for a millisecond before they run into problems,” Jiang said. “[Our data] is integrated over 10 milliseconds, so we have 10 times more photons that we’re collecting. That’s kind of our secret on the algorithm side to be able to see farther into the night.”
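The tradeoff Jiang alludes to: photon count scales linearly with exposure time, but so does motion blur. For an object moving laterally at speed v at range Z, seen by a camera with focal length f in pixels, the blur over an exposure t is roughly f·v·t/Z pixels. A back-of-the-envelope sketch with assumed numbers, not Nodar's measured figures:

```python
# Back-of-the-envelope exposure tradeoff: photons collected grow
# linearly with exposure time, but so does motion blur. Blur for a
# laterally moving object is roughly f * v * t / Z pixels. All the
# numbers below are assumptions for illustration.

def photon_gain(t_long_s: float, t_short_s: float) -> float:
    """Relative photon count of a longer vs. shorter exposure."""
    return t_long_s / t_short_s

def motion_blur_px(f_px: float, speed_mps: float, exposure_s: float, range_m: float) -> float:
    """Approximate image-plane blur, in pixels, over one exposure."""
    return f_px * speed_mps * exposure_s / range_m

# 10 ms vs. 1 ms exposure: ~10x the photons...
print(photon_gain(0.010, 0.001))
# ...at the cost of ~2 px of blur for a 15 m/s crossing object at 150 m
# (assuming a 2000-px focal length), which the software must handle.
print(motion_blur_px(2000.0, 15.0, 0.010, 150.0))
```

This is presumably why Jiang frames the long-exposure approach as an algorithmic secret: collecting ten times the light is easy, while extracting sharp depth from the resulting blur is not.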
Nodar’s low-light efforts are getting an assist from factors outside the company’s control, including falling compute costs and improving CMOS sensors. New adaptive driving beam headlights, approved for U.S. use in 2022, can light up the road almost twice as far away as previous headlights. Future Nodar systems could also benefit from IR illuminators. And since Nodar’s system is hardware-flexible, the company can push a software change in weeks instead of needing a long development cycle to integrate new components. Even as it stands, the “[Nodar] system is 30 times more effective at night than lidar,” Rosen said. “We’re getting two to four times the range and 30 times the amount of data as compared to the lidar at, roughly, five to 10 times lower cost.”
A complete Hammerhead system is likely to cost an OEM in the “low hundreds of dollars,” Rosen said. “That’s for everything: the lenses, the cameras, the cabling, the ECU.”
Those low costs, and indeed the whole technology behind Nodar, would not have been possible until recently.
“I like to think of it like a perfect storm,” Rosen said. “The cameras, the algorithms, the compute, and the headlights have all converged, and this is why we couldn’t do this five or 10 years ago. The opportunity has really opened up, and that’s what we’re trying to seize upon.”
Today, many companies adapting their systems from L2 to L3 and beyond add lidar to the sensor stack, but lidar programs are having trouble delivering on time or meeting requirements, Nodar said, and these companies are now looking for other options.
“There’s been a big shift [away] from L4, which everyone started to realize is very, very complicated and costly,” Rosen said. “There’s a big inflection point at L3. That’s where we think the Nodar rubber hits the road. We think there could be as many as a quarter billion L3 vehicles on the road in 10 years.”