New R&D Boss Details RoboSense’s LiDAR Strategy

In a crowded field, RoboSense's mission is to bring mass-produced LiDAR products to the automated driving space.

LiDAR technology is one of the busiest areas of engineering for self-driving cars, and some people have been working on the technology for over a decade. Dr. Leilei Shinohara, for example, started his Ph.D. work on LiDAR in 2008, receiving his doctorate from the Karlsruhe Institute of Technology. He is now the VP of R&D for RoboSense after being hired away in January 2019 from Valeo's Scala LiDAR project – claimed to be the world's first laser scanner for automotive volume production.

RoboSense's mission is to bring mass-produced LiDAR products to the automated driving space, and in a crowded field, Dr. Shinohara believes RoboSense's comparatively simple approach is the right way to make self-driving cars work. "There are almost 80 companies doing LiDAR, and most of them are very young companies," he said in a recent interview with SAE's Autonomous Vehicle Engineering.

"RoboSense is also quite young, as the company was founded in 2014. However, compared to most of these young companies, RoboSense does not only focus on new, fancy ideas, but mainly on manufacturing stable technology. This is the reason that you only see RoboSense's mechanical spinning and MEMS-mirror LiDAR."

MEMS stands for microelectromechanical systems; in RoboSense's LiDAR, tiny MEMS mirrors move and deflect the sensor's laser beam to scan the environment. RoboSense's promotional materials call these sensors "solid state," but Dr. Shinohara admits that they really aren't.

"The MEMS mirror is called solid-state to distinguish it from a mechanical type micro-mirror device," he said. "The MEMS mirror's fabrication process is non-mechanical using similar techniques like PCB or chip fabrication. Therefore, MEMS LiDAR is categorized as MEMS solid-state LiDAR. But recently, MEMS mirror LiDAR, together with mechanical micro mirror LiDAR have been both categorized as Micro-Mirror LiDAR."

While RoboSense is not focused on those "new, fancy ideas" that can take up energy at other companies, the new MEMS-mirror sensors do require new production strategies to get them into mass-market vehicles. "For the new generation of mechanical spinning LiDAR, RoboSense has introduced quite a lot of new patented ideas to improve manufacturing quality and efficiency," Dr. Shinohara said. "For MEMS mirror LiDAR, RoboSense has a unique optical module design to improve manufacturing and has an in-house MEMS mirror design and fabrication ability for an improved MEMS design."

That isn't to say that RoboSense isn't making improvements to the LiDAR products it hopes to sell. At CES 2019, for example, it announced that the field of view of its RS-LiDAR-M1 sensor had been increased to 120 degrees. Dr. Shinohara said this means the sensor itself is more valuable to an automated vehicle's AI. "A wider FOV means that with a single sensor, you can get more information," he said. "For example, in a scenario with a 120-degree FOV, the scene can be detected much earlier than with a 60-degree FOV. Furthermore, on a highway or side road, with a large FOV, vehicles coming from the side can be detected much earlier."
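The geometry behind that claim is straightforward to sketch. The snippet below is purely illustrative (the crossing speed and distances are assumed numbers, not RoboSense figures): an object a fixed distance ahead enters a sensor's horizontal field of view once its lateral offset drops below distance × tan(FOV/2), so a wider FOV picks up a crossing vehicle sooner.

```python
import math

def detection_lateral_offset(forward_dist_m, fov_deg):
    """Max lateral offset at which a point 'forward_dist_m' ahead
    still falls inside a horizontal field of view of 'fov_deg'."""
    return forward_dist_m * math.tan(math.radians(fov_deg / 2))

# Assumed scenario: a vehicle crossing 10 m ahead of the sensor
# at 10 m/s lateral speed (illustrative numbers only).
y120 = detection_lateral_offset(10, 120)   # ~17.3 m
y60 = detection_lateral_offset(10, 60)     # ~5.8 m
crossing_speed_mps = 10.0
earlier_s = (y120 - y60) / crossing_speed_mps
print(f"120-deg FOV detects the crosser {earlier_s:.2f} s earlier")  # ~1.15 s
```

Under these assumptions, doubling the FOV from 60 to 120 degrees buys roughly an extra second of warning on the crossing vehicle – the kind of margin Dr. Shinohara is alluding to.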

RoboSense also received a CES Innovation Award at this year's show for its updated 905nm RS-LiDAR-M1, which it publicly demonstrated there for the first time. Dr. Shinohara said the 905nm sensor is a better near-term choice than the 1550nm sensors many competitors are pursuing. "There are quite a lot of players focusing on long distance 1550nm LiDAR," he said. "RoboSense also has an advanced development team focused on 1550. However, I do not think the supply chain is ready for 1550nm yet because of 1550nm's power consumption, size and heat problems.

“To achieve long range for 905nm, RoboSense uses specially designed MEMS mirrors and an advanced optical path design to reach 150m with 10 percent reflectivity – which means that a pedestrian can be detected in a range of 150m. In emergency braking situations, such as a 400ms system reaction time, the AV can drive at speeds over 80 miles per hour. A longer detection range means that the AV will have a longer time to react," he explained.
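A quick back-of-the-envelope check supports those numbers. The sketch below uses the quoted 400ms reaction time and 80 mph speed, but the deceleration value is an assumption (roughly a hard stop on dry pavement), not a RoboSense specification:

```python
def stopping_distance_m(speed_mps, reaction_s, decel_mps2):
    """Reaction distance plus braking distance under constant deceleration."""
    return speed_mps * reaction_s + speed_mps**2 / (2 * decel_mps2)

MPH_TO_MPS = 0.44704
speed = 80 * MPH_TO_MPS  # ~35.8 m/s, the quoted highway speed

# 8 m/s^2 is an assumed dry-road emergency deceleration, not a spec.
d = stopping_distance_m(speed, reaction_s=0.4, decel_mps2=8.0)
print(f"stopping distance: {d:.0f} m")  # ~94 m
```

At roughly 94m, the total stopping distance sits comfortably inside the sensor's 150m pedestrian-detection range, which is consistent with the 80-mph claim.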

RoboSense says that its 905nm LiDAR sensors were designed for the mass production of autonomous vehicles and can fully support SAE Level 5 driverless automated driving. For now, though, as at other companies, the first step for RoboSense's MEMS LiDAR technology is to be tested in L2 and L3 vehicles, Dr. Shinohara said.

"RoboSense's M1 LiDAR is targeted to support standard mass production vehicles starting in 2021," he noted. "L2 ADAS vehicles are already available on the market. L3 AD vehicles are technically ready, for example, in the new model Audi A8, however, they still need general public acceptance for the system to be approved for safety, security, and comfort (or user-friendliness). L4 will begin to be deployed by Robo Taxi or Robo Truck operators first,” he added. “Therefore, acceptance of L4 AD passenger vehicles will need the public to first accept L3 vehicles, Robo Taxis, or Robo Trucks before being widespread."

In March, RoboSense announced that it had tested its 16-beam mechanical LiDAR environment perception system in an autonomous shuttle bus called GACHA in harsh winter conditions. The tests were done in concert with the Finnish autonomous driving company Sensible 4 and the Japanese company MUJI. A small fleet of GACHA buses will be deployed on public roads in Finland in April.