Senso-Rama!
Toyota’s formidable Platform 3.0 is all about putting autonomous sensors in their place — farther from view.
For the 2018 iteration of the always-influential Consumer Electronics Show, Toyota Research Institute (TRI) revealed Platform 3.0, a gleaming exemplar of autonomy development’s current breakneck pace. No more roofs with clown-car whirling gizmos and backpack-sized blisters on the fenders: at its barest, Platform 3.0’s mission is to integrate bleeding-edge sensor technology into a vehicle that looks normal.
More range, less weirdness
It doesn’t hurt that Platform 3.0’s “donor” car is the already formidable Lexus LS 600hL, the full-size luxury car considered in many respects to be a benchmark in its market segment. The LS’s roomy proportions mean externally-mounted sensors don’t draw as much attention, while a surfeit of interior space also permitted at least one clever packaging solution: the streamlined roof nacelle housing cameras and lidar sensors — Platform 3.0’s signature autonomy-related styling cue — was able to remain comparatively low and unobtrusive because engineers leveraged the cavity normally devoted to the car’s sunroof.
“If you look at the evolution of vehicles that we build at TRI, we’re really excited with Platform 3.0 because we were able to leverage the expertise of the Toyota group of companies,” said Ryan Eustice, TRI vice president of automated driving, in an interview with Autonomous Vehicle Engineering.
“We were able to work with Calty Design Research that does a lot of the body designs for (production and concept) vehicles built by Toyota Motor North America and has expertise in low-volume production. With that roof design, the way we integrated the sensors into the quarter panels, it’s a much more streamlined, integrated look. It’s a step in the right direction in terms of trying to have a more aesthetically-pleasing design.”
Make no mistake, it’s still a comparatively conspicuous appendage, but even at close range, Platform 3.0’s rooftop panel looks no more flagrant than, say, a ski rack. Inspired, Calty said, by contemporary offroad motorcycle helmets, the roof panel is home to advanced new long-range lidar developed by Luminar Technologies. Its longer-wavelength (1550-nm) laser has a 200-meter (656-ft) range and more effectively identifies low-reflectivity objects such as dark vehicles and tires. Four Luminar modules in the roof deliver that 200-meter range for 360 degrees around the car — a significant lidar performance enhancement, noted Eustice.
Meanwhile, shorter-range lidar, supplied by Velodyne, is situated in the small housings low on each front quarter panel and on the front and rear bumpers. With those, “We can see objects close to the car; we can see right up to the car itself,” said Eustice. Perpendicular-looking radar emitters share each front quarter panel and also are placed low on the front and rear bumpers. Additionally, there is 360-degree camera coverage.
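The layout described above — four long-range roof modules covering 360 degrees, supplemented by short-range units low on the body — can be sketched as a simple coverage check. This is purely an illustrative model (the sensor names, fields of view, and `covers` helper are assumptions, not TRI specifications):

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    kind: str          # e.g. "lidar", "radar", "camera"
    range_m: float     # maximum detection range, meters
    fov_start: float   # field-of-view start angle, degrees (0 = straight ahead)
    fov_end: float     # field-of-view end angle, degrees

    def covers(self, bearing_deg: float) -> bool:
        """True if a bearing (0-360 deg) falls inside this sensor's FOV."""
        b = bearing_deg % 360.0
        start, end = self.fov_start % 360.0, self.fov_end % 360.0
        if start <= end:
            return start <= b <= end
        return b >= start or b <= end  # FOV wraps past 0 deg

# Hypothetical layout: four roof-mounted long-range lidar modules,
# each assumed to sweep a 90-deg sector, as suggested by the article.
suite = [Sensor("lidar", 200.0, 90.0 * i, 90.0 * (i + 1)) for i in range(4)]

def long_range_coverage(suite, step_deg=1.0):
    """Fraction of sampled bearings covered by at least one sensor."""
    bearings = [i * step_deg for i in range(int(360 / step_deg))]
    covered = sum(any(s.covers(b) for s in suite) for b in bearings)
    return covered / len(bearings)

print(long_range_coverage(suite))  # 1.0 -> full 360-deg coverage
```

The same model extends naturally to the short-range Velodyne lidar and bumper radar: add `Sensor` entries with smaller `range_m` values and query coverage at a given distance band.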
“On the perception side, it’s a very sensor-rich platform,” Eustice summarized. “One of the things we were very excited about was a big upgrade in the lidar technology we have.”
Fast-maturing technology
Eustice said the Platform 3.0 program is testimony to how quickly sensor and computing platforms are evolving. Barely a year ago, TRI launched Platform 2.0 for the Prius Challenge, a team competition in which participants optimized driving performance based on data and simulation from a connected Toyota Prius.
“We quickly changed the platform and the technology because we’re moving very rapidly,” Eustice said. “And we’re already thinking beyond Platform 3.0.”
Is the end game to incorporate all the requisite automated-driving sensors fully with a vehicle’s “standard” sheetmetal — eliminate protrusions or other appendages dictated by today’s need to house sensors? Mega-supplier Magna International, for example, last year revealed the Max4 prototype automated-driving technology platform, its chief innovation being that the system’s SAE Level 4-capable sensor suite — including difficult-to-package lidar — can be incorporated entirely within a vehicle’s production bodywork.
“I think as the sensor technology matures, you can miniaturize it. That’s some of the advance we’re seeing,” asserted Eustice. “Thinking ahead to volume production, we are very much at the forefront of being engaged with all the different technology providers when it comes to sensing technology. We’re well aware of what that roadmap looks like for how some of the technology is maturing. As we get closer to production, we have it as part of our roadmap to be working with the latest, smallest sensors that allow for blending into the body in a more aesthetically-pleasing way.”
But that’s probably the path for private cars. Vehicles intended for mobility-as-a-service (MaaS) duty may not have the same aesthetic requirements, Eustice said. In fact, it may be desirable to make it evident that a MaaS vehicle has autonomous capability. “In the space of MaaS platforms, we’re used to seeing taxicabs that have the light bar on them,” he offered as an example of visible service-related design cues. “There’s probably more design freedom or flexibility” potential for MaaS, he continued.
“Automotive designers’ roles have been pivoting toward thinking deeper and greater on how to design and apply automated driving technology for drivers and passengers,” Scott Roller, Senior Lead Designer at CALTY who worked on Platform 3.0, said in a release regarding the project. “It’s exciting to integrate the components in harmony with the car’s design.”
Less junk in the trunk
Not long ago, the computing hardware and wiring for full-autonomous capability would have demanded every square foot of even this sizeable sedan’s trunk space. Not now: Platform 3.0 packs all of it into a comparatively small container in the trunk.
They may require less space, but those contents get to the heart of Toyota’s autonomy philosophy: what’s happening in the compute stack.
“That’s what TRI is delivering on,” Eustice enthused. “Most of our focus is on algorithmic development and innovation, providing the automated-driving system and software stack. That’s being done in-house — that’s one of the primary functions of TRI and what we deliver to Toyota.
“We have very nice miniaturized packaging. We’re pretty pleased with that. But we’re really still in kind of the R&D phase of what we’re doing. We want maximum flexibility, so we tend to use compute architectures that are kind of more off-the-shelf.” He said the current setup leaves plenty of computational headroom, meaning “we can be exploring on the algorithmic improvement side. As we get closer to production, that’s when you get into more-specialized versions of the compute hardware.”
He added that the clear trend is toward large improvements in power consumption.
But most intriguing, perhaps, is where Toyota is going with it all. The company has fervently espoused the mutually-supportive nature of its “Guardian” and “Chauffeur” autonomy development paths. Guardian is a foundational philosophy of safety-focused automation in which the human driver maintains vehicle control while the automated-driving system operates in the background, monitoring for potential crash situations and intervening when necessary. The Chauffeur strategy builds on the Guardian fundamentals by incorporating more-sophisticated degrees of automated control.
“The opportunity with Guardian is to use the same technology stack and have a sensor-rich car, a computationally-rich car, and really approach advanced driver-assistance systems from the perspective of a self-driving car, where the artificial intelligence is continuously monitoring the world around you. We think about artificial intelligence as a way to ‘guard’ the human,” said Eustice.
He added that “the more devilish side” of middle autonomy levels is that they require the human to override the system in potentially stressful situations; “We’re asking the human to guard the artificial intelligence (AI),” he said, adding, “We think that’s harsh. We believe Guardian is fundamentally different — and at TRI and Toyota, we’re actually going in a different direction than a lot of our competitors in this space. With Guardian, we think of AI to guard the human — we flip that question around.”
He added that TRI sees the approach as a “series” rather than a “parallel” way of thinking about autonomy.
“Most of the systems that we see basically are geared around series autonomy. With parallel autonomy — that’s how we describe Guardian — we really think about ‘blended control’ between the human and the autonomy and how best to use that technology to always help reduce the risk of the situation,” regardless of whether the driver or the machine is in control.
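The “blended control” idea can be sketched as a simple arbitration rule: the driver’s command passes through at low risk, while the automated system’s command takes over smoothly as assessed risk rises. This is a minimal illustration of the parallel-autonomy concept, not TRI’s actual control law; the function name, the linear blend, and the risk scale are all assumptions:

```python
def blended_steering(driver_cmd: float, ai_cmd: float, risk: float) -> float:
    """Illustrative 'parallel autonomy' blend of human and machine commands.

    risk: 0.0 (no hazard predicted) .. 1.0 (collision imminent).
    At risk 0 the driver's command passes through unmodified; as risk
    rises, authority shifts continuously toward the automated system.
    """
    alpha = min(max(risk, 0.0), 1.0)  # authority granted to the AI
    return (1.0 - alpha) * driver_cmd + alpha * ai_cmd

# Low risk: the human retains control; high risk: Guardian intervenes.
print(blended_steering(0.2, -0.5, risk=0.0))  # 0.2  (driver command)
print(blended_steering(0.2, -0.5, risk=1.0))  # -0.5 (AI command)
print(blended_steering(0.2, -0.5, risk=0.5))  # a blend of the two
```

A series-autonomy system, by contrast, would behave like a hard switch: either the human or the machine holds full authority at any instant, with a discrete handover between them.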
Eustice said TRI would begin building a modest fleet of Platform 3.0 vehicles starting in spring of 2018. The vehicles will be deployed for on-the-road testing in Michigan and around TRI’s headquarters in Los Altos, Calif.