Simulation Vs. Organizational Inertia
The chief technical officer of simulation giant VI-grade says that even with a wide range of solutions on offer, some companies remain slow to accept the proven reductions in development time and cost.
At VI-grade’s Zero Prototypes Summit in Udine, Italy, 270 of the company’s partners, customers and suppliers shared their successes using the company’s simulators in tire, ADAS and full vehicle development. Another 800 people attended online.
The company also unveiled its new Compact Full Spectrum Simulator that allows engineers to test ride motion and NVH in a single, small-footprint simulator. SAE Media spoke at length with CTO Diego Minen about what’s next for VI-grade and what slows adoption of the technology, which is documented to speed development time and save money over the mule-and-prototypes standard. (See “The Race to Zero Prototypes” in the March issue.)
SAE Media: You have so many different types of simulators, from simple desktop computer models all the way up to multi-million-dollar cable-driven machines that can simulate combined loading events. Are you trying to ensure access to your technology across multiple price points?
Minen: We understand that it is difficult for a company approaching this for the first time to make the big investment, especially for smaller companies. But we can protect the investment by providing technology that is modular, scalable and configurable.
SAE Media: Here at the summit, we’re hearing a lot about upgradability, both in software and in simulator hardware. Is that because some customers worry about falling behind after locking in a major purchase like a simulator?
Minen: Yes. Our philosophy is that instead of saying you have this small machine and you upgrade to the full machine, you can add pieces from where you start. The central point is the environment surrounding the driver, and it could be active or passive. And then you can connect different types of technology to that environment, let's say the computer or chassis or position of the driver. Then, starting with an experience that is only visual and sound, you can make the experience even larger by adding vibration and movement.
SAE Media: What are the biggest hurdles facing wider adoption of simulators to replace parts of the development cycle?
Minen: There is no shared understanding of what a simulator is. I always use the analogy of the bicycle at the end of the 19th century with the big front wheel [the penny farthing]. Then 1 million frames later it becomes the bicycle [that we know today, with equal-sized wheels]. Everybody sketching a bicycle would draw the same thing. Now, if I asked you to sketch a simulator... Second: People in their 60s, like me, grew up with Bill Gates still in his garage, right? Our managers in bigger OEMs are still in their 60s, and they know a car as a car has always been – four wheels with inflated tires and an internal combustion engine. Changing from current development standards is difficult for them.
For big OEMs, the impact of simulation is not only the technology. It is also: How do I manage my troops? How do I organize my work? I am coming from years and years of certain procedures. [CAE, then prototyping, then multiple iterations.] Then in eight years or less the technology changes. Which means somebody's losing power in big organizations. Then you find reluctance, and you have to understand which way to go. For me, this is what is slowing down the massive adoption of this technology. Because in three years, there is a new [development technology]. For us, it takes patience.
At VI-grade, we change because the new generation, of course, has grown up more in the world of augmented reality, mixing one world and the other. So there is much more instinctive acceptance of simulation. I expect in 10 years, looking back, some will say, ‘Why did we wait so long?’
SAE Media: How does the rapidly advancing world of AI affect simulation? Is it particularly useful for ADAS testing?
Minen: That is something we have introduced for latency management of various subsystems. We have artificial intelligence that learns about subsystem delays and tries to reduce the differences in data from different sensors. I’m going to say that there isn’t anything specific at the moment.
We are using sensor fusion to combine telemetry data with driver data to ensure the comfort of the simulator driver.