Data-Mining the Cabin for UX, Occupant Safety
Mitsubishi Electric leverages sensor data to develop a safer and more satisfying in-vehicle experience.
Mitsubishi Electric is using the data collected by driver-monitoring and various in-cabin sensors to help improve future vehicle safety, SAE Media recently learned at the supplier’s Northville, Michigan, R&D center. The ‘classroom’ was the cockpit of a concept 3-row SUV, designed to demonstrate new UX (user experience) technologies under development.
The concept cabin has two touchscreens: a 15.6-in (396-mm) center display and a smaller display to the left of the steering wheel. It also is fitted with two RGB-IR (red/green/blue plus infrared) cameras, two radar conversion devices, two directional microphones, four thermal sensors and a steering column-mounted infrared driver monitoring system. All sensor processing runs on Qualcomm’s latest automotive-grade SoC. Non-safety-critical features are powered by the Android 12 operating system, while safety-critical features run on QNX’s real-time operating system.
“When we fuse the information from in-vehicle thermal and infrared cameras, we know a lot about what’s going on in the cabin,” explained Grigori Maistrenko, principal platform engineer for Mitsubishi Electric’s Innovation Group, Filament Labs. His team uses sensor data as its starting point for improving existing UX technologies and developing new applications such as gaze detection.
The system can determine if the driver is holding the steering wheel but is focused on a cellphone or looking out a side window, noted Sana Jafri, user experience architect for Filament Labs, Mitsubishi Electric’s Innovation Group. It responds to distracted driving with audio, visual, or other driver alerts. After a trip, drivers can see how well they focused via a visual diagram. The recap shows the percentage of time spent looking out a side window or gazing at a touchscreen, as well as the percentage of time devoted to other non-driving tasks.
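The post-trip recap described above amounts to tallying per-frame gaze classifications into percentages. A minimal sketch, with hypothetical category names standing in for whatever labels the driver-monitoring system actually emits:

```python
from collections import Counter

def attention_recap(frame_labels):
    """Summarize per-frame gaze classifications into a post-trip recap.

    frame_labels: list of strings such as "road", "side_window",
    "touchscreen", or "other" -- illustrative categories, not
    Mitsubishi Electric's actual taxonomy. Returns the percentage
    of frames spent on each category.
    """
    counts = Counter(frame_labels)
    total = len(frame_labels)
    return {label: 100.0 * n / total for label, n in counts.items()}

# Example trip: 8 of 10 frames on the road, 1 on a side window, 1 on a screen
recap = attention_recap(["road"] * 8 + ["side_window"] + ["touchscreen"])
```

A production system would aggregate over time windows rather than raw frames, but the recap shown to the driver reduces to percentages of this kind.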
“It’s eye-opening when you can see what you really did while driving,” Maistrenko said.
Focusing on ‘focus mode’
Driver overload is another problem the team is working to mitigate. “We went through a lot of studies and saw a pattern that distraction and cognitive load are major factors in accidents,” Jafri noted. The result is the UX system’s “focus mode.” When selected, focus mode permits only navigation, HVAC, and media usage. It can learn from the driver, Jafri added – if the driver never uses media, focus mode narrows the available features to navigation and HVAC alone.
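Focus mode as described is an allow-list that adapts to observed usage. A minimal sketch, assuming the system can supply a set of features the driver has actually used (feature names here are hypothetical):

```python
# Features always available in focus mode, per the article
ALWAYS_ESSENTIAL = {"navigation", "hvac"}

def focus_mode_allowlist(usage_history):
    """Return the feature set available in focus mode.

    usage_history: set of features the driver habitually uses --
    an assumed input; the production system presumably derives it
    from logged interactions. Media stays available only if the
    driver actually uses it.
    """
    allowed = set(ALWAYS_ESSENTIAL)
    if "media" in usage_history:
        allowed.add("media")
    return allowed

def is_feature_available(feature, usage_history, focus_mode=True):
    """Gate a feature request against the focus-mode allow-list."""
    if not focus_mode:
        return True
    return feature in focus_mode_allowlist(usage_history)
```

The design choice worth noting is that the allow-list shrinks rather than grows: learning from the driver only removes optional features, never adds distractions.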
Driver monitoring also can detect driver drowsiness or a medical emergency, such as the driver’s eyelids closing and the body slumping in the seat. That sensor data triggers a response. “Depending on the level of vehicle autonomy, the car can park itself. Or with advanced cruise control, the vehicle can be slowed and ultimately stopped,” she said.
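The escalation Jafri describes — alert first, then park or slow-and-stop depending on the vehicle's capability — can be sketched as a simple decision function. Thresholds and level names below are illustrative assumptions, not Mitsubishi Electric's actual calibration; autonomy levels loosely follow the SAE J3016 scale (0–5):

```python
def incapacitation_response(eyes_closed_s, slumped, autonomy_level):
    """Pick a response to suspected driver drowsiness or incapacitation.

    eyes_closed_s: seconds of sustained eye closure (assumed signal).
    slumped: whether posture sensing reports the body slumped in the seat.
    autonomy_level: SAE J3016-style level, 0-5.
    """
    if eyes_closed_s < 2.0 and not slumped:
        return "monitor"
    if eyes_closed_s < 4.0 and not slumped:
        return "alert_driver"          # audio/visual/haptic warning
    # Sustained eye closure or slumped posture: escalate by capability
    if autonomy_level >= 3:
        return "self_park"             # the car can park itself
    if autonomy_level >= 1:
        return "slow_and_stop"         # cruise control slows, then stops
    return "alert_and_call_for_help"
```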
If relevant health information has been entered in the driver-profile settings, first responders could see on the center screen a list of medications, allergies, blood type, and other personal medical info. Jafri noted that the center screen was designed and positioned so that someone standing outside the vehicle can read the screen’s text.
“It’s important that we show how information can be used if x, y, or z is detected, because that provides the driver and occupants with a safer in-vehicle experience,” Jafri explained.
Laser sensors and machine learning
Working with Sweden-based Klimator, the Mitsubishi Electric team is developing a system that uses a front-of-vehicle laser thermal sensor to detect ice, snow, or other problematic road conditions.
“We can show on the center display screen what’s been detected on the roadway 25 meters [82 feet] ahead, allowing the driver to make an informed decision about how fast to be driving based on current road conditions,” Jafri said. Mitsubishi Electric also is leveraging machine learning to detect road hazards such as potholes and traffic cones.
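Why 25 meters of look-ahead matters for "how fast to be driving" follows from the standard braking-distance relation d = v²/(2μg). A quick calculation, with illustrative friction coefficients (ice roughly 0.2, dry asphalt roughly 0.7) and reaction time ignored:

```python
import math

def max_speed_for_warning_distance(mu, distance_m, g=9.81):
    """Rough upper bound on speed (m/s) that still lets the vehicle
    brake to a stop within the sensed distance.

    Solves d = v^2 / (2 * mu * g) for v. Friction values are
    textbook approximations, not Klimator's measured outputs.
    """
    return math.sqrt(2 * mu * g * distance_m)

v_ice = max_speed_for_warning_distance(0.2, 25.0)  # about 9.9 m/s (~36 km/h)
v_dry = max_speed_for_warning_distance(0.7, 25.0)  # about 18.5 m/s (~67 km/h)
```

The gap between the two numbers is the point of the feature: on detected ice, a safe speed for a 25-m warning horizon is roughly half what it is on dry pavement.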
Products showcased on the company’s latest concept demonstrator are targeted for production application prior to 2028. “The varied technologies are suitable for electric vehicles and ICE-powered vehicles as well as ADAS (advanced driver assist system) and infotainment applications,” she noted.