Data-Mining the Cabin for UX, Occupant Safety
Mitsubishi Electric leverages sensor data to develop a safer and more satisfying in-vehicle experience.
Mitsubishi Electric is using the data collected by driver-monitoring and various in-cabin sensors to help improve future vehicle safety, SAE Media recently learned at the supplier’s Northville, Michigan, R&D center. The ‘classroom’ was the cockpit of a concept 3-row SUV, designed to demonstrate new UX (user experience) technologies under development.
The concept cabin has two touchscreens: a 15.6-in (396-mm) center display and a smaller display to the left of the steering wheel. It also is fitted with two RGB-IR (red/green/blue plus infrared) cameras, two radar conversion devices, two directional microphones, four thermal sensors and a steering column-mounted infrared driver monitoring system. All sensor processing runs on Qualcomm’s latest automotive-grade SoC. Non-safety-critical features are powered by the Android 12 operating system, while safety-critical features run on QNX’s real-time operating system.
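The article states only the division of labor between the two operating systems, not which features land in which domain. As a minimal sketch, the split could be modeled as a routing table keyed by safety criticality (the feature names below are illustrative assumptions, not Mitsubishi Electric's actual partitioning):

```python
# Hypothetical routing of cockpit features between the two OS domains:
# QNX hosts safety-critical functions, Android 12 hosts everything else.
SAFETY_CRITICAL = {"driver_monitoring", "instrument_cluster", "driver_alerts"}

def host_os(feature: str) -> str:
    """Return the OS domain a feature would run in under this split."""
    return "qnx" if feature in SAFETY_CRITICAL else "android"
```

Under this assumed partitioning, `host_os("driver_monitoring")` returns `"qnx"`, while an infotainment feature such as `host_os("media_player")` returns `"android"`.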
“When we fuse the information from in-vehicle thermal and infrared cameras, we know a lot about what’s going on in the cabin,” explained Grigori Maistrenko, principal platform engineer for Mitsubishi Electric’s Innovation Group, Filament Labs. His team uses sensor data as its starting point for improving existing UX technologies and developing new applications such as gaze detection.
The system can determine if the driver is holding the steering wheel but is focused on a cellphone or looking out a side window, noted Sana Jafri, user experience architect for Filament Labs, Mitsubishi Electric’s Innovation Group. It responds to distracted driving with audio, visual, or other driver alerts. After a trip, drivers can review how well they focused via a visual diagram. The recap shows the percentage of time spent looking out a side window, gazing at a touchscreen, or attending to other non-driving tasks.
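A post-trip recap of this kind reduces to tallying gaze samples by target. The sketch below assumes the driver-monitoring system emits one gaze label per fixed-rate sample; the label names and data layout are illustrative, not Mitsubishi Electric's:

```python
from collections import Counter

def attention_recap(gaze_samples):
    """Summarize a trip's gaze log as a percentage of time per target.

    gaze_samples: a list of labels such as "road", "side_window",
    "touchscreen", or "phone" -- one label per fixed-rate sample.
    """
    if not gaze_samples:
        return {}
    total = len(gaze_samples)
    return {target: round(100.0 * n / total, 1)
            for target, n in Counter(gaze_samples).items()}
```

For example, a log with eight `"road"` samples and two `"phone"` samples yields `{"road": 80.0, "phone": 20.0}` — the kind of breakdown the recap diagram would visualize.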
“It’s eye-opening when you can see what you really did while driving,” Maistrenko said.
Focusing on ‘focus mode’
Driver overload is another problem the team is working to mitigate. “We went through a lot of studies and saw a pattern that distraction and cognitive load are major factors in accidents,” Jafri noted. The result is the UX system’s “focus mode.” When selected, the focus mode permits only navigation, HVAC, and media usage. It can learn from the driver, Jafri added – if the driver never uses media, then navigation and HVAC are the only essential in-cabin features that can be used.
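The focus-mode behavior Jafri describes — a whitelist of essential features, pruned further by observed usage — can be sketched as a simple gate. The usage-count structure and threshold here are assumptions for illustration; the article specifies only the whitelist and the learn-from-the-driver behavior:

```python
# Features focus mode permits by default, per the article.
ESSENTIAL = {"navigation", "hvac", "media"}

def focus_mode_allowed(feature, usage_counts):
    """Return True if `feature` remains available in focus mode.

    usage_counts: hypothetical per-feature usage history. A whitelisted
    feature the driver has never used is pruned from focus mode.
    """
    if feature not in ESSENTIAL:
        return False
    return usage_counts.get(feature, 0) > 0
```

With a history like `{"navigation": 42, "hvac": 10}` (media never used), focus mode would permit navigation and HVAC but block media — matching the learned behavior described above.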
Driver monitoring also can detect driver drowsiness or a medical emergency, such as the driver’s eyelids closing and the body slumping in the seat. That sensor data triggers a response. “Depending on the level of vehicle autonomy, the car can park itself. Or with advanced cruise control, the vehicle can be slowed and ultimately stopped,” she said.
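The graduated response Jafri outlines — park the vehicle when automation allows, otherwise slow and stop, otherwise alert — maps naturally onto the vehicle's automation capability. The level thresholds below are an assumption loosely following SAE J3016 levels; the article names only the two fallback behaviors:

```python
def incapacitation_response(autonomy_level: int) -> str:
    """Hypothetical fallback action when driver monitoring detects
    incapacitation (e.g., eyelids closed, body slumped in the seat).

    autonomy_level: assumed SAE J3016-style level (0-5); the cutoffs
    are illustrative, not from the article.
    """
    if autonomy_level >= 3:
        return "pull_over_and_park"   # vehicle can park itself
    if autonomy_level >= 1:
        return "slow_and_stop"        # advanced cruise control slows, then stops
    return "escalate_alerts"          # no automation: intensify in-cabin alerts
```

The design point is that the sensor data triggers the *most capable* safe-stop behavior the vehicle supports, rather than a single fixed response.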
If relevant health information has been entered in the driver-profile settings, first responders could see on the center screen a list of medications, allergies, blood type, and other personal medical information. Jafri noted that the center screen was designed and positioned so that someone standing outside the vehicle can read its text.
“It’s important that we show how information can be used if x, y, or z is detected, because that provides the driver and occupants with a safer in-vehicle experience,” Jafri explained.
Laser sensors and machine learning
Working with Sweden-based Klimator, the Mitsubishi Electric team is developing a system that uses a front-of-vehicle laser thermal sensor to detect ice, snow, or other problematic road conditions.
“We can show on the center display screen what’s been detected on the roadway 25 meters [82 feet] ahead, allowing the driver to make an informed decision about how fast to be driving based on current road conditions,” Jafri said. Mitsubishi Electric also is leveraging machine learning to detect road hazards such as potholes and traffic cones.
Products showcased on the company’s latest concept demonstrator are targeted for production application prior to 2028. “The varied technologies are suitable for electric vehicles and ICE-powered vehicles as well as ADAS (advanced driver assist system) and infotainment applications,” she noted.