The Digital Cockpit of the Future

How Human Machine Interface Technology Will Impact Avionics Displays

When Doug Hurley and Bob Behnken lifted off in a SpaceX rocket in May 2020, the mission marked a series of firsts: the first crewed launch from American soil since the shuttle program ended in 2011, the first NASA crew flown by a private company, and the first spacecraft cockpit that looked completely different from anything seen before. The human-machine interfaces (HMIs) of airplane cockpits will soon follow the same path. New technologies such as 3D volumetric displays and virtual tactile sensation projections offer exciting possibilities for the cockpit of the future. Yet new tools will be needed to develop, test, and integrate those technologies in an economically sensible and safe way.

Robert L. Behnken, better known as Bob Behnken, has spent a lot of time in space. He first flew to the International Space Station (ISS) in March 2008 aboard the space shuttle Endeavour and took a second trip in 2010. Then the space shuttle program ended, and Behnken had to wait a decade before he could return to the ISS. When he launched in SpaceX's Crew Dragon capsule on May 30, 2020, everything around him had changed. Spaceship cockpits already looked futuristic in the science fiction films of the 1970s, the decade in which Behnken was born, yet the cockpits of the shuttles that flew until the end of the shuttle program were reminiscent of jumbo jets, just crammed with even more buttons, switches, knobs, and flashing lights that shone as bright as the advertising signs in Times Square.

SpaceX radically changed cockpit design with the Crew Dragon. Of the more than 1,000 buttons and switches in the old space shuttles, only around ten remain, all reserved for emergencies. The change in spacecraft cockpits anticipates what will soon be seen in those of airliners. Software and automation have enabled a step forward in cockpit design, one that is stripped down to the essentials without sacrificing functionality.

The essence of the way an airplane is flown has hardly changed since the airplane was invented. The connection between man and machine via displays, switches, controls, and lights has expanded, but it remains fundamentally the same. The Crew Dragon represents a revolution in the human-machine interface of space flight, and the digital revolution will soon take hold of aircraft HMIs too. But which technologies will enter the cockpit?

HMI Development Platform

Countless startups around the world are developing technologies that will connect people with machines in new ways: augmented reality, voice control, AI-powered virtual co-pilots, even brain-computer interfaces. Today it is difficult to predict which of these technologies will prevail. This makes it all the more crucial for aircraft manufacturers to have a design platform that makes it easy to implement future technologies and allows efficient trial-and-error testing during development. An HMI design tool lets new technologies be adopted early in their development and offers a programmable interface for connecting third-party software and hardware systems.


Incari is an HMI development platform that allows complex HMIs to be built without writing a single line of code. Its node-based, no-code approach lets non-programmers build their own logic. The solution produces native C++ code that is compiled for the target system.
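As a purely illustrative sketch of what node-based logic means, the C++ fragment below evaluates a tiny hand-built node graph. The node structure, names, and values are invented for this example; they do not represent Incari's internal data model or its generated code.

#include <functional>
#include <iostream>
#include <vector>

// A minimal node: reads values from its input nodes and computes an output.
// Purely illustrative -- not Incari's internal representation.
struct Node {
    std::vector<Node*> inputs;                              // upstream nodes feeding this one
    std::function<double(const std::vector<double>&)> op;   // the node's logic
    double evaluate() const {
        std::vector<double> values;
        for (const Node* in : inputs) values.push_back(in->evaluate());
        return op(values);
    }
};

int main() {
    // Two "source" nodes producing constant values (e.g. an altitude reading and an offset).
    Node altitude{{}, [](const std::vector<double>&) { return 10000.0; }};
    Node offset  {{}, [](const std::vector<double>&) { return -250.0; }};

    // A "sum" node wired to both: the kind of logic a designer might
    // assemble visually instead of writing code.
    Node sum{{&altitude, &offset},
             [](const std::vector<double>& v) { return v[0] + v[1]; }};

    std::cout << "corrected altitude: " << sum.evaluate() << "\n";
}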

Incari is split into two parts: the authoring system, Incari Studio, which is used to create HMIs, and Incari Player, a runtime environment that loads and plays them. Incari Player is currently available for the x86, x86_64, 32-bit ARMv7HF, and 64-bit ARMv8 architectures and can be extended to other architectures as needed. The technology follows a 3D-first approach and works with a native OpenGL render engine, which allows 2D and 3D content to be composed at the same time. The platform supports the programmable render pipeline from OpenGL ES 3.0 or OpenGL 3.3 onward.
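To give a sense of what a programmable render pipeline at the OpenGL ES 3.0 level involves, the following sketch compiles and links a minimal GLSL ES shader pair using standard OpenGL ES calls. It assumes a valid GL context has already been created (for example via EGL) and is not taken from Incari's render engine.

#include <GLES3/gl3.h>
#include <cstdio>

// Compile one shader stage and report errors. Assumes a current GL context.
static GLuint compileStage(GLenum type, const char* src) {
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, nullptr);
    glCompileShader(shader);
    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
        std::fprintf(stderr, "shader error: %s\n", log);
    }
    return shader;
}

// Minimal GLSL ES 3.00 vertex/fragment pair for the programmable pipeline.
static const char* kVertex =
    "#version 300 es\n"
    "layout(location = 0) in vec3 position;\n"
    "uniform mat4 mvp;\n"
    "void main() { gl_Position = mvp * vec4(position, 1.0); }\n";

static const char* kFragment =
    "#version 300 es\n"
    "precision mediump float;\n"
    "out vec4 color;\n"
    "void main() { color = vec4(0.1, 0.6, 0.9, 1.0); }\n";

// Build a complete shader program that later draw calls can use.
GLuint buildProgram() {
    GLuint vs = compileStage(GL_VERTEX_SHADER, kVertex);
    GLuint fs = compileStage(GL_FRAGMENT_SHADER, kFragment);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    glDeleteShader(vs);   // the linked program keeps its own copies
    glDeleteShader(fs);
    return prog;
}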

The system's compatibility with new technologies comes from the wide range of interfaces Incari offers for communicating with external systems. These include CAN bus, the standard web protocol HTTP, MQTT, and serial communication. Any device connected via a serial port, such as a GPS, proximity, or temperature sensor, can be used with the software platform. The supported feature set is thus continuously expanding, and future hardware solutions can be integrated. In practice, this means working closely with developers of gesture control, eye tracking, ambient lighting, or stereoscopic screen technology. Although these technologies may look futuristic, new HMIs will not be introduced into airplanes purely for their own sake; they will help enhance safety.
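As an illustration of the kind of serial integration described above, the following sketch reads raw NMEA sentences from a GPS receiver over a Linux serial port using the standard POSIX termios API. The device path, baud rate, and framing are assumptions chosen for the example, not details of Incari's connectivity layer.

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdio>

int main() {
    // Hypothetical device path; a real GPS receiver may enumerate differently.
    const char* device = "/dev/ttyUSB0";
    int fd = open(device, O_RDONLY | O_NOCTTY);
    if (fd < 0) { std::perror("open"); return 1; }

    termios tty{};
    tcgetattr(fd, &tty);
    cfsetispeed(&tty, B9600);           // common default rate for NMEA output
    tty.c_cflag |= (CLOCAL | CREAD);    // ignore modem lines, enable receiver
    tty.c_cflag &= ~PARENB;             // 8N1 framing
    tty.c_cflag &= ~CSTOPB;
    tty.c_cflag &= ~CSIZE;
    tty.c_cflag |= CS8;
    tty.c_lflag = 0;                    // raw input: no echo, no line editing
    tty.c_cc[VMIN]  = 1;                // block until at least one byte arrives
    tty.c_cc[VTIME] = 0;
    tcsetattr(fd, TCSANOW, &tty);

    // Stream raw NMEA sentences (e.g. $GPGGA...) to stdout.
    char buf[256];
    for (;;) {
        ssize_t n = read(fd, buf, sizeof(buf));
        if (n <= 0) break;
        std::fwrite(buf, 1, static_cast<size_t>(n), stdout);
    }
    close(fd);
    return 0;
}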

3D Volumetric Displays

Technology has always helped minimize safety risks in aviation by assisting pilots. Accident statistics of the 1990s were driven primarily by Controlled Flight Into Terrain (CFIT): pilots lost spatial situational awareness and planes crashed into terrain. The solution was a two-dimensional representation of the terrain on the navigation display. Since then, this cause of accidents has been largely eliminated, as HMI researcher Peter Marcus Lenhart at the Zurich University of Applied Sciences has shown.

New technology has the potential to further improve situational awareness in aircraft. The Australian company Voxon is building 3D volumetric display technology. In Voxon's case, the term volumetric refers to the fact that the objects the technology creates do not reside on a screen but physically exist within a volume of 3D space. Where other 3D display technologies use multiple views of a scene to create a stereoscopic effect, Voxon technology physically builds a model of the scene using millions of points of light.

How does it work? It operates much like a 3D printer. A Voxon display takes 3D data and slices it into hundreds of layers. Those layers are then projected one at a time onto a specially designed high-speed reciprocating screen. Thanks to persistence of vision, the human eye blends the images together, and the result is a true 3D image that can be viewed the way one would view a real object: from any angle, without the need for special effects, headgear, or glasses.
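The slicing step can be sketched in a few lines of C++: the code below bins the points of a 3D scene into depth layers, which a swept-volume display would then project one at a time in sync with the moving screen. The layer count and sample data are placeholders, and the sketch is a conceptual illustration rather than Voxon's implementation.

#include <cstdio>
#include <vector>

struct Point { float x, y, z; };   // z runs along the sweep axis of the display

// Bin points into depth layers; each layer is drawn once per sweep of the screen.
std::vector<std::vector<Point>> sliceIntoLayers(const std::vector<Point>& cloud,
                                                int layerCount,
                                                float zMin, float zMax) {
    std::vector<std::vector<Point>> layers(layerCount);
    const float thickness = (zMax - zMin) / layerCount;
    for (const Point& p : cloud) {
        int index = static_cast<int>((p.z - zMin) / thickness);
        if (index < 0) index = 0;
        if (index >= layerCount) index = layerCount - 1;
        layers[index].push_back(p);
    }
    return layers;
}

int main() {
    // A placeholder "scene": three points at different heights.
    std::vector<Point> scene = {{0.0f, 0.0f, 0.05f}, {0.1f, 0.2f, 0.5f}, {0.3f, 0.1f, 0.9f}};
    auto layers = sliceIntoLayers(scene, 200, 0.0f, 1.0f);   // e.g. hundreds of layers

    // In a swept-volume display, each layer would be projected in turn,
    // synchronized with the position of the reciprocating screen.
    for (size_t i = 0; i < layers.size(); ++i)
        if (!layers[i].empty())
            std::printf("layer %zu: %zu point(s)\n", i, layers[i].size());
}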

Ultrasound Technology

Not only are there new technologies to help us visualize information, there are also new ways for humans to interact with that information. The US company Ultraleap has developed a product that makes it possible to project virtual tactile sensations onto human hands. A small speaker emits ultrasound waves at a frequency too high for humans to hear. Many of these small speakers are arranged into an array and, using algorithms, triggered with very specific time differences. The time differences ensure that the ultrasound waves arrive at the same point in space at the same time; this point is called the focal point. Its position is programmable in real time and can change from instant to instant.

Ultraleap uses a hand-tracking device to track the exact position of a human hand and aim the focal point at a spot on it. The combined ultrasound waves have enough force to create a tiny dent in the skin. This pressure is used to produce a vibration that nerve endings in the hand can detect. By moving the pressure points around, Ultraleap technology creates a wide range of tactile effects in mid-air, from sculpting virtual lines and shapes to forming 3D controls.
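The time differences described above can be made concrete with a short calculation: each emitter fires slightly earlier or later so that all wavefronts reach the focal point at the same instant. The sketch below computes those trigger delays for a small emitter grid; the array geometry, grid size, and focal point are placeholder values, not Ultraleap's actual parameters.

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { double x, y, z; };

static double distance(const Vec3& a, const Vec3& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

int main() {
    constexpr double speedOfSound = 343.0;   // m/s in air at room temperature

    // Placeholder 4x4 emitter grid with 1 cm pitch, lying in the z = 0 plane.
    std::vector<Vec3> emitters;
    for (int ix = 0; ix < 4; ++ix)
        for (int iy = 0; iy < 4; ++iy)
            emitters.push_back({ix * 0.01, iy * 0.01, 0.0});

    // Desired focal point 20 cm above the center of the array (placeholder).
    Vec3 focus{0.015, 0.015, 0.20};

    // The farthest emitter fires first; closer ones are delayed so that
    // every wavefront arrives at the focal point at the same instant.
    double maxDist = 0.0;
    for (const Vec3& e : emitters) maxDist = std::max(maxDist, distance(e, focus));

    for (size_t i = 0; i < emitters.size(); ++i) {
        double delaySeconds = (maxDist - distance(emitters[i], focus)) / speedOfSound;
        std::printf("emitter %2zu: fire %.2f microseconds after the farthest one\n",
                    i, delaySeconds * 1e6);
    }
}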

Example of an HMI created using Incari Studio software.

New Design Tools

Each of these technologies could transform flying as we know it today, to a greater or lesser extent. However, the high demands on safety and the associated high development costs represent a unique challenge for aircraft manufacturers. HMI development platforms like Incari Studio can address this problem by enabling designers and engineers to work together from the idea stage, through the prototype and test phases, to a production-ready product. They can work in a shared software environment in which even users without programming skills can design complex interfaces. This approach eliminates time-consuming and cost-intensive late adjustments, which typically arise when a design turns out to be impossible to implement, something usually discovered only late in the process.

The ability to work quickly and iteratively with new software platforms leads to a better and safer product. This is critical in aviation and builds confidence in new possibilities. At the same time, costs are reduced, so R&D budgets do not spiral out of control.

Things are changing fast, and the HMI revolution in air travel is right around the corner. Confidence in new forms of human-machine interaction will soon be as great in airplanes as it now is in space travel. And if so, the cockpit of the plane that takes Bob Behnken to Cape Canaveral for his next trip to the ISS may look a lot more like the spacecraft he already flies into space aboard.

This article was written by Alexander Grasse, Chief Product Officer and Co-Founder, Incari GmbH (Berlin, Germany). For more information, visit here.