Gesture-Directed Sensor-Information Fusion Gloves

These wireless gloves provide communications capabilities for warfighters in hazardous environments.

Current chemical-protection gear for warfighters on the ground inhibits electronic communication via keyboards, cell phones, and remote-control devices. To improve communications capabilities for warfighters wearing protective gear in hazardous environments, a series of eGloves has been developed to free the warfighter from the need to type on a keyboard while wearing a Mission-Oriented Protective Posture (MOPP) suit. The eGloves can help the warfighter transmit hand and finger gestures from within the protective gear, or they can be used to transmit encoded ASCII characters.

Figure: The eGlove with motion sensors and CPU circuitry.
Gesture-Based Sensor-Information Fusion (GBSIF) refers to the fusing of sensor data collected from the environment with data from motion sensors on the eGlove. The eGlove features a CPU that fuses hand and finger motions and positions into gestures; the same CPU can be used to fuse additional data from the environment. In GBSIF, the operator transports the sensor array but, apart from the sensors mounted on the eGlove, takes no active role in determining which sensors participate in the fusion or which target subjects the data describe.
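
To make the GBSIF data flow concrete, a minimal Python sketch follows. The finger-flexion representation, the threshold-based gesture classifier, and the environmental readings are illustrative assumptions; the brief does not specify the eGlove's actual recognition algorithm or data formats.

    # Minimal GBSIF sketch. Gesture classification and sensor names are
    # illustrative assumptions, not the eGlove's actual algorithms.
    from dataclasses import dataclass, field
    from typing import List, Dict

    @dataclass
    class FingerState:
        finger: str          # e.g. "index"
        flexion: float       # 0.0 = straight, 1.0 = fully curled

    @dataclass
    class FusedRecord:
        gesture: str
        environment: Dict[str, float] = field(default_factory=dict)

    def classify_gesture(fingers: List[FingerState]) -> str:
        """Toy classifier: index extended, others curled -> 'point'."""
        index = next(f for f in fingers if f.finger == "index")
        others = [f for f in fingers if f.finger != "index"]
        if index.flexion < 0.2 and all(f.flexion > 0.8 for f in others):
            return "point"
        if all(f.flexion > 0.8 for f in fingers):
            return "fist"
        return "unknown"

    def fuse(fingers: List[FingerState], env: Dict[str, float]) -> FusedRecord:
        """Fuse the recognized gesture with environmental sensor data (GBSIF)."""
        return FusedRecord(gesture=classify_gesture(fingers), environment=env)

    if __name__ == "__main__":
        hand = [FingerState("index", 0.1), FingerState("middle", 0.9),
                FingerState("ring", 0.95), FingerState("little", 0.9)]
        print(fuse(hand, {"chem_agent_ppm": 0.02, "wind_m_s": 3.1}))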

Data are collected from the environment and from the glove sensors, and these data can be fused and integrated at a network node other than the user's own. Thus, gesture sensor data and environmental data are collected, fused, and integrated where appropriate. However, the gestures themselves are not the primary driving force in selecting information sources and controlling the fusion process.

In contrast, Gesture-Directed Sensor-Information Fusion (GDSIF) includes GBSIF but extends it through the active participation of the eGlove operator, who initiates sensor-information fusion. The concept of operation is that a warfighter wearing a GDSIF-equipped eGlove would point to a platform or another object in the battlespace using a gesture. The eGlove would be linked to reference sensors to determine the orientation and azimuth of the operator's arm, and it would use GPS to determine the operator's geographic location. Gestures would cue sensors to send their data to the eGlove, where these data would be fused with the gesture that prompted the data collection. Fusion would be accomplished in the CPU mounted on the eGlove.
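
The following Python sketch illustrates one way this concept of operation could be wired together: the operator's GPS position and arm azimuth designate a sensor platform, which is then cued for data that are fused with the triggering gesture. The platform registry, the bearing-matching step, and the 5-degree pointing tolerance are assumptions made for illustration, not details given in the brief.

    # GDSIF concept-of-operation sketch. The sensor registry, the bearing
    # match, and the 5-degree tolerance are illustrative assumptions.
    import math
    from dataclasses import dataclass
    from typing import Dict, List, Optional

    @dataclass
    class SensorPlatform:
        name: str
        lat: float
        lon: float
        def read(self) -> Dict[str, float]:
            return {"reading": 1.0}   # stand-in for a real sensor query

    def bearing_deg(lat1, lon1, lat2, lon2) -> float:
        """Initial bearing from the operator to a platform, degrees from north."""
        d_lon = math.radians(lon2 - lon1)
        lat1, lat2 = math.radians(lat1), math.radians(lat2)
        y = math.sin(d_lon) * math.cos(lat2)
        x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
        return math.degrees(math.atan2(y, x)) % 360.0

    def designate(op_lat, op_lon, arm_azimuth_deg, platforms: List[SensorPlatform],
                  tolerance_deg: float = 5.0) -> Optional[SensorPlatform]:
        """Return the platform whose bearing best matches the pointing azimuth."""
        best, best_err = None, tolerance_deg
        for p in platforms:
            diff = bearing_deg(op_lat, op_lon, p.lat, p.lon) - arm_azimuth_deg
            err = abs((diff + 180.0) % 360.0 - 180.0)   # wrap to [0, 180]
            if err <= best_err:
                best, best_err = p, err
        return best

    if __name__ == "__main__":
        platforms = [SensorPlatform("chem-sensor-12", 32.725, -117.161),
                     SensorPlatform("uav-relay-3", 32.715, -117.150)]
        # Operator points roughly east; the eastern platform is designated.
        target = designate(32.715, -117.161, arm_azimuth_deg=88.0, platforms=platforms)
        if target is not None:
            fused = {"gesture": "point", "source": target.name, **target.read()}
            print(fused)   # gesture fused with the data it prompted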

Simple gestures can be used to communicate information to improve situational awareness, to send commands to personnel and robots, and to send commands to CBRN and other sensors in the battlespace. For example, the gesture used most often for information fusion would be to point at a sensor-data source in the battlespace with the index finger extended and the other fingers touching the palm (to distinguish it from similar gestures that use the whole hand to point). This pointing gesture, when recognized, would signal the sensor and trigger a data stream or a single reading from the designated sensor to the local common-data backbone. Successful transmission from the sensor would trigger haptic feedback on the operator's glove, indicating that the data set has been sent to the network. Continuing the example, the warfighter could repeat the pointing process with a second sensor and then make a second gesture; for example, a fist with the arm held straight down would trigger a predetermined sensor-information fusion process. Using the fusion-fist gesture in this manner would distinguish it from other gestures that employ a closed fist with the arm extended, which in some command contexts means "stop." It also would avoid confusion with gestures in which the fist is held close to the chest.
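
A small gesture vocabulary of this kind could be expressed as a lookup from hand shape and arm position to a command, with haptic acknowledgement after a successful transmission. The Python sketch below assumes hypothetical command handlers and sensor names; it is not the vocabulary defined by the authors.

    # Sketch of a gesture vocabulary mapped to commands, with haptic feedback
    # after a successful transmission. Names are illustrative assumptions.
    from typing import Callable, Dict, Tuple

    def request_sensor_data(sensor_id: str) -> bool:
        """Placeholder: cue the designated sensor to send data to the backbone."""
        print(f"requesting data from {sensor_id}")
        return True   # pretend the transmission succeeded

    def run_fusion() -> bool:
        """Placeholder: trigger the predetermined sensor-information fusion process."""
        print("fusing previously collected sensor data")
        return True

    def send_stop() -> bool:
        """Placeholder: broadcast a 'stop' command (fist with arm extended)."""
        print("broadcasting STOP")
        return True

    # Gesture vocabulary: (hand shape, arm position) -> command
    VOCABULARY: Dict[Tuple[str, str], Callable[[], bool]] = {
        ("point", "arm_extended"): lambda: request_sensor_data("chem-sensor-12"),
        ("fist", "arm_down"):      run_fusion,
        ("fist", "arm_extended"):  send_stop,
    }

    def pulse_haptics() -> None:
        """Placeholder for the glove's haptic acknowledgement."""
        print("haptic pulse: data sent to network")

    def handle_gesture(hand_shape: str, arm_position: str) -> None:
        command = VOCABULARY.get((hand_shape, arm_position))
        if command is None:
            return                     # unrecognized combination: ignore
        if command():                  # acknowledge only successful transmission
            pulse_haptics()

    if __name__ == "__main__":
        handle_gesture("point", "arm_extended")   # cue a sensor
        handle_gesture("fist", "arm_down")        # trigger fusion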

This work was done by Marion G. Ceruti, Jeffrey Ellen, Gary Rogers, Sunny Fugate, Nghia Tran, Hoa Phan, Daniel Garcia, Bryan Berg, Emily Medina, and LorRaine Duffy of the Space and Naval Warfare Systems Center Pacific. For more information, download the Technical Support Package (free white paper) at www.defensetechbriefs.com/tsp under the Physical Sciences category. NRL-0040

This article first appeared in the April 2010 issue of Defense Tech Briefs Magazine (Vol. 4, No. 2).

Overview

The document presents a position paper on Gesture-Directed Sensor-Information Fusion (GDSIF) aimed at enhancing communication and situational awareness for warfighters operating in hazardous environments, particularly those involving Chemical, Biological, Radiological, and Nuclear (CBRN) threats. The paper was presented at the Chemical and Biological Defense Physical Science and Technology Conference, held November 16-20, 2009, in Dallas, Texas.

The authors, including Marion G. Ceruti, Ph.D., and several collaborators, outline the challenges faced by warfighters who must communicate and react while wearing Mission-Oriented Protective Posture (MOPP) gear, which inhibits the use of traditional electronic communication devices such as keyboards and cell phones. To address these challenges, the paper introduces the concept of electronic wireless-communication gloves, referred to as eGloves. These gloves are designed to facilitate gesture-based communication, allowing warfighters to transmit hand and finger gestures or encoded ASCII characters without removing their protective gear.

The document emphasizes the potential of eGloves to perform data fusion from environmental sensors, thereby improving the warfighter's ability to assess situations and threats in real time. The authors propose a roadmap for future research, highlighting the need to develop a vocabulary of command gestures that can be easily understood and executed in various operational contexts. This includes building on existing gesture-based communication methods used by special forces.

The paper also discusses the integration of GDSIF data into the Common Operating Picture (COP), which is crucial for maintaining situational awareness among multiple warfighters. The authors acknowledge the challenges of ensuring the integrity and timeliness of data within the COP and suggest that further research and testing are necessary to overcome these obstacles.

In conclusion, the document outlines a vision for leveraging advanced technology, such as GDSIF and eGloves, to enhance communication and situational awareness for warfighters in hazardous environments. By addressing the limitations of current communication methods and proposing innovative solutions, the authors aim to improve the effectiveness and safety of military operations in the face of CBRN threats.