ARL develops and evaluates "hands-free, eyes-free and mind-free" methods to aid Soldiers

June 02, 2014

By Joyce M. Conant, ARL Public Affairs

Story Highlights

  • Vibratory signals are communicated through tactile actuators placed inside the haptic device
  • Tactile actuators could be placed in any number of objects, such as a glove, a belt, or the inside of a helmet or vest
  • Allows Soldiers to perform mission tasks without taking their eyes off a target

Researchers at the U.S. Army Research Laboratory's Human Research and Engineering Directorate continue to develop and evaluate methods for navigation and communication that are "hands-free, eyes-free and mind-free" to aid Soldiers in the field.

Soldiers wear a lightweight belt around the torso that contains miniature haptic technology. The belt provides vibratory, or tactile, cues that let the Soldier navigate to map coordinates and receive communications while still carrying a weapon, keeping the visual map display in a pocket and the eyes on the surroundings. This allows the Soldier to communicate covertly while maintaining attention on potential threats.

The vibratory signals are communicated through tactile actuators placed inside the haptic device. Navigation signals correspond to vibrations or pulses that tell the Soldier which direction to go. As long as the tactile sensation is felt at the front of the torso, the Soldier moves forward. If the sensation is at the side or back, the Soldier simply turns until the GPS-enabled signal is felt at the front. At the same time, communications from other Soldiers or from intelligent ground robots, such as status updates or warnings about potential threats, are also delivered by tactile means. This allows Soldiers to perform mission tasks without taking their eyes off a target, because they more intuitively "sense" the direction or activity they are supposed to perform.
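As a rough illustration of the directional logic described above, the following sketch shows how a belt controller might pick which tactor to vibrate from the Soldier's GPS heading and the bearing to the next waypoint. It is a hypothetical simplification, not the actual NavCom implementation; the eight-tactor layout, function names and angular math are all assumptions.

```python
import math

# Hypothetical 8-tactor belt: tactor 0 at the front of the torso,
# numbered clockwise every 45 degrees (2 = right hip, 4 = back, 6 = left hip).
NUM_TACTORS = 8
SECTOR = 360.0 / NUM_TACTORS

def bearing_to_waypoint(lat, lon, wp_lat, wp_lon):
    """Initial great-circle bearing (degrees from true north) to the waypoint."""
    phi1, phi2 = math.radians(lat), math.radians(wp_lat)
    dlon = math.radians(wp_lon - lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def active_tactor(heading, bearing):
    """Map the waypoint direction, relative to the Soldier's heading, to a tactor index."""
    relative = (bearing - heading) % 360.0
    return int((relative + SECTOR / 2) // SECTOR) % NUM_TACTORS

# Example: facing due north (heading 0) with the waypoint due east (bearing 90),
# tactor 2 on the right hip fires. The Soldier turns right until tactor 0 at the
# front is the one vibrating, then simply walks toward the sensation.
print(active_tactor(heading=0.0, bearing=90.0))  # 2
```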

The vibration the Soldier feels indicates the task to be performed and is based on a tactile language developed much like Morse code. The patterns are designed to be distinct from one another and consistent with the information they carry, so the Soldier can quickly and easily interpret the cues. For example, hand-signal information or specific messages such as "robot battery low" can be assigned to patterns, learned and recognized.
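To make the notion of a tactile language concrete, here is a minimal sketch that maps a few messages to on/off vibration patterns in the spirit of Morse code. The message names, pulse timings and callback hooks are invented for illustration and are not taken from the actual system.

```python
import time

# Hypothetical tactile vocabulary: each message maps to a sequence of
# (vibrate_seconds, pause_seconds) pulses. Patterns are kept short and
# deliberately dissimilar so they stay easy to tell apart by feel.
TACTILE_LANGUAGE = {
    "move_out":          [(0.2, 0.1), (0.2, 0.1)],             # two short buzzes
    "halt":              [(0.8, 0.0)],                         # one long buzz
    "robot_battery_low": [(0.2, 0.1), (0.2, 0.1), (0.2, 0.1)], # three short buzzes
    "threat_detected":   [(0.5, 0.1), (0.2, 0.1), (0.5, 0.1)], # long-short-long
}

def play_pattern(message, vibrate_on, vibrate_off):
    """Drive a tactor through the pattern for `message` using the given callbacks."""
    for on_time, off_time in TACTILE_LANGUAGE[message]:
        vibrate_on()
        time.sleep(on_time)
        vibrate_off()
        time.sleep(off_time)

# Example with stub callbacks standing in for real actuator control:
play_pattern("robot_battery_low",
             vibrate_on=lambda: print("bzz"),
             vibrate_off=lambda: None)
```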

One may think of the various vibration signals as similar to the different ring tones on a cellular phone: the ring tone tells you who is calling without your having to look at the screen for a name or number. It is the sound that alerts you, not the sight of the phone number.

The tactile actuators could be placed in any number of objects, such as a glove, a belt, or the inside of a helmet or vest. Researchers from HRED's Fort Benning, Ga., field element are currently testing such tactile systems for navigation and communication during mission-relevant exercises, to determine how effective the devices are in actual use. Soldiers quickly learn the system, attaining proficiency with the signals within 10–15 minutes.

In a recent effort, 36 Soldiers at Fort Benning, Ga., participated in an assessment of the NavCom system from Engineering Acoustics, Inc., of Casselberry, Fla. The assessment evaluated simultaneous presentation of navigation cues and robot communication/monitoring, using tactile patterns composed of two different types of advanced tactors, during operationally relevant scenarios. Soldiers were asked to complete several combat-related tasks during the exercise.

The scenarios involved night land navigation on equivalent courses of approximately 900 meters. While navigating from waypoint to waypoint, Soldiers also received communications from a hypothetical autonomous robot regarding either the robot's status or a possible threat the robot had detected. Additionally, Soldiers negotiated exclusion zones and identified enemy targets along the course.

Performance data, such as the time to reach each waypoint and the accuracy of arrival at each waypoint, were collected automatically by the Soldier system. Observer-based data collection included the accuracy of responses to robot alerts and the number of times Soldiers looked down at their screen, took a hand off their weapon or correctly identified a target on the course. Subjective data were also collected after each mission, in the form of a workload assessment and questionnaire, followed by an after-action review at the end of the night.
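For illustration only, the measures above might be organized into records like the following sketch; the field names and types are assumptions, not the actual Soldier-system schema.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class WaypointLeg:
    """Automatically logged data for one navigation leg."""
    waypoint_id: str
    time_seconds: float          # time to reach the waypoint
    arrival_error_meters: float  # distance from the true waypoint on arrival

@dataclass
class ObserverLog:
    """Observer-recorded events for one mission run."""
    correct_robot_alerts: int
    screen_glances: int          # times the Soldier looked down at the display
    hands_off_weapon: int        # times a hand came off the weapon
    targets_identified: int

@dataclass
class MissionRecord:
    soldier_id: str
    tactile_belt_on: bool        # whether the belt was "on" for this run
    legs: list[WaypointLeg] = field(default_factory=list)
    observer: ObserverLog | None = None
    workload_score: float | None = None  # from the post-mission assessment
```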

"Data are still being compiled, however, it is clear that Soldiers rarely looked at the visual display when the tactile belt was 'on,' Soldier feedback was very positive," said Gina Hartnett, from HRED's Fort Rucker field element. "This assessment gave us a great example of how a device can free up the senses so effectively. Course times were faster on tactile assisted navigation legs. Soldiers reported being more situationally aware of their surrounding because they rarely if ever had to take their eyes off of their environment. Additionally, not having to interact with a visual display, allowed their hands to stay on their weapon."

Hartnett said that some specific comments from the Soldiers included: "I was more aware of my surroundings;" "I liked being able to concentrate on other things and not the screen;" "I don't land nav much, but this made it a no-brainer;" and "I loved the belt, it worked perfectly."

"This stream of research is very dear to my heart. It's not often a Soldier can pick up a piece of equipment, be trained in 5–10 minutes and have a very positive experience," said Dr. Linda Elliott, from HRED's Fort Benning field element. "In a previous night study, Soldiers said they were blind (night, fog, rain, night vision devices fogging up, etc.) and the belt led them straight to point, allowing them to focus attention on their surroundings.

"Given the three basic Soldier tasks—move, shoot, communicate—this system supports all, moving more quickly, accurately, finding more targets in the environment and more effective covert communications. At the same time, we are trying to collect more basic data, to identify the factors that make a tactile signal 'salient'—easily felt, immediately recognized and distinguished from others. That has to do with the type of tactile signal strength (and other engineering factors), individual differences (such as fatigue) and environmental factors (stress, movement, etc.)."

ARL's history with tactile and haptic systems

Tactile systems for military performance have demonstrated their potential, in both new capabilities and performance advantages, across a number of applications. Experiments and demonstrations have been conducted across a wide range of settings, from laboratory tasks to high-fidelity simulations and real-world environments. Operators of these various tactile systems have successfully perceived and interpreted vibrotactile cues in adverse, demanding and distracting situations, such as combat vehicles, aircrew cockpits, high-speed watercraft, underwater experiments and strenuous movements. The relevance of these systems to Soldier effectiveness is high. Human factors studies of Soldier roles have shown significant overloading of the visual and auditory sensory modes in jobs such as Abrams tank commanders and drivers, ground robot controllers and unmanned aircraft operators. Dismounted Soldiers consistently experience heavy cognitive and visual workload, particularly during navigation and patrol and under conditions of high stress and time pressure.

Several ARL studies have been conducted within the context of Soldier land navigation to investigate the effects of tactile cues in context. Many of these studies have been published as ARL technical reports by investigators such as Elizabeth Redden, Rodger Pettitt, Andrea Krausman, Tim White, Kimberly Myles, Jessie Chen, Monica Glumm, Brad Davis and Mary Binseel. These studies investigated an array of issues relevant to Soldier applications, including different mission contexts (e.g., land navigation, strenuous movement, command and control decision making, and operation within vehicles) and different wearable contexts (e.g., belt, vest, back array, head).

Elliott said that subsequent experiments proved the value of tactile systems for supporting Soldier navigation and communication, but that the systems must be improved and refined before they are practical in combat situations.

"They must be made lightweight, comfortable, rugged, networked within a command and control system and they must be easy to use and easy to maintain," said Elliott. "As tactile displays are increasingly used for communication of more complex and multiple concepts, it will become evident that tactile and multisensory systems in general must be designed for rapid and easy comprehension."


Last Update / Reviewed: June 2, 2014