Audio Cues to Assist Visual Search in Robotic System Operator Control Unit Displays

Report No. ARL-TR-3632
Authors: Ellen C. Haas, Ramakrishna S. Pillalamarri, Christopher C. Stachowiak, and Michael A. Lattin
Date/Pages: December 2005; 25 pages
Abstract: Mission demands have made the robotic collaboration operator control unit (OCU) a dynamic, cognitively demanding system in which Soldiers must perform multiple tasks, such as controlling several robots and processing large amounts of information, in environments that sometimes contain high levels of noise. Research and modeling data indicate that audio display technologies would be useful in OCU applications such as guiding visual display search. The purpose of this study was to examine the effectiveness of integrating auditory display technologies into visual search tasks such as those that occur in robotic OCUs. The independent variables were audio signal mapping scheme, type of verbal positional cue, and visual target azimuth. The dependent variables were visual target search time and the National Aeronautics and Space Administration Task Load Index workload rating of the target search task. Participants were 36 students (15 males and 21 females) from Harford Community College. The results indicated that auditory signal mapping and verbal positional cues significantly reduced visual display search time and workload, and that positional cues combined with specific audio mappings were the most efficient means of reducing search time. Specific design recommendations are made regarding the use of auditory signals in environments with narrow field-of-view visual displays.
Distribution: Approved for public release
  Download Report (0.162 MB)
If you are visually impaired or need a physical copy of this report, please contact DTIC.

Last Update / Reviewed: December 1, 2005