Researchers improve human-AI interaction for combat vehicles

Artificial intelligence, or AI-enabled, systems are a part of everyday life -- people use AI software to figure out the best way to navigate to new places, ask virtual agents in their phones to answer questions, and rely on robots that patrol supermarkets to keep shelves stocked. The military is no exception -- it expects robots to take on dull, dirty and dangerous jobs, while AI support tools help solve complicated problems.

At the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory, researchers developed Transparent Multi-Modal Crew Interface Designs, which are part of the laboratory’s Human-Autonomy Teaming Essential Research Program. The project involves the development of technologies that support Soldiers’ ability to team with AI-enabled robotic tanks.

“AI does not solve problems the way that humans do -- the algorithms or reasoning processes AI uses are different from those of humans, which often leads them to produce different solutions to problems,” said Dr. Brandon Perelman, Army researcher. “Because they do not typically exhibit human facial expressions or methods of communication, it may be difficult to understand how they arrived at an answer, what they’re currently trying to accomplish, or even what the AI is currently doing -- for a long time we’ve known the value in answering questions like, did it freeze or is it just thinking? This goes much further.”

The research goal is to operationalize the laboratory’s prior and current research to improve human understanding of AI actions, intentions, goals and general reasoning for the Next Generation Combat Vehicle.

“We implemented several concepts designed to make the AI more transparent and easier to understand, implemented as pieces of software that augment the warfighter machine interface used to control the Next Generation Combat Vehicle,” Perelman said. “These pieces of software were tested with active duty Soldiers in the Information for Mixed Squads Laboratory; transparency concepts that were deemed sufficiently mature were transferred to Crew Optimization and Augmentation Technologies, or COAT project, led by CCDC Ground Vehicles System Center, where they are ruggedized and tested on their motion platform.”

COAT is developing software -- such as decision support tools, interface designs and interaction protocols -- to enable coordinated platoon-level maneuver for mixed manned-unmanned formations.

“Techniques that increase Soldier understanding and prediction of AI agent actions, intentions, goals and general reasoning are a key component to achieving our overall objective,” said Chris Mikulski, COAT S&T project manager at CCDC GVSC. “Transparency concepts developed by [CCDC] ARL are being integrated into COAT’s vehicle demonstrator for field testing this year. They will also be integrated in Phase II of the Mission Enabling Technologies-Demonstrator.”

He said close collaboration with the laboratory enabled them to accelerate the maturation of core technologies, so that they can get them in the vehicle and into the hands of Soldiers sooner.

The transparency concepts are then integrated into real-world vehicles as part of the MET-D program.

“We believe this work is critical to the success of that program and its goals,” Perelman said. “We know that crew size reduction and robotics are major focuses across the military.”

Researchers said the state of the art in AI and automation is not quite there yet. Perelman believes that operating at reduced crew sizes (e.g., going from 4-to-1 or 3-to-1 down to 2-to-1) will become possible due to advances in those technology areas, but prior research has shown that introducing autonomy brings its own challenges.

“In the civilian commercial sector, such as with self-driving cars, it is not always clear to users how the AI makes decisions, why it fails, and what appropriate interaction with these systems looks like,” Perelman said. “Add in the fact that military settings are far less structured than the civilian world, the terrain is more rugged, sensors may be obscured, and an opponent may be actively trying to deny our ability to act ... all of these challenges will be compounded.”

Soldiers must have the tools they need to develop solid functional relationships with their AI teammates and to understand how to use autonomy appropriately.

“This is very important to me as an Army researcher because, if we can help Soldiers understand the behavior of these systems better so they can learn to use them more appropriately, we can reduce the number of Soldiers we send into harm's way, as well as reduce the exposure of those Soldiers during missions,” Perelman said.

The next step for this project is to deliver the remainder of the developed concepts to MET-D via COAT. The team has developed a multi-modal (auditory and tactile) cueing system to help Soldiers maintain awareness of the robotic vehicle’s state and mitigate crew workload.

“The hardware and software developed at the laboratory will be broadly applicable across our Essential Research Program projects, and will serve as a springboard for future concept development,” Perelman said.