Developing Scene Understanding Neural Software for Realistic Autonomous Outdoor Missions

Report No. ARL-TR-8173
Authors: Arnold D Tunick; Ronald E Meyers
Date/Pages: September 2017; 58 pages
Abstract: We present a deep learning neural network software implementation for improving scene understanding in realistic autonomous outdoor missions in complex and changing environments. Scene understanding for realistic outdoor missions has been considered an unsolved problem because of the uncertainty in inferring the mutual context of detected objects amid changing weather, terrain, and environmental surroundings. We report proof-of-principle progress in autonomously searching for and recognizing key activities or scenarios by identifying both salient objects and relevant environmental settings depicted in outdoor scenes. Importantly, we demonstrate autonomous detection of targeted scenarios using neural network models trained separately on object and place image databases. In addition, using instructive analysis of 5 representative real-world mission scenarios, we show that adding dynamic environmental data and physics-based modeling could minimize unpredictability by constraining neural predictions to physically realizable solutions.
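The fusion step described in the abstract, combining outputs of separately trained object and place classifiers to detect a targeted scenario, can be sketched as follows. This is an illustrative sketch only, not the report's implementation; the scenario definitions, class names, and threshold below are hypothetical.

```python
# Illustrative sketch: score a targeted scenario by fusing the outputs of
# two independently trained classifiers (one over objects, one over places),
# in the spirit of the approach the abstract describes. All class names,
# scenario specs, and the detection threshold are hypothetical.

def scenario_score(object_probs, place_probs, scenario):
    """Fused score: best matching object probability times best matching
    place probability for the given scenario specification."""
    obj = max(object_probs.get(o, 0.0) for o in scenario["objects"])
    plc = max(place_probs.get(p, 0.0) for p in scenario["places"])
    return obj * plc

def detect(object_probs, place_probs, scenarios, threshold=0.25):
    """Return the scenarios whose fused score meets the threshold."""
    return {name: s
            for name, spec in scenarios.items()
            if (s := scenario_score(object_probs, place_probs, spec)) >= threshold}

# Hypothetical softmax outputs from the two separately trained networks
object_probs = {"truck": 0.8, "person": 0.6, "dog": 0.1}
place_probs = {"forest_road": 0.7, "parking_lot": 0.2}

# Hypothetical scenario definitions pairing salient objects with settings
scenarios = {
    "vehicle_on_trail": {"objects": ["truck", "car"],
                         "places": ["forest_road", "dirt_track"]},
    "crowd_in_lot": {"objects": ["person"],
                     "places": ["parking_lot"]},
}

print(detect(object_probs, place_probs, scenarios))
```

Requiring both a salient object and a compatible place to score highly is one simple way to capture the mutual-context constraint the abstract emphasizes; a place-only or object-only match is suppressed by the product.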
Distribution: Approved for public release

Last Update / Reviewed: September 1, 2017