Developing Scene Understanding Neural Software for Realistic Autonomous Outdoor Missions
Technical Report, 01 May 2016 - 01 Sep 2017
US Army Research Laboratory, Computational and Information Sciences Directorate, Adelphi, United States
We present a deep learning neural network software implementation for improving scene understanding in realistic autonomous outdoor missions in complex and changing environments. Scene understanding for realistic outdoor missions has been considered an unsolved problem because of the uncertainty in inferring the mutual context of detected objects amid changing weather, terrain, and environmental surroundings. We report proof-of-principle progress in autonomously searching for and recognizing key activities or scenarios by identifying both salient objects and relevant environmental settings depicted in outdoor scenes. Importantly, we demonstrate autonomous detection of targeted scenarios using neural network models trained separately on object and place image databases. In addition, through an instructive analysis of five representative real-world mission scenarios, we show that adding dynamic environmental data and physics-based modeling could minimize unpredictability by constraining neural predictions to physically realizable solutions.
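The fusion of separately trained object and place classifiers described above can be illustrated with a minimal sketch. The function, class labels, probabilities, and scenario rule below are illustrative assumptions for exposition only, not the report's actual models or data:

```python
# Hypothetical sketch: fusing outputs of two separately trained
# classifiers -- one over objects, one over places/settings -- to
# score a targeted mission scenario. All labels, probabilities,
# and weights are made-up illustrative values.

def scenario_score(object_probs, place_probs, scenario_rule):
    """Combine object and place class probabilities for one scenario.

    scenario_rule maps required object and place labels to weights;
    the score is the weighted sum of the matching probabilities.
    """
    score = 0.0
    for label, weight in scenario_rule["objects"].items():
        score += weight * object_probs.get(label, 0.0)
    for label, weight in scenario_rule["places"].items():
        score += weight * place_probs.get(label, 0.0)
    return score

# Example: a hypothetical "vehicle checkpoint" scenario that expects
# truck- and person-like objects in a road-like environmental setting.
object_probs = {"truck": 0.8, "person": 0.6, "dog": 0.1}
place_probs = {"highway": 0.7, "forest": 0.2}
rule = {
    "objects": {"truck": 0.5, "person": 0.2},
    "places": {"highway": 0.3},
}
print(round(scenario_score(object_probs, place_probs, rule), 2))  # 0.73
```

A physics-based constraint of the kind the abstract mentions could then act as a filter on such scores, e.g., zeroing out scenario hypotheses whose object-place combinations are physically unrealizable under current environmental conditions.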
- Computer Programming and Software