Embedded Mobile Augmented Reality Trainer Within a Distributed HLA Simulation
NAVAL RESEARCH LAB WASHINGTON DC
The Battlefield Augmented Reality System (BARS) is a mobile augmented reality system that displays battlefield intelligence information to a dismounted warrior. The primary BARS components are a wearable computer, a wireless network, and a tracked, see-through head-mounted display (HMD). The computer generates graphics on the HMD that appear, from the user's perspective, to exist in the surrounding environment. A building could thus be augmented to show, for example, its name or a plan of its interior. Simulated entities with which the BARS user might interact can also be displayed. Joint Semi-Automated Forces (JSAF) is a software system capable of simulating entity-level platforms and behaviors in a heterogeneous, distributed computing environment; it implements communication between distributed components of the battle space using the High Level Architecture (HLA). A key requirement of both systems is that interactions spawned by live, virtual, and constructive forces be seamlessly integrated. In our experiment, we bridge the gap between a user wearing the BARS hardware and constructive entities. We use the Run-Time Infrastructure (RTI) to distribute and translate entity pose and appearance data between the BARS data distribution system and JSAF. This interface allows a BARS user to interact with the SAF entities in real time, including armed engagement. We discuss the design and implementation of these two interfaces, including special considerations unique to each, and their applications with respect to embedded training.
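The core of the bridge described above is translating entity pose and appearance data from the SAF side into updates the AR display can render in the user's local frame. The report does not publish its data schemas, so the following sketch is purely illustrative: the record fields, the `to_bars_update` function, and the local-origin convention are all assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class EntityState:
    """Pose/appearance record as a SAF federate might publish via the RTI.
    Field names are hypothetical; the actual HLA object model is not
    given in the abstract."""
    entity_id: int
    position: tuple       # simulation world coordinates (x, y, z), metres
    heading_deg: float    # orientation about the vertical axis, degrees
    appearance: str       # e.g. "healthy", "damaged", "destroyed"

def to_bars_update(state: EntityState, origin: tuple) -> dict:
    """Translate a SAF entity state into a local-frame update for the
    mobile AR display, re-expressing position relative to the user's
    tracked local origin (an assumed convention, not the report's)."""
    local = tuple(round(p - o, 3) for p, o in zip(state.position, origin))
    return {
        "id": state.entity_id,
        "local_position": local,
        "heading_deg": state.heading_deg % 360.0,  # normalize to [0, 360)
        "appearance": state.appearance,
    }

# Example: a constructive entity 10 m east and 5 m north of the user.
update = to_bars_update(
    EntityState(7, (1000.0, 2000.0, 30.0), 370.0, "healthy"),
    origin=(990.0, 1995.0, 30.0),
)
```

In a real federation, a function like this would sit in the RTI attribute-update callback, so each `updateAttributeValues` reflection from JSAF becomes a render update on the BARS side.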