User interface design for military AR applications

Designing a user interface for military situation awareness presents challenges in managing information in a useful and usable manner. We present an integrated set of functions for presenting and interacting with information in a mobile augmented reality application for military users. Our research has concentrated on four areas: we filter information based on its relevance to the user (derived in turn from the user's location), evaluate methods for presenting information that represents entities occluded from the user's view, enable interaction through a top-down map view metaphor akin to techniques currently used in the military, and facilitate collaboration with other mobile users and/or a command center. In addition, we refined the user interface architecture to conform to requirements gathered from subject matter experts. We discuss the lessons learned in our work and directions for future research.
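The first of the four areas, location-based relevance filtering, can be illustrated with a minimal sketch. The scoring function, entity fields, and thresholds below are illustrative assumptions, not the authors' actual method: each entity combines a mission-assigned priority with its distance from the user, and entities scoring below a cutoff are suppressed from the AR display.

```python
import math
from dataclasses import dataclass

@dataclass
class Entity:
    name: str
    x: float        # metres on a local ground plane (assumed coordinate frame)
    y: float
    priority: float  # 0..1, mission-assigned importance (hypothetical field)

def relevance(entity: Entity, user_x: float, user_y: float,
              max_range: float = 500.0) -> float:
    """Score an entity by priority, discounted linearly with distance.

    Entities beyond max_range score 0 and are never shown.
    """
    d = math.hypot(entity.x - user_x, entity.y - user_y)
    if d > max_range:
        return 0.0
    return entity.priority * (1.0 - d / max_range)

def filter_entities(entities, user_x, user_y,
                    threshold=0.2, max_range=500.0):
    """Return entities worth displaying, most relevant first."""
    scored = [(relevance(e, user_x, user_y, max_range), e) for e in entities]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [e for score, e in scored if score >= threshold]
```

A real system would add mission context (friend/foe, task relevance) and hysteresis so entities do not flicker in and out of view at the threshold; this sketch shows only the distance-based core the abstract describes.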
