The Augmented REality Sandtable (ARES)

Abstract: The Augmented REality Sandtable (ARES) is a research testbed that uses commercial off-the-shelf products to create a low-cost method of geospatial terrain visualization with a tangible user interface that can be used for simulation and training. The projection technology, combined with a Microsoft Kinect sensor and a laptop, is intended to enhance traditional military sand tables. This report discusses the development of the system, its place among previous related work, and the research methodology and experimentation efforts used to assess impacts on human performance. It also explains current, ongoing, and future research questions and capabilities, and summarizes collaborations and key leader engagements to date. The goal of this report is to serve as a resource for researchers and potential collaborators who want to learn more about ARES and to use its service-oriented architecture to develop content for specific domains.
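The sensing-and-projection loop described above can be illustrated with a short sketch. The listing below is not the ARES implementation; it is a minimal illustration, under assumed parameters, of how a downward-looking depth frame from a Kinect-style sensor (simulated here with synthetic data rather than a real sensor API) might be converted into a color-relief image for projection onto the sand surface. The function name depth_to_relief, the color bands, and the near/far thresholds are hypothetical.

    import numpy as np

    # Hypothetical elevation color bands (low -> high), for illustration only.
    BANDS = [
        (0.00, (  0,  70, 160)),   # water: deep blue
        (0.25, ( 60, 140,  60)),   # lowlands: green
        (0.55, (170, 140,  80)),   # hills: tan
        (0.80, (120, 120, 120)),   # mountains: gray
        (0.95, (245, 245, 245)),   # peaks: near white
    ]

    def depth_to_relief(depth_mm, near_mm=800.0, far_mm=1200.0):
        """Convert a raw depth frame (mm from sensor) to an RGB color-relief image.

        In a sand-table setup the sensor looks down at the sand, so smaller depth
        values correspond to higher terrain. Depths outside [near_mm, far_mm]
        are clamped.
        """
        # Normalize to [0, 1] elevation: nearest point (highest sand) -> 1.0.
        elev = np.clip((far_mm - depth_mm) / (far_mm - near_mm), 0.0, 1.0)

        rgb = np.zeros(elev.shape + (3,), dtype=np.uint8)
        for threshold, color in BANDS:
            rgb[elev >= threshold] = color  # later bands overwrite earlier ones
        return rgb

    if __name__ == "__main__":
        # Simulated 480x640 depth frame standing in for a Kinect capture:
        # a smooth mound of sand in the middle of the table plus sensor noise.
        h, w = 480, 640
        yy, xx = np.mgrid[0:h, 0:w]
        hill = 250.0 * np.exp(-(((yy - h / 2) / 120.0) ** 2 + ((xx - w / 2) / 160.0) ** 2))
        depth = 1200.0 - hill + np.random.normal(0.0, 2.0, size=(h, w))

        relief = depth_to_relief(depth)
        print(relief.shape, relief.dtype)  # (480, 640, 3) uint8, ready to project

In a real deployment the depth frame would come from the sensor driver and the resulting image would be warped to the projector's calibration before display; those steps are omitted here.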
