On the Usability and Effectiveness of Different Interaction Types in Augmented Reality

One of the key challenges in augmented reality (AR) interfaces is designing effective hand-based interaction supported by computer vision. Such interaction requires free-hand tracking, for which this article presents a novel approach. The approach enables a comparison of different types of hand-based interaction in AR for navigating a spatial user interface. Quantitative and qualitative analyses of a study with 25 participants indicate that tangible interaction is the preferred interaction type both for positioning the user interface in AR and for physically pointing to a preferred navigation option.
