Experimental Analysis of Single Mode Switching Techniques in Augmented Reality

This paper presents an empirical evaluation of mode switching techniques for Augmented Reality (AR) headsets. We conducted a quantitative analysis of five techniques for switching between modes: a hardware button press, a 3D virtual button press, the non-preferred hand, reach depth, and voice. Results from our study support the benefits of non-preferred hand mode switching, showing the non-preferred hand and depth techniques to be faster than the voice and virtual button techniques. The depth technique, however, produced significantly more errors than the others. Our work lays a foundation for developers to design new mode switching techniques and guides the design of current hardware solutions by helping practitioners choose techniques that best complement application use.
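To make the reach-depth technique concrete, the sketch below shows one plausible way to detect a depth-based mode switch from a tracked hand distance. This is a minimal illustration, not the study's implementation: the `DepthModeSwitch` class, the 0.45 m threshold, and the 0.05 m hysteresis band are all assumptions introduced here for clarity.

```python
# Minimal sketch of depth-based mode switching, assuming the AR runtime
# delivers a per-frame hand distance from the head along the view axis,
# in meters. Threshold and hysteresis values are illustrative only.

class DepthModeSwitch:
    def __init__(self, threshold_m: float = 0.45, hysteresis_m: float = 0.05):
        self.threshold_m = threshold_m
        self.hysteresis_m = hysteresis_m
        self.alternate_mode = False  # False = primary mode active

    def update(self, hand_distance_m: float) -> bool:
        """Return True while the alternate mode is active.

        A hysteresis band around the threshold keeps the mode from
        flickering when the hand hovers near the switching boundary.
        """
        if self.alternate_mode:
            if hand_distance_m < self.threshold_m - self.hysteresis_m:
                self.alternate_mode = False
        else:
            if hand_distance_m > self.threshold_m + self.hysteresis_m:
                self.alternate_mode = True
        return self.alternate_mode


# Example: feed simulated per-frame hand distances (meters).
switch = DepthModeSwitch()
for d in (0.30, 0.40, 0.52, 0.48, 0.38, 0.30):
    print(d, switch.update(d))
```

A hysteresis band like this is one way a designer might mitigate the higher error rate observed for the depth technique, at the cost of requiring a larger, more deliberate reach to trigger the switch.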
