Pointing task on smart glasses: Comparison of four interaction techniques

Mobile devices such as smartphones, smartwatches, and smart glasses have revolutionized how we interact. We are interested in smart glasses because they offer a simultaneous view of the physical and digital worlds. Despite this potential, pointing on smart glasses is still not widespread. In this paper, we compare four interaction techniques for selecting targets: (a) Absolute Head Movement and (b) Relative Head Movement, where head movement controls the cursor on the smart glasses in an absolute or relative way; (c) Absolute Free Hand interaction, where the forefinger controls the cursor; and (d) Tactile Surface interaction, where the user controls the cursor via a small touchpad connected to the smart glasses. We conducted an experiment with 18 participants. Tactile Surface and Absolute Head Movement were the most efficient. Relative Head Movement and Absolute Free Hand were promising and warrant further exploration for other tasks.
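The key distinction between techniques (a) and (b) is how head orientation is mapped to the cursor. The paper's abstract does not give an implementation, so the sketch below is only a minimal illustration of that distinction: in the absolute mapping, each head pose corresponds to exactly one cursor position, while in the relative mapping, rotation deltas displace the cursor from its previous position. The display resolution, calibrated angular windows, and control-display gain are hypothetical values chosen for illustration.

```python
# Illustrative sketch (not the authors' implementation) of absolute vs.
# relative head-to-cursor mapping. All constants are assumptions.

SCREEN_W, SCREEN_H = 640, 360   # hypothetical smart-glasses display (pixels)
YAW_RANGE = (-20.0, 20.0)       # assumed calibrated head-yaw window (degrees)
PITCH_RANGE = (-12.0, 12.0)     # assumed calibrated head-pitch window (degrees)

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def absolute_cursor(yaw_deg, pitch_deg):
    """Absolute mapping: one head orientation -> one cursor position.
    The calibrated angular window is mapped linearly onto the display."""
    x = (clamp(yaw_deg, *YAW_RANGE) - YAW_RANGE[0]) / (YAW_RANGE[1] - YAW_RANGE[0])
    y = (clamp(pitch_deg, *PITCH_RANGE) - PITCH_RANGE[0]) / (PITCH_RANGE[1] - PITCH_RANGE[0])
    return x * SCREEN_W, y * SCREEN_H

def relative_cursor(x, y, d_yaw_deg, d_pitch_deg, gain=8.0):
    """Relative mapping: head-rotation *deltas* displace the cursor from
    its previous position, scaled by a control-display gain (assumed)."""
    x = clamp(x + gain * d_yaw_deg, 0, SCREEN_W)
    y = clamp(y + gain * d_pitch_deg, 0, SCREEN_H)
    return x, y

if __name__ == "__main__":
    print(absolute_cursor(10.0, 0.0))            # fixed pose -> fixed position
    print(relative_cursor(320, 180, 2.5, -1.0))  # small rotation -> displacement
```

Under such a scheme, an absolute technique lets the user re-acquire a target without clutching, whereas a relative technique decouples cursor position from head pose and allows a tunable gain, a trade-off consistent with the comparison reported in the abstract.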
