'Hop-to-select' traverse with gestural input in an eye-off interaction

We propose a way to leverage people's familiarity with multitouch gestures when interacting with displays that cannot be touched directly. By defining a subset of multitouch gestures and implementing it on a smartwatch that connects wirelessly to a separate display, the user remains visually focused on the display while continuing to interact with the system. We characterise this type of interaction and identify a 'Hop-to-Select' traverse strategy. Usability testing with ten participants shows that the applications are very easy to learn and use, and we report further insights that shape novel usage scenarios.
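
The abstract does not detail the traverse mechanics, but a minimal sketch of one plausible reading follows: swipe gestures received from the watch hop a focus highlight between the display's selectable targets, and a tap confirms the selection, so the user's eyes stay on the remote screen. All class, method, and target names here are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical sketch of a 'Hop-to-Select' traverse driven by watch gestures.
from dataclasses import dataclass
from typing import List


@dataclass
class HopToSelect:
    """Keeps a focus index over the remote display's selectable targets."""
    targets: List[str]
    focus: int = 0

    def hop(self, direction: int) -> str:
        # A swipe on the watch hops the focus one target in that
        # direction, wrapping around at either end of the list.
        self.focus = (self.focus + direction) % len(self.targets)
        return self.targets[self.focus]

    def select(self) -> str:
        # A tap on the watch selects whichever target currently has
        # focus, so no direct touch on the display is needed.
        return self.targets[self.focus]


if __name__ == "__main__":
    traverse = HopToSelect(["Photos", "Music", "Maps", "Settings"])
    traverse.hop(+1)          # swipe: focus hops to "Music"
    traverse.hop(+1)          # swipe: focus hops to "Maps"
    print(traverse.select())  # tap: "Maps" is selected
```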
