Gaze-Shifting: Direct-Indirect Input with Pen and Touch Modulated by Gaze

Modalities such as pen and touch are associated with direct input but can also be used for indirect input. We propose combining the two modes into direct-indirect input modulated by gaze. We introduce gaze-shifting as a novel mechanism for switching the input mode based on the alignment of manual input with the user's visual attention. Input in the user's area of attention results in direct manipulation, whereas input offset from the user's gaze is redirected to the visual target. The technique is generic and can be used in the same manner with different input modalities. We show how gaze-shifting enables novel direct-indirect techniques with pen, touch, and combinations of pen and touch input.
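The core mode-switching idea can be illustrated with a minimal sketch. Note this is an illustrative reading of the mechanism, not the authors' implementation: the alignment radius, the function names, and the coordinate representation are all assumptions introduced here for clarity.

```python
import math

# Hypothetical threshold (not taken from the paper): manual input landing
# within this radius of the gaze point counts as aligned with the user's
# visual attention.
ALIGNMENT_RADIUS_PX = 150.0

def classify_input(touch_xy, gaze_xy, radius=ALIGNMENT_RADIUS_PX):
    """Return 'direct' when the manual input point falls within the user's
    area of attention, else 'indirect'."""
    dx = touch_xy[0] - gaze_xy[0]
    dy = touch_xy[1] - gaze_xy[1]
    return "direct" if math.hypot(dx, dy) <= radius else "indirect"

def effective_point(touch_xy, gaze_xy, radius=ALIGNMENT_RADIUS_PX):
    """Direct input acts at the manual input point itself; indirect input
    is redirected to the visual target under the user's gaze."""
    if classify_input(touch_xy, gaze_xy, radius) == "direct":
        return touch_xy
    return gaze_xy
```

For example, a pen stroke next to the gaze point would manipulate content directly under the pen, while the same stroke made far from where the user is looking would be remapped to act on the gazed-at target.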
