Remote Gaze and Gesture Tracking on the Microsoft Kinect: Investigating the Role of Feedback

In this paper we present the results of a user experience and preference study of combined gaze and gesture input in a lounge-style remote interaction, using a novel system that tracks both gaze and gesture with only a Kinect sensor at a distance of 2 m from the user. Our results indicate exciting opportunities for gaze-tracking interfaces built on existing consumer technologies, but suggest that findings from studies of highly accurate gaze systems may not transfer to these real-world settings, where gaze tracking is inherently less accurate. Based on these limitations, we contribute a series of design recommendations for gaze and gesture interfaces in this context.
