Human visual behaviour for collaborative human-machine interaction

Non-verbal behavioural cues are fundamental to human communication and interaction. Despite significant advances in recent years, state-of-the-art human-machine systems still fall short of sensing, analysing, and fully "understanding" the cues that people naturally express in everyday settings. As evidenced by a large body of work in experimental psychology and the behavioural sciences, two of the most important non-verbal cues are visual (gaze) behaviour and body language. We envision a new class of collaborative human-machine systems that fully exploit the information available in non-verbal human behaviour in everyday settings through joint analysis of human gaze and physical behaviour.