Gaze tracking in psychological, cognitive, and user-interaction studies has recently evolved toward mobile solutions, which enable direct assessment of users' visual attention in natural environments and in augmented and virtual reality (AR/VR) applications. Productive approaches to analyzing and predicting user actions from gaze data require multidisciplinary collaboration among experts in cognitive and behavioral sciences, machine vision, and machine learning. This workshop brings together a cross-domain group to (i) discuss and contribute to the problem of using mobile gaze tracking to infer user action, (ii) advance the sharing of data, analysis algorithms, and device solutions, and (iii) deepen understanding of the behavioral aspects of gaze-action sequences in natural environments and AR/VR applications.