Inferring user action with mobile gaze tracking

Gaze tracking in psychological, cognitive, and user interaction studies has recently evolved toward mobile solutions, as they enable direct assessment of users' visual attention in natural environments and in augmented and virtual reality (AR/VR) applications. Productively analyzing and predicting user actions from gaze data requires a multidisciplinary approach, combining expertise in cognitive and behavioral sciences, machine vision, and machine learning. This workshop brings together a cross-domain group of individuals to (i) discuss and contribute to the problem of inferring user action from mobile gaze tracking, (ii) advance the sharing of data, analysis algorithms, and device solutions, and (iii) increase understanding of the behavioral aspects of gaze-action sequences in natural environments and AR/VR applications.