GroupGazer: A Tool to Compute the Gaze per Participant in Groups with integrated Calibration to Map the Gaze Online to a Screen or Beamer Projection

In this paper we present GroupGazer, a tool that computes the gaze direction and gaze position of every person in a group. GroupGazer estimates the gaze direction of each person visible in the camera image and, after a calibration, maps these gaze vectors onto a screen or projector (beamer) projection. In addition to the person-specific gaze direction, each gaze vector is assigned to a person based on that person's position in the image, so the attention of the whole group can be saved over time. The software is free to use and requires only an ordinary webcam, an NVIDIA GPU, and Windows or Linux. Link: https://es-cloud.cs.uni-tuebingen.de/d/8e2ab8c3fdd444e1a135/?p=%2FGroupGazer&mode=list
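To make the calibration and mapping step more concrete, the sketch below shows one way such a pipeline could be wired up: a regression from per-person gaze vectors to projection coordinates, fitted on a few known calibration targets, plus a simple nearest-center rule to keep the person affiliation of each gaze vector. This is a minimal illustration, not the GroupGazer implementation; the quadratic feature design and the names `GazeScreenCalibration` and `assign_to_person` are assumptions made for this example.

```python
# Minimal sketch (assumed, not the actual GroupGazer code): map 3D gaze
# vectors to 2D screen/projector coordinates via a calibration learned
# from a few known target points, using NumPy only.
import numpy as np


def build_features(gaze):
    # gaze: (N, 3) gaze direction vectors (gx, gy, gz);
    # expand into quadratic polynomial features for the regression.
    gx, gy, gz = gaze[:, 0], gaze[:, 1], gaze[:, 2]
    return np.stack(
        [np.ones_like(gx), gx, gy, gz, gx * gy, gx * gz, gy * gz,
         gx ** 2, gy ** 2, gz ** 2],
        axis=1,
    )


class GazeScreenCalibration:
    """Least-squares regression from gaze vectors to projection pixels."""

    def fit(self, gaze_vectors, screen_points):
        # gaze_vectors: (N, 3) gaze directions recorded while the person
        # looked at N known calibration targets; screen_points: (N, 2)
        # pixel positions of those targets on the screen/projection.
        X = build_features(np.asarray(gaze_vectors, dtype=float))
        Y = np.asarray(screen_points, dtype=float)
        self.W, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return self

    def predict(self, gaze_vectors):
        # Returns (N, 2) estimated coordinates on the projection.
        X = build_features(np.asarray(gaze_vectors, dtype=float))
        return X @ self.W


def assign_to_person(known_face_boxes, new_box):
    # Keep the person affiliation of a gaze vector by matching the detected
    # face box (x, y, w, h) to the closest previously seen face center.
    cx, cy = new_box[0] + new_box[2] / 2.0, new_box[1] + new_box[3] / 2.0
    centers = np.array(
        [[b[0] + b[2] / 2.0, b[1] + b[3] / 2.0] for b in known_face_boxes]
    )
    return int(np.argmin(np.linalg.norm(centers - np.array([cx, cy]), axis=1)))
```

In such a setup, each participant would look at a small grid of calibration targets shown on the projection; `fit()` is called once per person with the recorded gaze vectors and target positions, and `predict()` then maps live gaze vectors to projection coordinates, while `assign_to_person` keeps each estimate attached to the correct participant based on face position in the image.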
