Multimodal Gaze Controlled Dashboard

This paper explores the use of eye-gaze tracking as a direct controller for electronic displays inside a car and analyses drivers' cognitive load. A set of new multimodal fusion algorithms involving eye-gaze and finger-tracking systems is proposed and validated through the ISO 26022 lane-change driving task. The algorithms are also evaluated inside a moving car. A set of user studies proposes using the velocities of saccadic intrusions as a means of detecting drivers' cognitive load.
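
As a rough illustration of the velocity-based measure mentioned above (it is not the paper's algorithm), the sketch below extracts candidate saccadic intrusions from a horizontal gaze trace by simple velocity thresholding and reports their peak velocities, which could then be compared across task-load conditions. The function name, the 60 Hz sample rate, and the 30 deg/s fixation-velocity threshold are illustrative assumptions, not values taken from the paper.

    import numpy as np

    def saccadic_intrusion_velocities(gaze_x_deg, sample_rate_hz=60.0,
                                      velocity_threshold_deg_s=30.0):
        """Return peak velocities (deg/s) of candidate saccadic intrusions
        found in a horizontal gaze-position trace recorded during fixation.
        Sample rate and threshold here are illustrative assumptions."""
        gaze_x_deg = np.asarray(gaze_x_deg, dtype=float)
        # Sample-to-sample angular speed of the gaze signal (deg/s).
        speed = np.abs(np.diff(gaze_x_deg)) * sample_rate_hz
        peaks, current = [], []
        for v in speed:
            if v > velocity_threshold_deg_s:
                current.append(v)           # still inside a fast excursion
            elif current:
                peaks.append(max(current))  # excursion ended; keep its peak
                current = []
        if current:
            peaks.append(max(current))
        return np.array(peaks)

    # Example: ~5 s of noisy 60 Hz fixation data with two injected
    # intrusion-like excursions whose peak speeds exceed the threshold.
    rng = np.random.default_rng(0)
    trace = rng.normal(0.0, 0.05, 300)
    trace[100:104] += np.linspace(0.0, 3.0, 4)
    trace[200:204] += np.linspace(0.0, 4.0, 4)
    peaks = saccadic_intrusion_velocities(trace)
    print("intrusion peak velocities (deg/s):", np.round(peaks, 1))

Velocity thresholding is a standard way of separating saccadic events from fixation samples; a more careful implementation would also filter the signal and enforce minimum event durations and amplitudes before treating an excursion as a saccadic intrusion.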
