Evaluating Remote and Head-worn Eye Trackers in Multi-modal Speech-based HRI (Demo)