Measuring engagement elicited by eye contact in Human-Robot Interaction
[1] Kostas Karpouzis, et al. Investigating shared attention with a virtual agent using a gaze-based interface, 2010, Journal on Multimodal User Interfaces.
[2] M. Tomasello, et al. Joint attention and early language, 1986, Child Development.
[3] V. Bruce, et al. Do the eyes have it? Cues to the direction of social attention, 2000, Trends in Cognitive Sciences.
[4] Chloé Clavel, et al. UE-HRI: a new dataset for the study of user engagement in spontaneous human-robot interactions, 2017, ICMI.
[5] A. Gulsrud, et al. Randomized Controlled Caregiver Mediated Joint Engagement Intervention for Toddlers with Autism, 2010, Journal of Autism and Developmental Disorders.
[6] S. Gallagher, et al. Joint attention in joint action, 2013.
[7] Sebastiaan Mathôt, et al. PyGaze: An open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments, 2014, Behavior Research Methods.
[8] Kyveli Kompatsiari, et al. The Importance of Mutual Gaze in Human-Robot Interaction, 2017, ICSR.
[9] G. Baird, et al. Testing joint attention, imitation, and play as infancy precursors to language and theory of mind, 2000.
[10] Giulio Sandini, et al. The iCub humanoid robot: An open-systems platform for research in cognitive development, 2010, Neural Networks.