From real-time attention assessment to “with-me-ness” in human-robot interaction
Pierre Dillenbourg | Séverin Lemaignan | Fernando García | Alexis Jacq
[1] A. L. Yarbus. Eye Movements During Perception of Complex Objects, 1967.
[2] Kenneth Holmqvist et al. Eye tracking: a comprehensive guide to methods and measures, 2011.
[3] Ana Paiva et al. Detecting user engagement with a robot companion using task and social interaction-based features, 2009, ICMI-MLMI '09.
[4] Sofiane Boucenna et al. Evaluating the Engagement with Social Robots, 2015, International Journal of Social Robotics.
[5] V. L. Clark et al. Clinical Methods: The History, Physical, and Laboratory Examinations, 1990.
[6] Pierre Dillenbourg et al. When Children Teach a Robot to Write: An Autonomous Teachable Humanoid Which Uses Simulated Handwriting, 2015, 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[7] Rachid Alami et al. Situation assessment for human-robot interactive object manipulation, 2011, 2011 RO-MAN.
[8] Patrick Jermann et al. "With-Me-Ness": A Gaze-Measure for Students' Attention in MOOCs, 2014, ICLS.
[9] Ana Paiva et al. Automatic analysis of affective postures and body motion to detect engagement with a game companion, 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[10] M. Argyle. Social interactions, 1976, Science.
[11] Mohamed Chetouani et al. Towards Engagement Models that Consider Individual Factors in HRI: On the Relation of Extroversion and Negative Attitude Towards Robots to Gaze and Speech During a Human–Robot Assembly Task, 2015, Int. J. Soc. Robotics.
[12] Candace L. Sidner et al. Recognizing engagement in human-robot interaction, 2010, HRI 2010.
[13] Rainer Stiefelhagen et al. Tracking focus of attention in meetings, 2002, Proceedings. Fourth IEEE International Conference on Multimodal Interfaces.
[14] Kostas Karpouzis et al. Investigating shared attention with a virtual agent using a gaze-based interface, 2010, Journal on Multimodal User Interfaces.
[15] Luc Van Gool et al. Real time 3D head pose estimation: Recent achievements and future challenges, 2012, 2012 5th International Symposium on Communications, Control and Signal Processing.
[16] Tony Belpaeme et al. Head Pose Estimation is an Inadequate Replacement for Eye Gaze in Child-Robot Interaction, 2015, HRI.
[17] Norman I. Badler et al. A Review of Eye Gaze in Virtual Agents, Social Robotics and HCI: Behaviour Generation, User Interaction and Perception, 2015, Comput. Graph. Forum.
[18] J. R. Landis et al. The measurement of observer agreement for categorical data, 1977, Biometrics.
[19] Elaine Toms et al. The development and evaluation of a survey to measure user engagement, 2010, J. Assoc. Inf. Sci. Technol.
[20] D. Legge et al. Perception and Information, 1976.
[21] Davis E. King et al. Dlib-ml: A Machine Learning Toolkit, 2009, J. Mach. Learn. Res.
[22] T. Nishida et al. Combining Multiple Types of Eye-gaze Information to Predict User's Conversational Engagement, 2011.
[23] Tony Belpaeme et al. Tracking Gaze over Time in HRI as a Proxy for Engagement and Attribution of Social Agency, 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[24] Josephine Sullivan et al. One millisecond face alignment with an ensemble of regression trees, 2014, 2014 IEEE Conference on Computer Vision and Pattern Recognition.
[25] Patrick Jermann et al. Effects of sharing text selections on gaze cross-recurrence and interaction quality in a pair programming task, 2012, CSCW.