Backchannel Head Nods in Danish First Meeting Encounters with a Humanoid Robot: The Role of Physical Embodiment