Situational Context Directs How People Affectively Interpret Robotic Non-Linguistic Utterances
[1] Nicole C. Krämer, et al. An Experimental Study on Emotional Reactions Towards a Robot, 2013, Int. J. Soc. Robotics.
[2] Laurel D. Riek, et al. Wizard of Oz studies in HRI, 2012, J. Hum. Robot Interact.
[3] Bram Vanderborght, et al. EMOGIB: Emotional Gibberish Speech Database for Affective Human-Robot Interaction, 2011, ACII.
[4] Tony Belpaeme, et al. Interpreting non-linguistic utterances by robots: studying the influence of physical appearance, 2010, AFFINE '10.
[5] Crystal Chao, et al. Controlling social dynamics with a parametrized model of floor regulation, 2013, HRI 2013.
[6] Jun Hu, et al. Improving speech recognition with the robot interaction language, 2012.
[7] Pierre-Yves Oudeyer, et al. The production and recognition of emotions in speech: features and algorithms, 2003, Int. J. Hum. Comput. Stud.
[8] Tony Belpaeme, et al. People Interpret Robotic Non-linguistic Utterances Categorically, 2013, 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[9] Tony Belpaeme, et al. How to use non-linguistic utterances to convey emotion in child-robot interaction, 2012, 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[10] Marek P. Michalowski, et al. Keepon: A Playful Robot for Research, Therapy, and Entertainment, 2009.
[11] Kerstin Dautenhahn, et al. Methodological Issues in HRI: A Comparison of Live and Video-Based Methods in Robot to Human Approach Direction Trials, 2006, RO-MAN 2006 - The 15th IEEE International Symposium on Robot and Human Interactive Communication.