Generating Co-speech Gestures for the Humanoid Robot NAO through BML
[1] Michael Neff, et al. An annotation scheme for conversational gestures: how to economically capture timing and form, 2007, Language Resources and Evaluation.
[2] Stefan Kopp, et al. Towards a Common Framework for Multimodal Generation: The Behavior Markup Language, 2006, IVA.
[3] A. Kendon. Gesture: Visible Action as Utterance, 2004.
[4] Catherine Pelachaud, et al. Modelling multimodal expression of emotion in a virtual agent, 2009, Philosophical Transactions of the Royal Society B: Biological Sciences.
[5] Stefan Kopp, et al. Generating robot gesture using a virtual agent framework, 2010, 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems.
[6] D. McNeill. Hand and Mind: What Gestures Reveal about Thought, 1992.
[7] Candace L. Sidner, et al. Recognizing engagement in human-robot interaction, 2010, HRI 2010.
[8] Maurizio Mancini, et al. Implementing Expressive Gesture Synthesis for Embodied Conversational Agents, 2005, Gesture Workshop.
[9] Pierre Blazevic, et al. Mechatronic design of NAO humanoid, 2009, 2009 IEEE International Conference on Robotics and Automation.
[10] Mitsuru Ishizuka, et al. Humanoid Robot Presentation through Multimodal Presentation Markup Language MPML-HR, 2005.
[11] Pengcheng Luo, et al. Synchronized gesture and speech production for humanoid robots, 2010, 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems.
[12] Stefan Kopp, et al. Towards an integrated model of speech and gesture production for multi-modal robot behavior, 2010, 19th International Symposium on Robot and Human Interactive Communication.
[13] Bruno Maisonnier, et al. Choregraphe: a graphical tool for humanoid robot programming, 2009, RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication.
[14] Nicolas Courty, et al. Gesture in Human-Computer Interaction and Simulation, 2006.