Gesture encoding and reproduction for human-robot interaction in text-to-gesture systems

Purpose – This paper presents a method for gesture encoding and reproduction, aimed at a text-to-gesture (TTG) system that enables robotic agents to generate appropriate gestures automatically and naturally in human-robot interaction.

Design/methodology/approach – Reproducing proper gestures, naturally synchronized with speech, is central to the TTG concept. The authors first introduce a gesture model that abstracts and describes a wide variety of human gestures. Based on this model, a gesture encoding/decoding scheme is proposed that encodes observed gestures both symbolically and parametrically and reproduces robot gestures from the resulting codes. In particular, the paper addresses a gesture scheduling method that aligns and refines gestural motions so that robotic gesticulation is reproduced in a human-like, natural fashion.

Findings – The proposed method was evaluated through a series of questionnaire surveys, which found that gestures reproduced by a robotic agent were satisfactorily appealing to human observers.

Originality/value – The paper provides a series of algorithms for handling overlapped motions and refining their timing parameters, so that robotic agents can reproduce human-like, natural gestures.
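The scheduling idea described in the abstract, aligning each gesture with the speech it accompanies and resolving overlaps between successive motions, can be sketched minimally as follows. This is an illustrative assumption, not the authors' actual TTG algorithm: the `GestureCode` fields, the rule that a gesture's stroke should land on its word onset, and the overlap resolution by delaying later gestures are all invented here for the sketch.

```python
from dataclasses import dataclass

@dataclass
class GestureCode:
    """Hypothetical encoded gesture: a symbol plus parametric timing."""
    symbol: str     # symbolic part, e.g. "beat" or "point"
    prep: float     # preparation duration (s) before the stroke
    stroke: float   # stroke duration (s)
    retract: float  # retraction duration (s)

def schedule(gestures, word_onsets):
    """Align each gesture so its stroke begins at the paired word onset.

    If the preceding gesture has not finished by then, the gesture is
    delayed until it has (a simple overlap-resolution policy assumed
    for this sketch). Returns (symbol, start, end) tuples.
    """
    timeline = []
    prev_end = 0.0
    for g, onset in zip(gestures, word_onsets):
        start = max(onset - g.prep, prev_end)  # stroke on the word, no overlap
        end = start + g.prep + g.stroke + g.retract
        timeline.append((g.symbol, start, end))
        prev_end = end
    return timeline
```

For example, two gestures paired with words spoken at 0.5 s and 1.0 s: the first starts early enough for its stroke to hit 0.5 s, while the second is pushed back so it does not overlap the first.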
