Reaching and pointing gestures calculated by a generic gesture system for social robots

Since the implementation of gestures for a given robot generally relies on specific information about its morphology, these gestures are not easily transferable to other robots. To cope with this problem, we proposed a generic method to generate gestures that is constructed independently of any configuration and can therefore be used for different robots. In this paper, we discuss the novel end-effector mode of the method, which can be used to calculate gestures for which the position of the end-effector is important, for example when reaching for or pointing towards an object. The interesting and innovative feature of our method is its high degree of flexibility, both in the possible configurations for which the method can be used and in the gestures that can be calculated. The method was validated on several configurations, including those of the robots ASIMO, NAO and Justin. In this paper, the working principles of the end-effector mode are discussed and a number of results are presented.

Highlights: The proposed method can be used to generate gestures for an arbitrary social robot. This paper focuses on how reaching and pointing gestures are calculated. DH-parameters, the orientation of the base frames and the joint limits are used as input. Joint angles are calculated using inverse kinematics with a cost function for natural postures. The method was validated on several robots, including NAO, Justin and ASIMO.
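To make the input/output relationship concrete, the sketch below illustrates the general idea of optimization-based inverse kinematics with a posture cost. It is not the authors' implementation: the 3-DoF planar DH table, joint limits, neutral posture, cost weights and the use of SciPy's L-BFGS-B optimizer are all illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation) of end-effector IK with a
# naturalness cost, assuming a hypothetical 3-DoF planar arm described by
# Denavit-Hartenberg parameters.
import numpy as np
from scipy.optimize import minimize

# Illustrative DH table: one row per joint as (a, alpha, d, theta_offset).
DH = [(0.30, 0.0, 0.0, 0.0),
      (0.25, 0.0, 0.0, 0.0),
      (0.15, 0.0, 0.0, 0.0)]

JOINT_LIMITS = [(-2.0, 2.0), (-2.5, 2.5), (-2.5, 2.5)]  # rad, assumed values
NEUTRAL_POSE = np.array([0.0, 0.5, 0.5])                # assumed "comfortable" posture

def dh_transform(a, alpha, d, theta):
    """Standard DH homogeneous transform for a single link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def end_effector_position(q):
    """Forward kinematics: chain the DH transforms and return the end-effector position."""
    T = np.eye(4)
    for (a, alpha, d, theta0), qi in zip(DH, q):
        T = T @ dh_transform(a, alpha, d, theta0 + qi)
    return T[:3, 3]

def cost(q, target):
    """Reach the target while staying close to a neutral, natural-looking posture."""
    reach_error = np.sum((end_effector_position(q) - target) ** 2)
    naturalness = 0.01 * np.sum((q - NEUTRAL_POSE) ** 2)  # weight is illustrative
    return reach_error + naturalness

# Example: solve for joint angles that point the end-effector at a target,
# respecting the joint limits via bound-constrained optimization.
target = np.array([0.40, 0.30, 0.0])
result = minimize(cost, NEUTRAL_POSE, args=(target,),
                  bounds=JOINT_LIMITS, method="L-BFGS-B")
print("joint angles [rad]:", np.round(result.x, 3))
print("end-effector [m]:  ", np.round(end_effector_position(result.x), 3))
```

Because only the DH parameters, base-frame orientations and joint limits enter the formulation, the same solver structure carries over to other arm configurations by swapping the DH table and limits, which is the sense in which such an approach is configuration-independent.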
