Generating human-like motion for robots

Action prediction and fluidity are key elements of human–robot teamwork. If a robot’s actions are hard to understand, fluid human–robot interaction is impeded. Our goal is to improve the clarity of robot motion by making it more human-like. We present an algorithm that autonomously synthesizes human-like variants of an input motion, structured as a three-stage pipeline. First, we optimize the motion with respect to spatiotemporal correspondence (STC), which emulates the coordinated effects of human joints that are connected by muscles; three experiments validate that STC optimization increases human-likeness and recognition accuracy for human social partners. Second, we avoid repetitive motion by adding variance: by exploiting the redundant and underutilized spaces of the input motion, we create multiple motions from a single input. Two experiments validate that adding variance preserves the human-likeness achieved in the previous step and that a social partner can still accurately recognize the motion’s intent. Finally, we preserve the robot’s ability to interact with its world by enabling it to satisfy constraints, and we provide an experimental analysis of the effects of constraints on the synthesized human-like motion variants.
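To make the variance and constraint stages concrete, here is a minimal sketch, not the paper’s algorithm, of one standard way to exploit a motion’s redundant space while satisfying a task constraint: random joint-space perturbations are projected through the Jacobian null-space projector N = I - J⁺J, so that to first order the end-effector position is unchanged while the joint-space motion varies. The planar three-link arm, the function names, and the perturbation scale are all illustrative assumptions.

```python
import numpy as np

def jacobian(q):
    """Position Jacobian of a hypothetical planar 3-link arm (unit links)."""
    s = np.cumsum(q)                      # absolute link angles
    J = np.zeros((2, len(q)))
    for j in range(len(q)):
        J[0, j] = -np.sum(np.sin(s[j:]))  # d x / d q_j
        J[1, j] =  np.sum(np.cos(s[j:]))  # d y / d q_j
    return J

def add_nullspace_variance(traj, rng, scale=0.05):
    """Perturb each configuration inside the Jacobian null space.

    Each perturbation is projected by N = I - pinv(J) @ J, so to first
    order the end-effector position (standing in for a task constraint)
    is preserved while the joint trajectory gains variance.
    """
    out = []
    for q in traj:
        J = jacobian(q)
        N = np.eye(len(q)) - np.linalg.pinv(J) @ J  # null-space projector
        out.append(q + N @ rng.normal(scale=scale, size=len(q)))
    return np.asarray(out)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy input motion: linear interpolation between two joint configurations.
    traj = np.linspace([0.1, 0.4, 0.2], [0.6, 0.2, 0.5], 50)
    variant = add_nullspace_variance(traj, rng)

    def fk(q):  # end-effector position, to check the constraint held
        s = np.cumsum(q)
        return np.array([np.sum(np.cos(s)), np.sum(np.sin(s))])

    drift = max(np.linalg.norm(fk(a) - fk(b)) for a, b in zip(variant, traj))
    print(f"max end-effector drift: {drift:.2e}")  # small (first-order) error
```

Repeating the call with different random seeds yields multiple distinct variants of the single input motion, which is the sense in which redundant space can be mined for variance; enforcing additional constraints would amount to enlarging the task Jacobian before projecting.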
