Demonstration-based control of supernumerary robotic limbs

The body representation in the human mind is dynamic, and illusions or traumatic events can modify it to include additional limbs. This remarkable adaptability of the central nervous system to different body configurations opens new possibilities in the field of human augmentation. To fully exploit this potential, we developed a new type of wearable co-robot that can perform tasks in close coordination with its human user. The system, named Supernumerary Robotic Limbs (SRL), consists of two additional robotic arms worn through a backpack-like harness. The SRL can assist the user by holding objects, lifting weights, and streamlining the execution of a task. If the SRL performs movements closely coordinated with the user and exhibits human-like dynamics, it might be incorporated into the body representation and perceived as part of the user's body. As a result, the human could extend their range of available skills and manipulation possibilities, performing tasks more effectively and with less effort. This paper presents a communication, estimation, and control method for the SRL, aimed at performing tasks in tight coordination with the wearer. The SRL observes the user's motion and actively assists by employing a coordinated control algorithm. In particular, skills involving the direct cooperation of two human workers are transferred to the SRL and a single user. Demonstration data from two humans, a leader and an assistant, are analyzed, and a state estimation algorithm is extracted from them; its output is then used to command the SRL end effectors accordingly. A causal relationship between the leader's motion and the assistant's motion is identified using system identification methods. This approach is applied to a drilling operation performed by two workers: an effective coordination skill is identified and transferred to the SRL, making it act like the human assistant.
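As a rough illustration of the kind of causal model such a system-identification step could produce, the sketch below fits a discrete-time ARX model mapping the leader's demonstrated motion to the assistant's, via ordinary least squares. This is not the paper's actual method; the model order, synthetic data, and function names are all illustrative assumptions.

```python
import numpy as np

# Hypothetical ARX model of leader->assistant coordination:
#   y_A[t] = a1*y_A[t-1] + ... + a_na*y_A[t-na]
#          + b1*y_L[t-1] + ... + b_nb*y_L[t-nb]
# fitted from demonstration trajectories by least squares.

def fit_arx(y_leader, y_assist, na=2, nb=2):
    """Least-squares estimate of ARX coefficients (a, b) from demonstrations."""
    n0 = max(na, nb)
    rows, targets = [], []
    for t in range(n0, len(y_assist)):
        past_a = y_assist[t - na:t][::-1]   # y_A[t-1], ..., y_A[t-na]
        past_l = y_leader[t - nb:t][::-1]   # y_L[t-1], ..., y_L[t-nb]
        rows.append(np.concatenate([past_a, past_l]))
        targets.append(y_assist[t])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta[:na], theta[na:]

def predict_step(a, b, recent_assist, recent_leader):
    """One-step prediction of the assistant's next position,
    given the most recent samples (newest first)."""
    return a @ recent_assist + b @ recent_leader

# Synthetic demonstration: the assistant tracks the leader with a one-sample lag.
t = np.linspace(0.0, 10.0, 500)
y_leader = np.sin(t)
y_assist = np.roll(y_leader, 1)
y_assist[0] = 0.0

a, b = fit_arx(y_leader, y_assist)
# Predict the assistant's motion at t = 100 from the two preceding samples.
pred = predict_step(a, b, y_assist[99:97:-1], y_leader[99:97:-1])
```

In a real deployment, the one-step prediction would be re-evaluated at each control cycle from the estimated leader state, giving the SRL a reference trajectory that reproduces the assistant's coordination pattern.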
