Brain Incorporation of Artificial Limbs and Role of Haptic Feedback

Humans can learn to use tools such as hammers, tennis rackets, scalpels, and robotic arms, but the learning curves to proficiency differ widely between tools. Two typical concerns when learning a new tool are the complexity of its manipulation, which determines learning time, and its ergonomics. Here we address these concerns from a cognitive neuroscience perspective: How can we promote fast and natural embodiment of a tool? What are the neuronal mechanisms underlying the quick and “natural” incorporation of a tool into the sensorimotor system, so that proficiency is gained rapidly and efficiently? Answering these questions could have practical benefits, for example in the design of surgical telemanipulators, while also advancing knowledge of sensorimotor control and learning mechanisms, a topic of central interest in neuroprosthetics. We review behavioral and neurophysiological data and show the importance of coherent haptic feedback for the emergence of embodiment. We also present a test platform for studying the mechanisms of incorporation, using advanced haptic interfaces on the master side and virtual-reality (VR) environments on the slave side of a telemanipulator aimed at endoscopic surgery.
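
To make the master-slave architecture concrete, the sketch below outlines a minimal bilateral teleoperation loop: the master-side haptic interface sends its position to the slave-side VR environment, which returns a contact force for the master to render. This is a hedged illustration only; all class and method names (`MasterHaptic`, `SlaveSimulation`, `read_position`, `render_force`) are hypothetical placeholders, not the API of any particular device or of the platform described in the text, and the contact model is a generic penalty-based virtual wall.

```python
import numpy as np

# Minimal sketch of a bilateral (master-slave) teleoperation loop with
# force feedback, assuming a position-forward / force-backward scheme.
# All class and method names are hypothetical placeholders, not the API
# of any specific haptic device or surgical telemanipulator.

class MasterHaptic:
    """Stand-in for a master-side haptic interface."""

    def read_position(self) -> np.ndarray:
        # A real system would query the device encoders here.
        return np.zeros(3)

    def render_force(self, force: np.ndarray) -> None:
        # A real system would command the device motors here.
        pass


class SlaveSimulation:
    """Stand-in for a slave-side VR environment with a simple contact model."""

    def __init__(self, stiffness: float = 500.0):
        self.stiffness = stiffness  # N/m, assumed virtual-wall stiffness

    def step(self, commanded_pos: np.ndarray) -> np.ndarray:
        # Penalty-based contact: push back in proportion to penetration
        # of a virtual wall at z = 0 (a common basic haptic-rendering model).
        penetration = max(0.0, -commanded_pos[2])
        return np.array([0.0, 0.0, self.stiffness * penetration])


def teleoperation_loop(master: MasterHaptic, slave: SlaveSimulation,
                       n_steps: int = 1000) -> None:
    """Run the loop: master position forward, slave contact force back."""
    for _ in range(n_steps):
        pos = master.read_position()   # master -> slave channel
        force = slave.step(pos)        # contact force from the VR scene
        master.render_force(force)     # slave -> master channel


if __name__ == "__main__":
    teleoperation_loop(MasterHaptic(), SlaveSimulation())
```

The point of the single closed loop is that the rendered force stays spatially and temporally consistent with what the operator sees in the VR scene; this kind of coherent haptic feedback is what the reviewed evidence identifies as important for embodiment of the tool.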
