Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task

The effects of real-world tool use on body and space representations are relatively well established in cognitive neuroscience. Several studies have shown, for example, that active tool use facilitates the integration of multisensory information in peripersonal space, i.e., the space immediately surrounding the body. However, it remains unknown to what extent similar mechanisms apply to the use of virtual-robotic tools, such as those used in surgical robotics, where a surgeon may use bimanual haptic interfaces to control a surgical robot at a remote location. This paper presents two experiments in which participants used a haptic handle, originally designed for a commercial surgical robot, to control a virtual tool. The integration of multisensory information related to the virtual-robotic tool was assessed by means of the crossmodal congruency task, in which participants responded to tactile vibrations applied to their fingers while ignoring visual distractors superimposed on the tip of the virtual-robotic tool. Our results show that active virtual-robotic tool use changes the spatial modulation of crossmodal congruency effects, comparable to the changes in the representation of peripersonal space observed during real-world tool use. Moreover, when the virtual-robotic tools were held in a crossed position, the visual distractors interfered more strongly with tactile stimuli delivered to the hand that was connected to them via the tool, reflecting a remapping of peripersonal space. This remapping was observed not only when the virtual-robotic tools were actively used (Experiment 1), but also when participants merely held the tools passively (Experiment 2). The present study extends earlier findings on the extension of peripersonal space from physical and pointing tools to virtual-robotic tools, using techniques from haptics and virtual reality. We discuss our data with respect to learning and human factors in surgical robotics, as well as the use of such new technologies in cognitive neuroscience.
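To make the dependent measure concrete, the sketch below shows how a crossmodal congruency effect (CCE) is conventionally quantified: the difference in performance (here, reaction time) between incongruent and congruent visuotactile trials, with larger values indicating stronger interference from the visual distractors. This is an illustrative example only, not the authors' analysis pipeline; the column names and the simulated trial data are assumptions introduced for the sketch.

```python
import numpy as np
import pandas as pd

# Illustrative sketch of a CCE computation (assumed data layout, not the
# authors' code). Each row is one trial; 'congruent' marks whether the visual
# distractor appeared at the same elevation as the tactile target vibration.
rng = np.random.default_rng(0)
n_trials = 200
trials = pd.DataFrame({
    "congruent": rng.integers(0, 2, n_trials).astype(bool),
    "rt_ms": rng.normal(550, 60, n_trials),   # reaction time in milliseconds
})
# Simulate the typical slowing on incongruent trials.
trials.loc[~trials["congruent"], "rt_ms"] += 40

# CCE = mean RT on incongruent trials minus mean RT on congruent trials.
cce = (trials.loc[~trials["congruent"], "rt_ms"].mean()
       - trials.loc[trials["congruent"], "rt_ms"].mean())
print(f"Crossmodal congruency effect: {cce:.1f} ms")
```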
