Perceptual Docking for Robotic Control

In current robotic surgery, dexterity is enhanced by microprocessor-controlled mechanical wrists that allow motion scaling, reducing gross hand movements and improving the performance of micro-scale tasks. The continuing evolution of the technology, including force feedback and virtual immobilization through real-time motion adaptation, will permit complex procedures such as beating-heart surgery to be carried out under a static frame of reference. As robotic designs become more adaptive and intelligent, the regulatory, ethical and legal barriers imposed on interventional surgical robots create a need for tightly integrated control between the operator and the robot whenever autonomy is considered. This paper outlines the general concept of perceptual docking for robotic control and shows how it can be used for learning and knowledge acquisition in robot-assisted minimally invasive surgery, such that operator-specific motor and perceptual/cognitive behaviour is acquired through in situ sensing. A gaze-contingent framework is presented as an example to illustrate how saccadic eye movements and ocular vergence can be used for attention selection, recovery of 3D tissue deformation, and motor channelling during minimally invasive surgical procedures.
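
To make the vergence-based depth recovery mentioned above concrete, the sketch below triangulates a 3D fixation point from the left- and right-eye gaze rays: the point of ocular vergence is taken as the midpoint of the shortest segment between the two rays. This is a minimal sketch under an assumed binocular geometry with known eye centres and calibrated gaze directions; the function name, the interocular distance, and the least-squares formulation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def fixation_point_from_vergence(p_left, d_left, p_right, d_right):
    """Estimate the 3D fixation point from two gaze rays (hypothetical helper).

    p_left, p_right : (3,) eye (or eye-camera) centre positions
    d_left, d_right : (3,) gaze direction vectors (need not be unit length)

    Returns the midpoint of the shortest segment connecting the two rays,
    a standard least-squares triangulation of the vergence point.
    """
    d_left = d_left / np.linalg.norm(d_left)
    d_right = d_right / np.linalg.norm(d_right)

    # Solve for ray parameters (t, s) minimising |(p_l + t d_l) - (p_r + s d_r)|^2
    A = np.stack([d_left, -d_right], axis=1)      # 3x2 system matrix
    b = p_right - p_left
    t, s = np.linalg.lstsq(A, b, rcond=None)[0]

    closest_left = p_left + t * d_left
    closest_right = p_right + s * d_right
    return 0.5 * (closest_left + closest_right)

# Example with an assumed ~64 mm interocular distance and converging gaze:
p_l = np.array([-0.032, 0.0, 0.0])
p_r = np.array([ 0.032, 0.0, 0.0])
d_l = np.array([ 0.05, 0.0, 1.0])
d_r = np.array([-0.05, 0.0, 1.0])
print(fixation_point_from_vergence(p_l, d_l, p_r, d_r))   # ~[0, 0, 0.64] m
```

In a gaze-contingent setting, the recovered fixation point can then serve as the reference for attention selection or as the anchor of a motor-channelling constraint; the triangulation itself is deliberately simple so that it can run at eye-tracker frame rates.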
