An image-based uterus positioning interface using ADALINE networks for robot-assisted hysterectomy

Surgical manipulators are becoming increasingly common in modern operating theatres. Robots that work side by side with the surgeon and perform supportive tasks are one example. However, allowing the user to control such a robot in a user-friendly manner remains challenging. In this paper, we present our work on an image-based adaptive user interface that lets the hands-busy surgeon control a robot assisting with uterus positioning during laparoscopic hysterectomy. The interface can be operated in two modes: the pick-and-place mode and the command-specifying mode. In the pick-and-place mode, the user specifies the desired starting and ending points of the manipulation with his/her eyes, and the robot drives automatically based on these points; in the command-specifying mode, the user specifies which joint to move, and in which direction, by looking at features on the laparoscopic monitor, after which a driving command is sent to the robot. Details of these two control approaches and experimental results demonstrating how they work in uterus positioning are presented.
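The title names ADALINE networks, which are classically trained with the Widrow–Hoff least-mean-squares (LMS) rule; presumably such a unit adapts the mapping between image features and robot commands. As a minimal sketch of that standard rule only (the class name, learning rate, and target mapping below are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

class Adaline:
    """Adaptive linear neuron trained with the Widrow-Hoff LMS rule."""

    def __init__(self, n_inputs, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.01, size=n_inputs)  # small random weights
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        # ADALINE output is the raw linear combination (no activation)
        return self.w @ x + self.b

    def update(self, x, target):
        # Widrow-Hoff / LMS: step weights along the instantaneous error gradient
        error = target - self.predict(x)
        self.w += self.lr * error * x
        self.b += self.lr * error
        return error
```

With repeated online updates on feature/target pairs, the unit converges to the least-squares linear fit, which is what makes it suitable for adapting an image-feature-to-command mapping online.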
