Collaborative gaze channelling for improved cooperation during robotic assisted surgery

The use of multiple robots for performing complex tasks is becoming common practice in many robotic applications. When different operators are involved, effective cooperation with anticipated manoeuvres is important for seamless, synergistic control of all the end-effectors. In this paper, the concept of Collaborative Gaze Channelling (CGC) is presented for improved control of surgical robots during a shared task. Through eye tracking, the fixations of each operator are monitored and presented in a shared surgical workspace. CGC permits remote or physically separated collaborators to share their intention by visualising the eye gaze of their counterparts, and thus recovers, to a certain extent, the information of mutual intent that we rely upon in a vis-à-vis working setting. In this study, the efficiency of surgical manipulation with and without CGC for controlling a pair of bimanual surgical robots is evaluated by analysing the level of coordination of two independent operators. Fitts' law is used to compare the quality of movement with and without CGC. A total of 40 subjects were recruited for this study, and the results show that the proposed CGC framework yields significant improvement (p < 0.05) on all the motion indices used for quality assessment. This study demonstrates that visual guidance is an implicit yet effective means of communication during collaborative tasks in robotic surgery. Detailed experimental validation results demonstrate the potential clinical value of the proposed CGC framework.

Keywords—Robotic surgery, Human–robot interface, Eye tracking, Perceptual docking, Collaborative surgical task.

INTRODUCTION

In the last two decades, minimally invasive surgery (MIS) has become a mature surgical discipline that reduces scarring, blood loss, and patient recovery time. The introduction of surgical robots has further enhanced the manual dexterity, precision, and ergonomic control of MIS. Master–slave systems such as the da Vinci robot allow remote procedures to be performed, with the surgeon operating through a surgical console that provides magnified 3D vision combined with motion scaling and seamless control of the EndoWrists inside the patient.

Remote collaboration through a common robotic platform has been the main motivation of many early attempts at tele-operation (e.g. Marescaux et al.). Commercial systems such as the da Vinci Si now offer the possibility of two surgeons operating collectively through two separate master consoles to control multiple surgical instruments. Collaborative surgery has several advantages over the conventional master–slave approach, since it allows several expert surgeons with complementary skills to perform a surgical procedure simultaneously. It permits the sharing of expertise and knowledge whilst enabling each surgeon to manage or lead different parts of the procedure. This brings current robotic surgery closer to the traditional workflow and is particularly useful for complex tissue manipulation tasks that are beyond the capability of bimanual control by a single surgeon. The platform also permits remote mentoring or assistance, with which the remote expert surgeon can take over a part of the procedure when it is deemed too difficult for the local surgeon or trainee.

Address correspondence to Ka-Wai Kwok, Department of Computing, Hamlyn Centre for Robotic Surgery, Imperial College London, Bessemer Building, B510 Level 5, South Kensington Campus, London SW7 2BZ, UK. Electronic mail: kkwok@imperial.ac.uk. Ka-Wai Kwok and Loi-Wah Sun—joint first authors.

Annals of Biomedical Engineering, Vol. 40, No. 10, October 2012, pp. 2156–2167. DOI: 10.1007/s10439-012-0578-4
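The abstract states that Fitts' law is used to compare movement quality with and without CGC. As a rough illustration of how such an analysis is typically set up, the Python sketch below computes an index of difficulty and throughput for a single reaching trial; it assumes the Shannon formulation of Fitts' law (as in the Zhai et al. reference below) and uses made-up amplitude, width, and timing values rather than data from this study.

```python
import math

def index_of_difficulty(amplitude, width):
    """Shannon formulation of Fitts' index of difficulty, in bits.

    amplitude: distance to the target centre; width: target size,
    both in the same units (e.g. mm in the robot workspace).
    """
    return math.log2(amplitude / width + 1.0)

def throughput(amplitude, width, movement_time):
    """Throughput in bits/s: index of difficulty divided by the observed movement time."""
    return index_of_difficulty(amplitude, width) / movement_time

# Hypothetical trial: a 5 mm target placed 40 mm away, reached in 1.2 s.
print(throughput(40.0, 5.0, 1.2))  # log2(9) / 1.2 ≈ 2.64 bits/s
```

In practice, the motion indices reported in the paper would be computed over many trials per operator and condition; the sketch only shows the per-trial quantity from which such comparisons are typically built.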

[1] Guang-Zhong Yang, et al. Gaze-Contingent Motor Channelling, haptic constraints and associated cognitive demand for robotic MIS, 2012, Medical Image Analysis.

[2] Guang-Zhong Yang, et al. Collaborative eye tracking: a potential training tool in laparoscopic surgery, 2012, Surgical Endoscopy.

[3] Michael J. Schwartz, et al. Eye Metrics as an Objective Assessment of Surgical Skill, 2010, Annals of Surgery.

[4] Guang-Zhong Yang, et al. Perceptual Docking for Robotic Control, 2008, MIAR.

[5] Laura E. Thomas, et al. Moving eyes and moving thought: On the spatial compatibility between eye movements and cognition, 2007, Psychonomic Bulletin & Review.

[6] S. Tipper, et al. Gaze cueing of attention: visual attention, social cognition, and individual differences, 2007, Psychological Bulletin.

[7] Rajesh Kumar, et al. Mentoring console improves collaboration and teaching in surgical robotics, 2006, Journal of Laparoendoscopic & Advanced Surgical Techniques, Part A.

[8] Daniel C. Richardson, et al. Looking To Understand: The Coupling Between Speakers' and Listeners' Eye Movements and Its Relationship to Discourse Comprehension, 2005, Cognitive Science.

[9] Shumin Zhai, et al. Characterizing computer input with Fitts' law parameters: the information and non-information aspects of pointing, 2004, International Journal of Human-Computer Studies.

[10] H. H. Clark. Pointing and placing, 2003.

[11] Jacques Marescaux, et al. Transatlantic robot-assisted telesurgery, 2001, Nature.

[12] Zenzi M. Griffin, et al. What the eyes say about speaking, 2000, Psychological Science.

[13] J. Kelso, et al. Theoretical concepts and strategies for understanding perceptual-motor skill: from information capacity in closed systems to self-organization in open, nonequilibrium systems, 1992, Journal of Experimental Psychology: General.

[14] Worthy N. Martin, et al. Human-computer interaction using eye-gaze input, 1989, IEEE Transactions on Systems, Man, and Cybernetics.

[15] Stuart K. Card, et al. Evaluation of mouse, rate-controlled isometric joystick, step keys, and text keys, for text selection on a CRT, 1987.

[16] P. Fitts. The information capacity of the human motor system in controlling the amplitude of movement, 1954, Journal of Experimental Psychology.