Development and Initial Clinical Testing of “OPECT”: An Innovative Device for Fully Intangible Control of the Intraoperative Image-Displaying Monitor by the Surgeon

BACKGROUND: During surgery, various images and other relevant visual information are usually displayed at the surgeon's request with the assistance of the operating room staff. This lack of direct control over the display can be a source of stress for surgeons, particularly when fast decision making is needed. OBJECTIVE: To present the development and initial clinical testing of an innovative device that gives surgeons direct, intangible control of the intraoperative image-displaying monitor through standardized free-hand movements. METHODS: The originally developed intangible interface, named "OPECT", is based on the commercially available KINECT gaming controller (Microsoft) and a dedicated action-recognition algorithm. The device does not require any sensors or markers to be fixed on the hands. Testing was performed during 30 neurosurgical operations. After each procedure, the surgeons completed a 5-item questionnaire evaluating system performance, rating each parameter from 1 (bad) to 5 (excellent). RESULTS: During the surgical procedures, OPECT demonstrated high effectiveness and simplicity of use, excellent quality of the visualized graphics, and precise recognition of the individual user profile. In all cases, the surgeons were well satisfied with the performance of the device. The mean questionnaire score was 4.7 ± 0.2. CONCLUSION: OPECT enables the surgeon to easily exercise intangible control of the intraoperative image monitor using standardized free-hand movements. The system has promising potential for various kinds of distant manipulation of displayed visual information during human activities.
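
As a rough illustration of how a Kinect-style hand-tracking stream could be mapped to monitor commands, the sketch below implements a simple horizontal-swipe detector that advances or rewinds a displayed image series. It is a minimal, hypothetical example and not the authors' OPECT algorithm: the HandFrame structure, the detect_swipe function, and the thresholds are assumptions, and a synthetic frame sequence stands in for a real depth-sensor SDK.

# Minimal sketch of a touch-free "swipe to change image" controller.
# Not the OPECT implementation; it only illustrates mapping a hand-tracking
# stream (e.g., from a Kinect-style depth sensor) to image-viewer commands.
# The hand-position source below is a synthetic stand-in for a real sensor SDK.

from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class HandFrame:
    t: float  # timestamp in seconds
    x: float  # normalized horizontal hand position, 0.0 (left) to 1.0 (right)

def detect_swipe(frames: Iterable[HandFrame],
                 min_travel: float = 0.35,
                 max_duration: float = 0.6) -> Optional[str]:
    """Return 'NEXT' or 'PREV' if the hand travels far enough, fast enough."""
    frames = list(frames)
    for i, start in enumerate(frames):
        for end in frames[i + 1:]:
            if end.t - start.t > max_duration:
                break  # too slow to count as a swipe from this start frame
            travel = end.x - start.x
            if travel >= min_travel:
                return "NEXT"   # left-to-right swipe: show next image
            if travel <= -min_travel:
                return "PREV"   # right-to-left swipe: show previous image
    return None

if __name__ == "__main__":
    # Synthetic frames imitating a quick left-to-right hand motion.
    swipe_right = [HandFrame(t=0.05 * i, x=0.2 + 0.1 * i) for i in range(6)]
    print(detect_swipe(swipe_right))  # -> NEXT: advance to the next image

A real markerless system would add hand segmentation, per-user calibration (the "individual user profile" mentioned above), and debouncing so that ordinary surgical movements are not misread as commands.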
