An adaptive solution for intra-operative gesture-based human-machine interaction

Computerized medical systems play a vital role in the operating room; however, sterility requirements and the interventional workflow often make direct interaction with these devices challenging for surgeons. Typical workarounds, such as delegating physical control of the keyboard and mouse to an assistant, add an undesirable level of indirection. We present a touchless, gesture-based interaction framework for the operating room that lets surgeons define a personalized set of gestures for controlling arbitrary computerized medical systems. Instead of capturing gestures with cameras, we rely on a few wireless inertial sensors placed on the surgeon's arms, eliminating any dependence on illumination and line of sight. A discriminative gesture-recognition approach based on kernel regression allows us to simultaneously classify performed gestures and track the relative spatial pose within each gesture, giving surgeons fine-grained control over continuous parameters. An extensible software architecture enables dynamic association of learned gestures with arbitrary intra-operative computerized systems. Our experiments illustrate the performance of the approach and support its practical applicability.
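The abstract does not spell out the regression formulation, but the idea of jointly classifying a gesture and regressing the relative pose within it can be sketched with a simple Nadaraya-Watson kernel estimator: given labeled inertial feature vectors, the class with the largest kernel weight mass is chosen, and the within-gesture phase is the kernel-weighted mean of the training phases of that class. All function and variable names below are illustrative, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, X, h):
    # Gaussian weights from query feature vector x to each training sample (row of X).
    d2 = np.sum((X - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * h ** 2))

def classify_and_track(x, X, labels, phases, h=0.1):
    """Jointly classify a gesture and regress its relative phase in [0, 1].

    X:      (n, d) training feature vectors (e.g. from inertial sensors)
    labels: (n,) gesture class of each training sample
    phases: (n,) relative progress of each sample within its gesture
    """
    w = gaussian_kernel(x, X, h)
    classes = np.unique(labels)
    # Classification: class with the largest total kernel weight.
    mass = np.array([w[labels == c].sum() for c in classes])
    c = classes[np.argmax(mass)]
    # Regression: kernel-weighted mean phase within the winning class.
    wc = w[labels == c]
    phase = float(np.dot(wc, phases[labels == c]) / (wc.sum() + 1e-12))
    return c, phase
```

On synthetic data where two gestures trace orthogonal line segments in feature space, a query halfway along one segment is assigned that gesture with a phase estimate near 0.5, illustrating the simultaneous classify-and-track behavior the abstract describes.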
