Evaluation of gesture-based interfaces for medical volume visualization tasks

Physicians routinely use volumetric datasets for medical assessment, diagnosis, and treatment. These datasets can be rendered as 3D visualizations that let physicians study both overall shape and internal anatomical structures. Gesture-based interfaces can be beneficial for interacting with such visualizations in a variety of medical settings. We conducted two user studies that explore different gesture-based interfaces for interaction with volume visualizations. The first experiment focused on rotation tasks, comparing the performance of a gesture-based interface (using the Microsoft Kinect) against the mouse. The second experiment studied localization of internal structures, comparing slice-based visualizations controlled by gestures and by the mouse, as well as a gesture-based 3D Magic Lens visualization. In the orientation-matching task, the gesture-based interface outperformed the traditional mouse in both time and accuracy. In the second experiment, the traditional mouse was the more accurate interface, but the gesture-based Magic Lens yielded the fastest target localization times. We discuss these findings and their implications for the use of gesture-based interfaces in medical volume visualization.
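The abstract does not specify how tracked hand motion was mapped to volume rotation; a common way to realize such a rotation task is a virtual-trackball mapping that turns successive hand positions into incremental rotations. The Python sketch below illustrates that mapping under this assumption; all names are hypothetical, and the normalized hand positions would come from a tracker such as the Kinect skeleton's hand joint rather than the hard-coded samples used here.

    import numpy as np

    def trackball_vector(x, y, radius=1.0):
        # Project a normalized 2D hand position onto a virtual sphere
        # (sphere near the center, hyperbolic sheet toward the edges).
        d2 = x * x + y * y
        if d2 <= radius * radius / 2.0:
            z = np.sqrt(radius * radius - d2)
        else:
            z = radius * radius / (2.0 * np.sqrt(d2))
        v = np.array([x, y, z])
        return v / np.linalg.norm(v)

    def rotation_from_hand_move(p0, p1):
        # Axis-angle rotation matrix from two successive hand samples.
        # p0, p1: (x, y) hand positions normalized to [-1, 1] screen space,
        # e.g. derived from a Kinect hand joint (hypothetical input source).
        v0, v1 = trackball_vector(*p0), trackball_vector(*p1)
        axis = np.cross(v0, v1)
        s = np.linalg.norm(axis)
        if s < 1e-9:
            return np.eye(3)  # negligible motion: identity rotation
        axis /= s
        angle = np.arccos(np.clip(np.dot(v0, v1), -1.0, 1.0))
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        # Rodrigues' rotation formula.
        return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

    # Accumulate per-frame rotations into the volume's model orientation.
    R = np.eye(3)
    hand_samples = [(0.00, 0.00), (0.10, 0.05), (0.20, 0.12)]
    for p0, p1 in zip(hand_samples, hand_samples[1:]):
        R = rotation_from_hand_move(p0, p1) @ R

Accumulating these per-frame rotations gives the kind of continuous, clutch-free control that the study compares against mouse-driven rotation.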
