Gesture Interaction and Evaluation Using the Leap Motion for Medical Visualization

In this paper, we present and evaluate an interactive, gesture-controlled application using the Leap Motion for medical visualization, focusing on user satisfaction as a key component of the application's success. Usability tests were conducted to verify important application requirements, including asepsis of the working environment, accuracy of the interaction gestures, interaction time, level of interactivity, naturalness, effectiveness, ease of use and of learning, visual quality of the interface, utility, satisfaction, and the absence of physical and mental fatigue. The results show that the application is effective at recognizing the modeled gestures and that participants reported a very high level of overall satisfaction, indicating its strong potential as a touchless support tool for medical tasks guided by radiological images in operating rooms.
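To illustrate how a touchless viewer of this kind can consume Leap Motion data, the sketch below assumes the legacy Leap Motion SDK v2 Python bindings and maps the SDK's built-in swipe and circle gestures to hypothetical viewer actions (slice scrolling and zoom). The gesture-to-action mapping and the `scroll_slice`/`zoom` hooks are illustrative assumptions, not the gesture set modeled in the paper.

```python
import sys
import Leap  # legacy Leap Motion SDK v2 Python bindings (assumed installed)


class GestureListener(Leap.Listener):
    """Maps built-in Leap gestures to hypothetical image-viewer callbacks."""

    def on_connect(self, controller):
        # Built-in gesture recognizers must be enabled explicitly.
        controller.enable_gesture(Leap.Gesture.TYPE_SWIPE)
        controller.enable_gesture(Leap.Gesture.TYPE_CIRCLE)

    def on_frame(self, controller):
        frame = controller.frame()
        for gesture in frame.gestures():
            if gesture.type == Leap.Gesture.TYPE_SWIPE:
                swipe = Leap.SwipeGesture(gesture)
                # Horizontal swipe direction chooses the scroll direction (assumption).
                self.scroll_slice(+1 if swipe.direction.x > 0 else -1)
            elif gesture.type == Leap.Gesture.TYPE_CIRCLE:
                circle = Leap.CircleGesture(gesture)
                # Circle progress (number of turns) drives a zoom step (assumption).
                self.zoom(1.0 + 0.1 * circle.progress)

    # Hypothetical viewer hooks; a real application would call its imaging backend here.
    def scroll_slice(self, step):
        print("scroll slice by", step)

    def zoom(self, factor):
        print("zoom by factor", factor)


def main():
    listener = GestureListener()
    controller = Leap.Controller()
    controller.add_listener(listener)
    print("Press Enter to quit...")
    try:
        sys.stdin.readline()
    finally:
        controller.remove_listener(listener)


if __name__ == "__main__":
    main()
```

A custom, application-specific gesture vocabulary, as evaluated in the paper, would instead process raw hand and finger data from each frame rather than relying on the SDK's built-in gesture types.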
