Touchless interaction with medical images based on 3D hand cursors supported by single-foot input: A case study in dentistry

[1] Gavin Doherty et al. Touchless computer interfaces in hospitals: A review. Health Informatics J., 2018.

[2] Joaquim A. Jorge et al. FEETICHE: FEET Input for Contactless Hand gEsture Interaction. VRCAI, 2019.

[3] David B. Douglas et al. A systematic review of 3D cursor in the medical literature. 2018.

[4] Norbert Elkmann et al. GazeTap: towards hands-free interaction in the operating room. ICMI, 2017.

[5] Joaquim A. Jorge et al. On the utility of 3D hand cursors to explore medical volume datasets with a touchless interface. J. Biomed. Informatics, 2017.

[6] Gabriel Zachmann et al. Virtual Reality for User-Centered Design and Evaluation of Touch-free Interaction Techniques for Navigating Medical Images in the Operating Room. CHI Extended Abstracts, 2017.

[7] Patrick Saalfeld et al. Comparison of gesture and conventional interaction techniques for interventional neuroradiology. International Journal of Computer Assisted Radiology and Surgery, 2017.

[8] Ron Kikinis et al. Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction. International Journal of Computer Assisted Radiology and Surgery, 2017.

[9] Christian Hansen et al. Touchless interaction with software in interventional radiology and surgery: a systematic literature review. International Journal of Computer Assisted Radiology and Surgery, 2017.

[10] Daniel Mendes et al. The benefits of DOF separation in mid-air 3D object manipulation. VRST, 2016.

[11] Wan Nurazreena Wan Hassan et al. User acceptance of a touchless sterile system to control virtual orthodontic study models. American Journal of Orthodontics and Dentofacial Orthopedics, 2016.

[12] Simon Weidert et al. Device- and system-independent personal touchless user interface for operating rooms. International Journal of Computer Assisted Radiology and Surgery, 2016.
[13] G. Guimarães et al. A new gesture-controlled tool using three-dimensional reconstruction of renovascular-collecting system-tumor anatomy to assist navigation of kidney during “zero ischemia” minimally invasive nephron sparing surgery in high complex renal cancer. 2016.

[14] Daniel Vogel et al. The performance of indirect foot pointing using discrete taps and kicks while standing. Graphics Interface, 2015.

[15] Maria A. Mora et al. Software tools and surgical guides in dental-implant-guided surgery. Dental Clinics of North America, 2014.

[16] Guillermo M. Rosa et al. Use of a gesture user interface as a touchless image navigation system in dental surgery: Case series report. Imaging Science in Dentistry, 2014.

[17] Hans-Werner Gellersen et al. Feet movement in desktop 3D interaction. IEEE Symposium on 3D User Interfaces (3DUI), 2014.

[18] Juan Pablo Wachs et al. Context-based hand gesture recognition for the operating room. Pattern Recognit. Lett., 2014.

[19] Kenton O'Hara et al. Touchless interaction in surgery. CACM, 2014.

[20] Thomas Pederson et al. Touch-less interaction with medical images using hand & foot gestures. UbiComp, 2013.

[21] Raimund Dachselt et al. Gaze-supported foot interaction in zoomable information spaces. CHI Extended Abstracts, 2013.

[22] Karthik Ramani et al. Shape-It-Up: Hand gesture based creative expression of 3D shapes using intelligent generalized cylinders. Comput. Aided Des., 2013.

[23] Stepán Obdrzálek et al. Accuracy and robustness of Kinect pose estimation in the context of coaching of elderly population. Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2012.

[24] Hongan Wang et al. Foot Menu: Using Heel Rotation Information for Menu Selection. 15th Annual International Symposium on Wearable Computers, 2011.

[25] Andrew Wilson et al. Data miming: inferring spatial object descriptions from human gesture. CHI, 2011.

[26] Kenton O'Hara et al. Exploring the potential for touchless interaction in image-guided interventional radiology. CHI, 2011.

[27] Jeremy Scott et al. Sensing foot gestures from the pocket. UIST, 2010.

[28] Patrick Baudisch et al. Multitoe: high-precision interaction with back-projected floors based on high-resolution multi-touch input. UIST, 2010.

[29] Stephen A. Brewster et al. Foot tapping for mobile interaction. BCS HCI, 2010.

[30] Benjamin B. Bederson et al. Multi-modal text entry and selection on a mobile device. Graphics Interface, 2010.

[31] Patrick Reuter et al. The activation of modality in virtual objects assembly. Journal on Multimodal User Interfaces, 2010.

[32] Johannes Schöning et al. Using hands and feet to navigate and manipulate spatial data. CHI Extended Abstracts, 2009.

[33] A. F. Rovers et al. Guidelines for haptic interpersonal communication applications: an exploration of foot interaction styles. Virtual Reality, 2006.

[34] R. Marklin et al. Working postures of dentists and dental hygienists. Journal of the California Dental Association, 2005.

[35] Roope Raisamo et al. Appropriateness of foot interaction for non-accurate spatial tasks. CHI EA '04, 2004.

[36] D. Roessler. Complete denture success for patients and dentists. International Dental Journal, 2003.

[37] George W. Fitzmaurice et al. Exploring interactive curve and surface manipulation using a bend and twist sensitive input strip. SI3D, 1999.

[38] P. Ashworth et al. Psychological effects of aesthetic dental treatment. Journal of Dentistry, 1998.

[39] G. Pearson et al. Of moles and men: the design of foot controls for workstations. CHI '86, 1986.

[40] N. Charlton. Orthodontic study models. British Dental Journal, 1985.