Unused information: Detecting and applying eye contact data in computerized healthcare systems

Medical computing systems rely primarily on traditional human-computer interfaces such as the keyboard, mouse, and touch screen; however, future systems will incorporate vastly enhanced interaction capabilities. Some of these, such as speech control and eye contact sensing, have begun to appear on the medical computing landscape. Eye contact provides computer systems with a wealth of as-yet-uncollected information about user attention and attentiveness, and may enable personalized interfaces while requiring almost no training to use. This paper introduces an advanced prototype of a gaze-enhanced speech recognition charting system for surgical nurses. We then discuss the implications of our system, and of gaze detection in general, for medical computing.
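
As a rough illustration of the gaze-gated interaction pattern the abstract describes, the sketch below shows how eye contact sensing might gate a speech recognizer so that only utterances spoken while the nurse looks at the charting display are recorded. The paper does not specify an implementation; every class name, threshold, and phrase here is a hypothetical placeholder, not the authors' system.

```python
# Hypothetical sketch of gaze-gated speech charting; not the authors' code.
# EyeContactSensor and SpeechRecognizer are stand-ins for real hardware/ASR.
import random
import time


class EyeContactSensor:
    """Placeholder eye-contact sensor: simulates whether the user is
    currently looking at the charting display."""

    def is_looking_at_display(self):
        return random.random() > 0.5  # stand-in for real gaze detection


class SpeechRecognizer:
    """Placeholder speech recognizer: returns canned charting phrases
    instead of decoding real audio."""

    PHRASES = ["sponge count correct", "estimated blood loss 200 ml", None]

    def listen(self):
        return random.choice(self.PHRASES)


def charting_loop(cycles=10):
    """Accept a spoken charting entry only while eye contact is detected,
    so incidental operating-room conversation is not charted by mistake."""
    sensor, recognizer, chart = EyeContactSensor(), SpeechRecognizer(), []
    for _ in range(cycles):
        if sensor.is_looking_at_display():
            phrase = recognizer.listen()
            if phrase:
                chart.append(phrase)
        time.sleep(0.1)  # polling interval; arbitrary for this sketch
    return chart


if __name__ == "__main__":
    print(charting_loop())
```

The key design point is that gaze acts as an implicit "push-to-talk" switch, requiring no extra training from the user.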