Can we do without GUIs? Gesture and speech interaction with a patient information system

We have developed a gesture input system that provides a common interaction technique across mobile, wearable and ubiquitous computing devices of diverse form factors. In this paper, we combine this gestural input technique with speech output and test whether the absence of a visual display impairs usability in this form of multimodal interaction. The question is of particular relevance to mobile, wearable and ubiquitous systems, where visual displays may be restricted or unavailable. We conducted the evaluation using a prototype system that combines gesture input and speech output to provide information to patients in a hospital Accident and Emergency Department. One group of participants was instructed to access various services using gestural input; the services were delivered by automated speech output. Throughout their tasks, these participants could see a visual display on which a GUI presented the available services and their corresponding gestures. A second group performed the same tasks without this visual display. We predicted that participants without the visual display would make more incorrect gestures and take longer to perform correct gestures than participants with it. In fact, we found no significant difference in the number of incorrect gestures, and participants with the visual display took longer than those without it. We suggest that, for a small set of semantically distinct services with memorable and distinct gestures, the absence of a GUI visual display does not impair the usability of a system with gesture input and speech output.