Usability analysis of a pointing gesture interface

The low cost of vision sensors and the increasing computational power of off-the-shelf computers create the conditions for introducing vision-based human-computer interaction techniques into the domestic environment. In this paper we report empirical data on the usability of a pointing device based on gesture languages recognized by simple vision techniques. The results show that users learn to apply the gesture pointing language more quickly than a laptop's touchpad in a multidirectional point-selection task. We discuss the implications of this finding for an applied context, such as managing a common windows-based operating system.
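
Multidirectional point-selection tasks of this kind are conventionally analyzed with Fitts' law. As a minimal sketch of that analysis, assuming the Shannon formulation of the index of difficulty (the paper does not specify its formulation, and the trial values below are hypothetical):

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), where D is the movement distance to the
    target and W is the target width."""
    return math.log2(distance / width + 1.0)

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput for one trial, in bits per second: ID / MT."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical trials: (distance in px, target width in px, movement time in s)
trials = [(400, 40, 1.2), (400, 40, 1.0), (200, 20, 0.9)]
for d, w, t in trials:
    print(f"D={d} W={w}: ID={index_of_difficulty(d, w):.2f} bits, "
          f"TP={throughput(d, w, t):.2f} bits/s")
```

Comparing throughput (and its change over repeated blocks of trials) between the gesture language and the touchpad is one standard way to quantify the learning-rate difference the abstract reports.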
