An Experimental Multimedia System Allowing 3-D Visualization and Eye-Controlled Interaction Without User-Worn Devices

This paper proposes a new kind of human-computer interface that combines three-dimensional (3-D) visualization of multimedia objects with eye-controlled interaction. To explore the advantages and limitations of the concept, a prototype system has been set up. The testbed comprises a visual operating system that integrates novel forms of interaction into a 3-D graphical user interface, autostereoscopic (free-viewing) 3-D displays closely adapted to the mechanisms of binocular vision, and solutions for nonintrusive eye-controlled interaction (video-based head and gaze tracking). The paper reviews the system's key components and outlines several applications implemented for user testing. Preliminary results show that most users are impressed by the 3-D graphical user interface and by the possibility of communicating with a computer simply by looking at the object of interest. At the same time, the results underline the need for a more intelligent interface agent that avoids misinterpreting the user's eye-controlled input and can cancel unintended actions.
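The risk of misinterpreting eye input mentioned above is often called the "Midas touch" problem: every glance is potentially a command. A common mitigation, sketched below, is dwell-time selection, where an object is activated only after the gaze rests on it for a minimum duration. This is a minimal illustrative sketch, not the paper's actual implementation; all names and the threshold value are assumptions.

```python
# Illustrative dwell-time gaze selection, a standard way to reduce
# unintended activations ("Midas touch") in gaze-controlled interfaces.
# The class name, API, and threshold are hypothetical, not taken from
# the prototype described in the paper.

DWELL_THRESHOLD = 0.6  # seconds of sustained fixation before activation


class DwellSelector:
    def __init__(self, threshold=DWELL_THRESHOLD):
        self.threshold = threshold
        self.current_target = None  # object currently under the gaze
        self.dwell_time = 0.0       # accumulated fixation time on it

    def update(self, gazed_object, dt):
        """Feed one gaze sample; return the object to activate, or None.

        gazed_object: the UI object under the estimated point of gaze
                      (None if the gaze misses every object).
        dt: time elapsed since the previous sample, in seconds.
        """
        if gazed_object != self.current_target:
            # Gaze moved to a different object: restart the dwell timer.
            self.current_target = gazed_object
            self.dwell_time = 0.0
            return None
        if gazed_object is None:
            return None
        self.dwell_time += dt
        if self.dwell_time >= self.threshold:
            # Fire once, then require a fresh fixation before re-firing,
            # so a long stare does not trigger repeated activations.
            self.current_target = None
            self.dwell_time = 0.0
            return gazed_object
        return None
```

In practice the dwell threshold trades off speed against false activations; the interface agent discussed in the paper would adapt this kind of decision with richer context than a fixed timer.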
