Finger tracking for interaction in augmented environments

Optical tracking systems allow high-precision three-dimensional input for virtual environment applications without the encumbrance of cables. They also enable spontaneous and intuitive interaction through gestures. We present a finger tracker that supports gestural interaction and is simple, cheap, fast, robust against occlusion, and accurate. It is based on a marked glove, a stereoscopic tracking system, and a kinematic 3D model of the human finger. Within our augmented reality application scenario, the user can grab, translate, rotate, and release objects in an intuitive way. We demonstrate the tracking system in an augmented reality chess game in which a user interacts with virtual objects.
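The abstract gives no implementation details, but the core of any stereoscopic marker tracker of this kind is recovering a 3D marker position from its two image projections. A minimal sketch of standard linear (DLT) triangulation is shown below; the projection matrices, function name, and coordinates are illustrative assumptions, not the authors' code.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one glove marker.

    P1, P2 : 3x4 camera projection matrices of the stereo pair
    x1, x2 : (u, v) pixel coordinates of the marker in each image
    Returns the 3D point in world coordinates.
    """
    # Each view contributes two linear constraints on the
    # homogeneous 3D point X: u*(p3.X) - p1.X = 0, v*(p3.X) - p2.X = 0
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the
    # smallest singular value (last row of Vt).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize
```

With per-frame marker positions from this step, a kinematic finger model can then be fitted to the triangulated points to recover joint configuration, as the paper's approach implies.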
