VizWear: Toward Human-Centered Interaction through Wearable Vision and Visualization

In this paper, we describe the development of a family of wearable systems that we collectively term VizWear. Vision plays an important role in how both people and computers understand contextual information, and augmented reality (AR) techniques offer an intuitive way to present that information. These observations form the basis of our research on wearable computer vision and visualization systems. Our wearable systems run a variety of vision tasks in real time. We describe a novel approach that not only senses the wearer's position and direction but also displays video frames overlaid with 2-D annotations related to the wearer's view. We have also developed a method for 3-D graphical overlay that applies object recognition techniques, and the Hand Mouse, which lets the wearer interact directly with an AR environment. Finally, we describe an efficient method of face registration using wearable active vision.
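The Hand Mouse combines a per-pixel hand-color classifier with mean-shift tracking. As an illustration only (not the paper's implementation), the sketch below runs mean-shift on a synthetic "hand probability" map of the kind a GMM skin-color classifier might produce; the image size, blob position, and window radius are arbitrary assumptions.

```python
# Illustrative sketch: mean-shift window tracking over a per-pixel
# "hand probability" map. A GMM skin-colour classifier would normally
# produce this map from camera pixels; here it is a synthetic Gaussian blob.
import math

W, H = 120, 90          # assumed image size (pixels)
CX, CY = 80, 30         # true centre of the synthetic "hand" blob

def hand_prob(x, y):
    """Stand-in for a GMM skin-colour likelihood at pixel (x, y)."""
    return math.exp(-((x - CX) ** 2 + (y - CY) ** 2) / (2 * 10.0 ** 2))

def mean_shift(x0, y0, radius=15, iters=20, eps=0.5):
    """Repeatedly move a circular window to the weighted centroid of the
    probability mass inside it, until the shift falls below eps."""
    x, y = float(x0), float(y0)
    for _ in range(iters):
        sw = sx = sy = 0.0
        for px in range(max(0, int(x - radius)), min(W, int(x + radius) + 1)):
            for py in range(max(0, int(y - radius)), min(H, int(y + radius) + 1)):
                if (px - x) ** 2 + (py - y) ** 2 <= radius ** 2:
                    w = hand_prob(px, py)
                    sw += w
                    sx += w * px
                    sy += w * py
        if sw == 0:
            break               # no probability mass under the window
        nx, ny = sx / sw, sy / sw
        if math.hypot(nx - x, ny - y) < eps:
            return nx, ny       # converged
        x, y = nx, ny
    return x, y

# Start the window near, but not on, the hand; it climbs to the blob peak.
x, y = mean_shift(70, 40)
```

Because mean-shift only needs the probability map inside the current window, it is cheap enough for the real-time, low-power setting that a wearable system demands.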
