Toward Spontaneous Interaction with the Perceptive Workbench

Users interact with computers mostly by using wire-based devices. Typically, the wires limit the distance of movement and inhibit freedom of orientation. In addition, most interactions are indirect: the user moves a device as an analog for the action created in the display space. We envision an untethered interface that accepts gestures directly and can accept any objects we choose as interactors.

In conventional 3D interaction, the devices that track position and orientation are still usually tethered to the machine by wires. Devices such as pinch gloves, which permit the user to experience a more natural interface, often don't perform as well; users generally prefer simple handheld devices with buttons. Pinch gloves carry assumptions about the position of the user's hand and fingers with respect to the tracker. Of course, users' hands differ in size and shape, so the assumed tracker position must be recalibrated for each user. This is hardly ever done. The glove interface also causes subtle changes to recognized hand gestures. The result is that fine manipulations can be imprecise, and the user comes away with the feeling that the interaction is slightly off in an indeterminate way. If we recognize gestures directly, we can take the differences in hand size and shape into account.

An additional problem is that any device held in the hand can become awkward while gesturing. We've found this even with a simple pointing device, such as a stick with a few buttons (see Figure 1). Also, unless fairly skilled, a user often has to pause to identify and select buttons on the stick. With accurately tracked hands, most of this awkwardness disappears. We're adept at pointing in almost any direction and can quickly pinch fingers, for example, without looking at them.

Finally, physical objects are often natural interactors (such as phicons). However, with current systems these objects must be inserted in advance or specially prepared. We'd like the system to accept objects that we choose spontaneously for interaction.

In this article we discuss methods for producing more seamless interaction between the physical and virtual environments through the Perceptive Workbench. We applied the system to an augmented reality game and a terrain navigation system. The Perceptive Workbench can reconstruct 3D virtual representations of previously unseen real-world objects placed on its surface. In addition, it identifies and tracks such objects as they're manipulated on the desk's surface, and it allows the user to interact with the augmented environment through 2D and 3D gestures. These gestures can be made on the plane of the desk's surface or in the 3D space above the desk. Taking its cue from the user's actions, the Perceptive Workbench switches between these modes automatically. Computer vision controls all interaction, freeing the user from the wires of traditional sensing techniques.
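The closing paragraph describes the vision pipeline only at a high level. As a rough illustration of the kind of processing involved, the following is a minimal sketch of one conventional way to segment and track objects placed on a desk surface: background subtraction against a fixed camera view, followed by contour extraction. The single-camera setup, the use of OpenCV, and the function names (`make_background_model`, `track_objects`) are assumptions made for illustration, not the Perceptive Workbench's actual implementation.

```python
# Hypothetical sketch of silhouette-based desk-surface tracking.
# Assumes a single fixed camera viewing the desk and OpenCV; the
# real Perceptive Workbench hardware and pipeline may differ.
import cv2
import numpy as np

def make_background_model(frames):
    """Median over several empty-desk frames approximates the static background."""
    return np.median(np.stack(frames), axis=0).astype(np.uint8)

def track_objects(frame, background, threshold=30, min_area=500):
    """Segment objects placed on the desk and return their image-plane centroids."""
    # Difference against the empty-desk model isolates new objects.
    diff = cv2.absdiff(frame, background)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    # Morphological opening suppresses small noise blobs in the silhouette mask.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue  # ignore blobs too small to be objects
        m = cv2.moments(contour)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```

Associating centroids across successive frames yields the 2D tracking the article mentions; the same silhouettes, captured from multiple viewpoints, are the standard input to shape-from-silhouette methods for reconstructing 3D representations of unseen objects.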
