A handle bar metaphor for virtual object manipulation with mid-air interaction

Commercial 3D scene acquisition systems such as the Microsoft Kinect sensor can reduce the cost barrier of realizing mid-air interaction. However, because such a sensor can robustly track hand position but not hand orientation, current mid-air interaction methods for 3D virtual object manipulation often require context and mode switching to perform translation, rotation, and scaling, which prevents natural, continuous gestural interaction. A novel handle bar metaphor is proposed as an effective visual control mapping between the user's hand gestures and the corresponding virtual object manipulation operations. It mimics the familiar situation of handling objects skewered on a bimanual handle bar. Designing the mid-air interaction around the relative 3D motion of the two hands provides precise controllability despite the Kinect sensor's low image resolution. A comprehensive repertoire of 3D manipulation operations is proposed to manipulate single objects, perform fast constrained rotation, and pack/align multiple objects along a line. Three user studies were devised to demonstrate the efficacy and intuitiveness of the proposed interaction techniques in different virtual manipulation scenarios.
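The mapping from bimanual hand motion to object transforms can be made concrete with a small sketch. The Python snippet below is a minimal illustration, not the authors' implementation; the function name handle_bar_update and the use of frame-to-frame hand positions from a depth-sensor skeleton are assumptions. It shows one plausible handle-bar-style mapping: translation from the motion of the midpoint between the two hands, uniform scaling from the change in hand separation, and rotation from the change in the bar's direction.

```python
# Illustrative sketch only (not the paper's implementation). Assumes two 3D
# hand positions per frame, e.g. from a Kinect skeleton, expressed in the
# same coordinate frame as the virtual scene.
import numpy as np

def handle_bar_update(left_prev, right_prev, left_cur, right_cur):
    """Return (translation, rotation_matrix, scale_factor) for one frame step."""
    left_prev, right_prev = np.asarray(left_prev, float), np.asarray(right_prev, float)
    left_cur, right_cur = np.asarray(left_cur, float), np.asarray(right_cur, float)

    # Translation: motion of the handle bar's midpoint.
    translation = 0.5 * (left_cur + right_cur) - 0.5 * (left_prev + right_prev)

    # Scale: change in hand separation (the bar's length).
    bar_prev = right_prev - left_prev
    bar_cur = right_cur - left_cur
    scale = np.linalg.norm(bar_cur) / max(np.linalg.norm(bar_prev), 1e-9)

    # Rotation: minimal rotation aligning the previous bar axis with the
    # current one (axis-angle via Rodrigues' formula). Twist about the bar
    # axis is not observable from hand positions alone.
    u = bar_prev / max(np.linalg.norm(bar_prev), 1e-9)
    v = bar_cur / max(np.linalg.norm(bar_cur), 1e-9)
    axis = np.cross(u, v)
    s, c = np.linalg.norm(axis), float(np.dot(u, v))  # sin and cos of the angle
    if s < 1e-9:
        rotation = np.eye(3)  # bar direction unchanged
    else:
        k = axis / s
        K = np.array([[0, -k[2], k[1]],
                      [k[2], 0, -k[0]],
                      [-k[1], k[0], 0]])
        rotation = np.eye(3) + s * K + (1.0 - c) * (K @ K)

    return translation, rotation, scale

# Example: hands move up slightly while spreading apart.
t, R, sc = handle_bar_update([-0.2, 0, 1], [0.2, 0, 1], [-0.3, 0.05, 1], [0.3, 0.05, 1])
```

One design consequence of such a position-only mapping is visible in the sketch: rotation about the bar's own axis cannot be recovered from two point positions, which is consistent with the paper's motivation for dedicated constrained-rotation gestures.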
