GestureVR: vision-based 3D hand interface for spatial interaction

We describe a novel multi-dimensional hand gesture interface system and its use in interactive spatial applications. The system acquires input data from two cameras that look at the user's hand, recognizes three gestures and tracks the hand in 3D space at the rate of 60 Hz. Five spatial parameters (position and orientation in 3D) are computed for the index finger and the thumb, which gives the user simultaneous control of up to ten parameters of an application. We describe some of the applications that have been constructed to demonstrate the capabilities of this system. They include an interface to a video game, piloting a virtual fly-through over terrain by hand pointing, interacting with a 3D scene editor by "grasping" and moving objects in space, and a partial control of an articulated model of a human hand. The gesture interface makes the control of these applications very intuitive, and simpler than using the current input devices.

1 Introduction

Visual output interfaces that are better than those shown in the movie Jurassic Park are now feasible with home computers. However, their utilization in multimedia systems is limited by input devices that do not have enough degrees of freedom, or those which constrain the user by a need to wear a glove. The technology that promises resolution of this bottleneck is optical. Using video cameras to capture hand images can provide an adequate number of parameters for a dexterous interface and untie the user from mechanical devices.
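To make the interface described above concrete, the sketch below models the per-finger output the paper attributes to the tracker: five spatial parameters (3D position plus two orientation angles) for each of the index finger and the thumb, flattened into the up-to-ten simultaneous control parameters an application can consume. The class and field names are illustrative assumptions, not taken from the original system.

```python
from dataclasses import dataclass


@dataclass
class FingerPose:
    """Five spatial parameters per finger, as reported at 60 Hz."""
    x: float          # 3D position
    y: float
    z: float
    azimuth: float    # pointing direction, horizontal angle (radians)
    elevation: float  # pointing direction, vertical angle (radians)


@dataclass
class HandState:
    gesture: str      # one of the three recognized gestures, e.g. "point"
    index: FingerPose
    thumb: FingerPose

    def parameters(self) -> list[float]:
        """Flatten index + thumb poses into up to ten control parameters."""
        params: list[float] = []
        for finger in (self.index, self.thumb):
            params += [finger.x, finger.y, finger.z,
                       finger.azimuth, finger.elevation]
        return params


state = HandState("point",
                  FingerPose(0.10, 0.20, 0.50, 0.0, 0.3),
                  FingerPose(0.00, 0.15, 0.45, -0.4, 0.1))
assert len(state.parameters()) == 10
```

An application such as the 3D scene editor would then bind some subset of these ten parameters to its own controls, which is what allows a single tracked hand to replace several conventional input devices.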
Investigation of optical input technology started over 25 years ago with the work of Myron Krueger [8, 9], and recently there has been growing interest in vision-based gesture recognition interfaces. For example, Maggioni [10] describes a system that recognizes six static hand gestures and detects the position of the palm in 3D. Kjeldsen and Kender [5, 6] describe a system that uses hand tracking and recognition to replace the mouse for moving and resizing windows. However, to make this technology practical, more work will be needed on specialized image processing methods and real-time systems. This paper describes GestureVR, a novel video based hand gesture recognition interface system …

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage, and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. ACM Multimedia '98, Bristol, UK. © 1998 ACM.
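One of the applications listed in the abstract, piloting a virtual fly-through over terrain by hand pointing, suggests a simple control loop: the tracked pointing direction steers the camera while frames arrive at the tracker's 60 Hz rate. The sketch below is a hypothetical illustration of such a mapping, not the paper's actual control law; the use of hand depth to modulate flight speed is an assumption.

```python
import math

# Illustrative fly-through step (assumed mapping, not from the paper):
# the index finger's azimuth steers heading, its elevation sets climb,
# and hand depth z modulates speed (closer hand -> faster flight).

def fly_through_step(pos, heading, azimuth, elevation, z, dt=1 / 60):
    """Advance the virtual camera one frame at the tracker's 60 Hz rate."""
    heading += azimuth * dt               # yaw toward where the finger points
    speed = max(0.0, 1.0 - z)             # assumed depth-to-speed mapping
    x, y, alt = pos
    x += speed * math.cos(heading) * dt   # move along the current heading
    y += speed * math.sin(heading) * dt
    alt += speed * math.sin(elevation) * dt
    return (x, y, alt), heading


# Pointing straight ahead with the hand at depth 0 advances the camera
# along the x axis by speed * dt in one frame.
pos, heading = fly_through_step((0.0, 0.0, 0.0), 0.0,
                                azimuth=0.0, elevation=0.0, z=0.0)
```

Because the tracker delivers gesture labels alongside the pose, an application can gate this loop on the "point" gesture and stop the flight when the gesture changes.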

[1] Philip R. Cohen et al. QuickSet: multimodal interaction for distributed applications. MULTIMEDIA '97, 1997.

[2] John R. Kender et al. Toward the use of gesture in traditional user interfaces. Proceedings of the Second International Conference on Automatic Face and Gesture Recognition, 1996.

[3] Myron W. Krueger et al. Artificial Reality II. 1991.

[4] David Weimer et al. Interaction techniques using hand tracking and speech recognition. 1992.

[5] Hiroaki Nishino et al. Interactive two-handed gesture interface in 3D virtual environments. VRST '97, 1997.

[6] Song Han et al. 3DSketch: modeling by digitizing with a smart 3D pen. MULTIMEDIA '97, 1997.

[7] Rangachar Kasturi et al. Machine Vision. 1995.

[8] Pierre Wellner. The DigitalDesk calculator: tangible manipulation on a desk top display. UIST '91, 1991.

[9] Markus Kohler. System architecture and techniques for gesture recognition in unconstrained environments. Proceedings of the International Conference on Virtual Systems and MultiMedia, VSMM '97, 1997.