Herein, we present a unique vision-based computer interface entitled Virtual Gun. In Virtual Gun, a person sits at a computer and points the index finger toward the screen with the thumb pointing up, as if using the hand as a gun. Movement of the hand and index finger moves the cursor on the screen. To press the mouse button, the user gestures as if firing the gun, bringing the thumb down toward the palm; releasing the button corresponds to raising the thumb back up and away from the palm. The uniqueness of the presented tracking system lies in its use of the entire 3-D hand model. This contrasts with tracking methods that use only a set of model features, such as finger edges and tips, and with methods that rely on an internal representation of the hand, as in neural networks. In addition, the user need not wear a special glove or any other physical device, which leaves the user completely unencumbered.
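To make the gesture-to-mouse mapping concrete, the sketch below shows one way a per-frame hand-pose estimate could drive cursor events: the projected index fingertip moves the cursor, and bringing the thumb down to the palm presses the button while raising it releases the button. The type, field names, and the 20-degree trigger threshold are illustrative assumptions, not details of the Virtual Gun system, which fits a full 3-D hand model rather than the simplified pose summary used here.

```python
from dataclasses import dataclass

# Hypothetical per-frame output of a hand tracker. Field names and the
# 20-degree threshold are assumptions for illustration only.
@dataclass
class HandPose:
    index_tip_x: float      # index fingertip projected to screen x
    index_tip_y: float      # index fingertip projected to screen y
    thumb_angle_deg: float  # abduction of the thumb away from the palm

class VirtualGunMapper:
    """Turns a stream of hand poses into cursor move/press/release events."""

    THUMB_DOWN_DEG = 20.0   # assumed: below this angle the "trigger" is pulled

    def __init__(self) -> None:
        self.button_down = False

    def update(self, pose: HandPose) -> list[tuple[str, float, float]]:
        events = [("move", pose.index_tip_x, pose.index_tip_y)]
        pulled = pose.thumb_angle_deg < self.THUMB_DOWN_DEG
        if pulled and not self.button_down:
            events.append(("press", pose.index_tip_x, pose.index_tip_y))
        elif not pulled and self.button_down:
            events.append(("release", pose.index_tip_x, pose.index_tip_y))
        self.button_down = pulled
        return events

if __name__ == "__main__":
    mapper = VirtualGunMapper()
    # Thumb up (60 deg), thumb brought to the palm (5 deg), thumb raised again (55 deg).
    for pose in (HandPose(100, 120, 60), HandPose(105, 118, 5), HandPose(110, 115, 55)):
        print(mapper.update(pose))
```

The state variable `button_down` is what turns the continuous thumb angle into discrete press and release events, so each trigger pull generates exactly one click regardless of how many frames the thumb stays down.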