Among the gestures users naturally perform during communication, pointing gestures can be easily recognized and incorporated into more natural Human-Computer Interfaces. We approximate the eye-finger pointing direction of a user by detecting and tracking, in real time, the 3D positions of the centre of the face and of both hands; the positions are obtained with a stereoscopic device located on top of the display. From the head position and biometric constraints, we define both a rest area and an action area. In the latter area, the hands are searched for and the pointing intention is detected. The first hand the user spontaneously moves forward is taken as the pointing hand, whereas the second detected hand, when it first moves forward, is taken as the selection hand. Experiments on spatial precision, carried out with a group of users, show that the minimum size of an object to be easily pointed at is about 1.5 percent of the diagonal of the large display.
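The eye-finger pointing direction described above can be sketched as a ray cast from the tracked face centre through the tracked hand, intersected with the display plane. The following is a minimal illustrative sketch, not the authors' implementation: it assumes camera-frame coordinates in which the display lies in the plane z = 0, and it takes the face centre as a stand-in for the eye position.

```python
import numpy as np

def pointing_target(head, hand, plane_z=0.0):
    """Approximate the pointed-at screen location (illustrative sketch).

    head, hand : 3D positions of the face centre and pointing hand,
                 e.g. from stereoscopic tracking (assumed coordinates).
    plane_z    : z-coordinate of the display plane (assumption: z = 0).
    Returns the 3D intersection of the head->hand ray with the display
    plane, or None if the ray never reaches the display.
    """
    head = np.asarray(head, dtype=float)
    hand = np.asarray(hand, dtype=float)
    direction = hand - head  # eye-finger ray direction
    if abs(direction[2]) < 1e-9:
        return None  # ray is parallel to the display plane
    t = (plane_z - head[2]) / direction[2]
    if t <= 0:
        return None  # display plane is behind the user
    return head + t * direction
```

For example, a head at (0, 0, 2) m and a hand at (0.2, -0.1, 1.5) m yield a target of (0.8, -0.4, 0) on the display plane; the x, y components would then be mapped to screen pixels via the display's known geometry.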