Human-robot interface by pointing with uncalibrated stereo vision

We present the results of an investigation into a pointing-based interface for robot guidance. The system requires no physical contact with the operator: uncalibrated stereo vision with active contours tracks the position and pointing direction of a hand in real time. Under a ground-plane constraint, the indicated position in the robot's workspace can be found by considering only two-dimensional collineations (planar projectivities). Experimental and simulation data show that a resolution of better than 1 cm can be achieved in a 40 cm workspace, allowing simple pick-and-place operations to be specified by finger pointing.
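To make the ground-plane construction concrete, the sketch below illustrates one way the two-dimensional collineations mentioned in the abstract can be used: each uncalibrated view is related to the ground plane by a planar homography estimated from four reference points, the pointing line seen in each image is projected onto the plane through that homography, and the two projected lines intersect at the indicated position. This is a minimal illustration under the abstract's planar assumption, not the authors' implementation; the function names and any coordinate values are hypothetical.

```python
# Sketch of the planar-collineation construction (illustrative only).
import numpy as np


def homography_from_points(img_pts, plane_pts):
    """Estimate the 3x3 collineation mapping image points to ground-plane
    points from four (or more) correspondences via the standard DLT."""
    A = []
    for (x, y), (X, Y) in zip(img_pts, plane_pts):
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y, -X])
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y, -Y])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)


def to_plane(H, p):
    """Map an image point to the ground plane (homogeneous normalisation)."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]


def indicated_point(H_left, H_right, tip_L, base_L, tip_R, base_R):
    """Project the pointing line from each view onto the ground plane and
    intersect the two projected lines (in homogeneous plane coordinates)."""
    def plane_line(H, tip, base):
        a = np.append(to_plane(H, tip), 1.0)
        b = np.append(to_plane(H, base), 1.0)
        return np.cross(a, b)          # line through the two projected points

    x = np.cross(plane_line(H_left, tip_L, base_L),
                 plane_line(H_right, tip_R, base_R))
    return x[:2] / x[2]                # intersection = indicated position
```

In use, `homography_from_points` would be called once per camera with the image positions of four markers at known ground-plane coordinates; thereafter only the tracked fingertip and a second point along the finger in each view are needed per frame, so no camera calibration is required.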
