Dynamic Sensor Placement Using Controlled Active Vision

Abstract The working region of a camera that provides visual feedback for guiding manipulation tasks can be greatly extended if the camera can move in real time. Proper camera motion depends on several criteria, including object motion, maximum manipulator tracking speed, manipulator configuration, camera depth of field, field of view, spatial resolution, and occlusion avoidance. In this paper, the controlled active vision framework is extended to include dynamically determined sensor placement criteria. The criteria considered are focus, spatial resolution, and manipulator configuration, as well as object motion. Experimental results are presented.
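The abstract describes trading off several placement criteria at once. A minimal sketch of that idea, assuming a weighted-sum cost over candidate camera depths (the weights, the focal-plane depth, and the candidate set below are illustrative assumptions, not values from the paper):

```python
# Hypothetical sketch: combine two sensor-placement criteria
# (focus and spatial resolution) into one weighted cost and
# pick the best candidate camera-to-target depth.

def placement_cost(depth, focal_depth=0.6, w_focus=1.0, w_res=1.0):
    """Lower is better. depth: camera-to-target distance in metres."""
    # Focus criterion: penalize distance from the plane of best focus.
    focus_cost = (depth - focal_depth) ** 2
    # Resolution criterion: pixels-per-metre on the target falls off
    # with distance, so penalize larger depths.
    res_cost = depth
    return w_focus * focus_cost + w_res * res_cost

def best_depth(candidates):
    """Return the candidate depth with the lowest combined cost."""
    return min(candidates, key=placement_cost)

if __name__ == "__main__":
    candidates = [0.3, 0.45, 0.6, 0.8, 1.2]
    print(best_depth(candidates))  # → 0.3
```

In practice the paper's framework evaluates such criteria dynamically as the object and manipulator move; additional terms (field of view, occlusion, joint limits) would enter the sum in the same way.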
