Camera pan/tilt to eliminate the workspace-size/pixel-resolution tradeoff with camera-space manipulation

Abstract Close-tolerance, three-dimensional rigid-body assembly has been robustly achieved using camera-space manipulation within a limited region of the manipulator's workspace. This capability can generally be extended to a broader region by mounting the cameras on computer-controlled platforms, or “pan/tilt” units. Such platforms allow the cameras' fields of view to encompass a large physical region while preserving an approximately constant image-plane resolution per unit of physical space. This paper presents the derivations required to determine the view parameters when the pan/tilt rotation angles of the cameras are known. The procedure achieves adequate parameter observability with greatly reduced sampling, in both the number and the breadth of the samples. Practical considerations for implementing this capability in a high-precision, three-dimensional task across a large workspace region are also presented.
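To make the idea concrete, the sketch below (illustrative only; not the formulation derived in the paper) shows how known pan/tilt angles can be folded into view-parameter estimation. It assumes a simplified orthographic camera model with quaternion-like orientation/scale parameters (c1..c4) and image-plane offsets (c5, c6), composes that model with the known pan/tilt rotation, and fits a single shared parameter set by nonlinear least squares to image samples collected at several pan/tilt settings. The function names, the pan/tilt axis conventions, and the camera model itself are assumptions introduced for illustration.

```python
import numpy as np
from scipy.optimize import least_squares


def pan_tilt_rotation(pan, tilt):
    """Known rotation applied by the pan/tilt unit: pan about z, tilt about x (a modeling assumption)."""
    cp, sp = np.cos(pan), np.sin(pan)
    ct, st = np.cos(tilt), np.sin(tilt)
    Rz = np.array([[cp, -sp, 0.0], [sp, cp, 0.0], [0.0, 0.0, 1.0]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, ct, -st], [0.0, st, ct]])
    return Rz @ Rx


def scaled_rotation(q):
    """Scaled rotation built from quaternion-like parameters (c1..c4); |q|^2 acts as an overall image scale."""
    w, x, y, z = q
    return np.array([
        [w*w + x*x - y*y - z*z, 2*(x*y - w*z),         2*(x*z + w*y)],
        [2*(x*y + w*z),         w*w - x*x + y*y - z*z, 2*(y*z - w*x)],
        [2*(x*z - w*y),         2*(y*z + w*x),         w*w - x*x - y*y + z*z],
    ])


def project(params, points_xyz, pan, tilt):
    """Orthographic image of 3-D points for one known pan/tilt setting."""
    q, offset = params[:4], params[4:6]
    # World -> camera: unknown base orientation/scale, then the inverse of the
    # known pan/tilt rotation.  Every sample shares the same six view parameters.
    M = pan_tilt_rotation(pan, tilt).T @ scaled_rotation(q)
    cam = points_xyz @ M.T
    return cam[:, :2] + offset


def residuals(params, samples):
    """Stacked image-plane residuals over samples taken at several pan/tilt settings."""
    res = [(project(params, pts, pan, tilt) - uv).ravel()
           for pts, uv, pan, tilt in samples]
    return np.concatenate(res)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_params = np.array([0.95, 0.05, -0.15, 0.10, 5.0, -2.0])  # synthetic "true" view parameters
    samples = []
    for pan, tilt in [(0.0, 0.0), (0.4, -0.1), (-0.3, 0.2)]:       # three pan/tilt settings
        pts = rng.uniform(-1.0, 1.0, size=(8, 3))                  # known 3-D cue positions
        uv = project(true_params, pts, pan, tilt)
        uv += rng.normal(scale=1e-3, size=uv.shape)                # simulated image noise
        samples.append((pts, uv, pan, tilt))
    fit = least_squares(residuals, x0=np.array([1.0, 0, 0, 0, 0, 0]), args=(samples,))
    print("estimated view parameters:", np.round(fit.x, 3))
```

Because the pan/tilt rotation at each setting is known, every setting constrains the same small set of view parameters; this is what allows adequate observability from far fewer, and less widely spread, samples than would be needed if each orientation required its own independent calibration.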
