Automation and calibration for robot vision systems

General-purpose robot vision includes a number of different tasks that impose a great variety of imaging conditions and requirements. To support the full range of these tasks, an imaging system must provide a very wide dynamic range and high precision in both geometric and radiometric characteristics. In general, this can only be accomplished by a highly precise, automated imaging system. This paper defines a twelve-parameter model for a robot imaging system (six parameters in camera position, three in optical constraints, and three in sensitivity) that subsumes common TV cameras and "scientific" cameras as special cases. We call this model the "Imaging Space", a configuration space for robot imaging systems. Systematic consideration of this complete model leads to a more comprehensive treatment of camera calibration than has previously been presented. While the traditional calibration literature addresses only the geometric calibration of the imaging system, the new model uses analogous concepts to outline the radiometric (pixel-value) calibration of the system. The concept of "second-order calibration" is also introduced, in which the interaction of geometry and radiometry is explicitly accounted for. Representing this second-order calibration data in a useful form is not yet a solved problem. We also outline some of the issues in specifying imaging constraints in a task-oriented way. This paper is not primarily a report of research results; rather, it surveys the state of the art in imaging system technology and calibration and outlines some future directions for work in this area. The emphasis throughout is on achieving wide dynamic range, i.e., high precision in geometry and radiometry, because modern theories of robot vision show a direct link between the precision of the imaging system and the precision of computed quantities such as object shape.
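To make the twelve-parameter model concrete, the minimal sketch below shows one way a point in the Imaging Space might be represented as a data structure. The 6 + 3 + 3 grouping follows the model described above; the individual field names (pan, tilt, roll for orientation; focus, zoom, aperture for the optics; exposure time, gain, offset for sensitivity) are illustrative assumptions, not definitions taken from the paper.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ImagingConfiguration:
    """A single point in the twelve-parameter "Imaging Space".

    The 6 + 3 + 3 grouping follows the model in the abstract; the
    concrete field names below are illustrative assumptions only.
    """

    # Camera position: six degrees of freedom (pose of the camera).
    x: float
    y: float
    z: float
    pan: float    # rotation about the vertical axis, radians
    tilt: float   # rotation about the horizontal axis, radians
    roll: float   # rotation about the optical axis, radians

    # Optical parameters: three degrees of freedom (assumed set).
    focus: float     # focused distance or lens focus-motor setting
    zoom: float      # focal length / magnification setting
    aperture: float  # f-number; affects irradiance and depth of field

    # Sensitivity parameters: three degrees of freedom (assumed set).
    exposure_time: float  # sensor integration time, seconds
    gain: float           # analog/digital amplification
    offset: float         # black-level offset

    def as_vector(self) -> List[float]:
        """Flatten the configuration into a 12-element vector."""
        return [
            self.x, self.y, self.z, self.pan, self.tilt, self.roll,
            self.focus, self.zoom, self.aperture,
            self.exposure_time, self.gain, self.offset,
        ]
```

In this reading, geometric and radiometric calibration become functions defined over this configuration space, and "second-order calibration" describes how the two vary jointly with these parameters; that is an interpretation of the abstract, not the paper's own formalism.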
