Detection and tracking for robotic visual servoing systems

Abstract: Robot manipulators require knowledge of their environment in order to perform their desired actions. In many robotic tasks, vision sensors play a critical role by providing the necessary quantity and quality of information about the robot's environment. For example, "visual servoing" algorithms may control a robot manipulator so that it tracks moving objects imaged by a camera. Current visual servoing systems often lack the ability to automatically detect objects that appear within the camera's field of view. In this research, we present a robust "figure/ground" framework for visually detecting objects of interest. An important contribution of this research is a collection of optimization schemes that allow the detection framework to operate within the real-time limits of visual servoing systems. The most significant of these schemes involves the use of "spontaneous" and "continuous" domains. The number and location of continuous domains are allowed to change over time, adjusting to the dynamic conditions of the detection process. We have developed actual servoing systems to test the framework's feasibility and to demonstrate its usefulness for visually controlling a robot manipulator.
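The abstract's core idea, a figure/ground detection stage whose computation is confined to a changing set of "continuous" domains, can be illustrated with a minimal sketch. The code below is a hypothetical, simplified illustration, not the authors' actual framework: it separates figure from ground by frame differencing, then spawns one tracking domain per coarse grid cell that contains enough foreground pixels, so the number and location of domains adapt each frame. All function names, thresholds, and window sizes here are illustrative assumptions.

```python
import numpy as np

def detect_motion(prev, curr, threshold=25):
    """Figure/ground separation by simple frame differencing (illustrative).

    Returns a boolean mask marking pixels whose intensity changed by more
    than `threshold` between consecutive frames.
    """
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return diff > threshold

def update_domains(mask, min_pixels=4, window=8):
    """Spawn 'continuous' domains over a coarse grid (illustrative).

    A domain (y, x, height, width) is created for every grid cell holding
    at least `min_pixels` foreground pixels, so the set of domains grows,
    shrinks, and moves with the detected activity from frame to frame.
    """
    h, w = mask.shape
    domains = []
    for y in range(0, h, window):
        for x in range(0, w, window):
            cell = mask[y:y + window, x:x + window]
            if cell.sum() >= min_pixels:
                domains.append((y, x, window, window))
    return domains

# Usage: two synthetic 32x32 frames where a bright 8x8 square appears.
prev = np.zeros((32, 32), dtype=np.uint8)
curr = np.zeros((32, 32), dtype=np.uint8)
curr[8:16, 8:16] = 255
mask = detect_motion(prev, curr)
domains = update_domains(mask)
```

In a real-time servoing loop, restricting later processing (tracking, pose estimation) to these domains rather than the full image is what keeps the per-frame cost bounded.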
