Development of a mobile robot for visually guided handling of material

Mobile robots frequently replace humans in handling and transporting wafer carriers on semiconductor production lines. This paper presents the construction of such a mobile robot, composed primarily of a mobile base, a robot manipulator, and a vision system. Because the guidance control system of the mobile base inevitably introduces positioning errors, this study employs an eye-in-hand vision system to provide visual information for controlling the manipulator so that it can accurately grasp stationary material during pick-and-place operations between a predefined station and the mobile robot. This work further proposes a position-based look-and-move task-encoding control strategy for the eye-in-hand vision architecture that keeps all target features within the camera's field of view throughout the visual guidance, allowing the manipulator to approach the material quickly and position the end-effector precisely in the desired pose. Implementing such a task requires numerous techniques, including image enhancement, edge detection, corner and centroid detection, camera model calibration, robotic hand/eye calibration, a camera with controlled zoom and focus, and a task-encoding scheme. Finally, the theoretical results of the proposed control strategy are verified experimentally on the constructed mobile robot, with demonstrations that include grasping a target object placed at different locations on the station and grasping a target object tilted at different angles to the station.
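The position-based look-and-move strategy described above can be sketched as a simple iterative loop: observe the pose error from the vision system, then command a partial motion toward the target, repeating until the error falls below a tolerance. The sketch below is a minimal illustration under simplifying assumptions (a 3-vector pose, a proportional gain, and stubbed `observe`/`move` interfaces standing in for the paper's vision and manipulator systems, none of which are specified in the abstract):

```python
import numpy as np

def look_and_move(observe, move, target_pose, tol=1e-3, gain=0.5, max_iters=100):
    """Iterative position-based look-and-move loop (illustrative sketch).

    observe() returns the current end-effector pose as an np.ndarray;
    move(delta) commands a relative motion. Both are hypothetical stand-ins
    for the robot/vision interfaces in the paper.
    """
    for i in range(max_iters):
        pose = observe()
        error = target_pose - pose
        if np.linalg.norm(error) < tol:
            return pose, i
        # A partial step (gain < 1) helps keep target features in the
        # camera's field of view while approaching the goal.
        move(gain * error)
    return observe(), max_iters

# Usage with a trivial simulated robot that executes motions exactly:
class SimRobot:
    def __init__(self):
        self.pose = np.zeros(3)
    def observe(self):
        return self.pose.copy()
    def move(self, delta):
        self.pose += delta

robot = SimRobot()
target = np.array([0.2, -0.1, 0.3])
final_pose, iters = look_and_move(robot.observe, robot.move, target)
```

In a real system, each `observe` would involve image capture, feature extraction, and pose estimation through the calibrated camera and hand/eye transforms, and `move` would go through the manipulator controller.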
