Real-time vision-based tracking control of an unmanned vehicle

Recent advances in manufacturing automation have motivated vision-based control of autonomous vehicles operating in unattended factories, material handling processes, warehouse operations, and hazardous-environment exploration. Existing vision-based tracking controllers for autonomous vehicles, however, have seen limited real-time application because visual feedback is slow and/or expensive and because vehicle dynamics and control are complicated by nonholonomic constraints. This paper presents a practical real-time vision-based tracking control system for an unmanned vehicle, ViTra. Unlike conventional RS170 video-based machine vision systems, ViTra uses a DSP-based flexible integrated vision system (FIVS) characterized by low cost, computational efficiency, control flexibility, and a friendly user interface. In particular, this paper focuses on developing a framework for vision tracking systems, designing generic fiducial patterns, and applying real-time vision systems to the tracking control of autonomous vehicles. A laboratory prototype vision-based tracking system developed at the Georgia Institute of Technology permits the uniquely designed fiducial landmarks to be evaluated experimentally, the control strategy and the path planning algorithm derived in the paper to be validated in real time, and the issues of simplifying nonlinear dynamics and handling nonholonomic constraints to be addressed in practice. Experimental results reveal interesting insights into the design, manufacture, modeling, and control of vision-based tracking control systems for autonomous vehicles.
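
For readers unfamiliar with the nonholonomic constraint mentioned above, the following minimal sketch illustrates a generic unicycle-type kinematic model and a simple proportional point-tracking law. The model, gains, and function names are illustrative assumptions for exposition only; they are not the control strategy or path planning algorithm derived in the paper.

    import math

    def unicycle_step(x, y, theta, v, omega, dt):
        """Propagate unicycle kinematics one time step.

        Nonholonomic constraint: the vehicle cannot translate sideways, so
        x_dot = v*cos(theta), y_dot = v*sin(theta), theta_dot = omega.
        """
        return (x + v * math.cos(theta) * dt,
                y + v * math.sin(theta) * dt,
                theta + omega * dt)

    def track_point(x, y, theta, xr, yr, k_v=0.8, k_w=2.0):
        """Simple proportional law steering toward a reference point (xr, yr).

        Gains k_v, k_w are hypothetical; a vision-based controller would obtain
        pose feedback from fiducial landmarks rather than assume it is known.
        """
        ex, ey = xr - x, yr - y
        rho = math.hypot(ex, ey)                               # distance error
        alpha = math.atan2(ey, ex) - theta                     # heading error
        alpha = math.atan2(math.sin(alpha), math.cos(alpha))   # wrap to [-pi, pi]
        return k_v * rho, k_w * alpha                          # (v, omega)

    if __name__ == "__main__":
        x, y, theta = 0.0, 0.0, 0.0
        for _ in range(200):
            v, w = track_point(x, y, theta, 1.0, 1.0)
            x, y, theta = unicycle_step(x, y, theta, v, w, dt=0.05)
        print(f"final pose: x={x:.3f}, y={y:.3f}, theta={theta:.3f}")

Because the constraint couples translation and heading, even this simplified model cannot be stabilized to a pose by a smooth time-invariant state feedback, which is one reason tracking control of such vehicles requires care in practice.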