Dynamic Alignment Control Using Depth Imagery for Automated Wheel Assembly

Abstract

This paper presents a novel method for dynamic alignment control using infrared-light depth imagery to enable automated wheel loading operations on the trim and final automotive assembly line. A key requirement for automated wheel loading is to track the motion of the wheel hub and simultaneously identify the spatial positions and angular orientations of its alignment features in real time on a moving vehicle body. This requirement is met in this work, where low-cost infrared depth-imaging devices such as the Microsoft Kinect™ and Asus Xtion™, widely used in the gaming industry, are used to track a moving wheel hub and recognise alignment features on both the wheel hub and the wheel in real time in a laboratory environment. Accurate control instructions are then computed to instruct the automation system to rotate the wheel into precise alignment with the wheel hub and to load the wheel at the right moment. Experimental results demonstrate that the reproducibility error in alignment control satisfies the 2 mm assembly tolerance for the wheel loading operation, so the proposed method can be applied to automate wheel assembly on the trim and final automotive assembly line. The novelty of this work lies in its use of depth imaging for dynamic alignment control, which provides real-time spatial data along all three axes simultaneously, unlike the widely reported RGB imaging techniques, which are computationally more demanding, sensitive to ambient lighting, and require additional force sensors to obtain depth-axis control data. This paper demonstrates the concept of a light-controlled factory, in which infrared depth imaging and depth-image analysis enable intelligent control in automation.
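The abstract describes the control computation only at a high level: detect the alignment features of the hub and the wheel in a depth frame and derive the rotation needed to bring them into register. The sketch below is a minimal, hypothetical illustration of that idea under assumed conditions, not the authors' implementation; the five-stud hub geometry, the depth-band thresholds, and all function names are assumptions introduced here purely for illustration.

```python
# Illustrative sketch only: deriving a wheel-rotation command from a depth
# frame. The 5-stud geometry, depth thresholds, and function names are
# assumptions for illustration, not the method reported in the paper.
import numpy as np

STUD_COUNT = 5                        # assumed 5-stud wheel hub
STUD_PITCH = 360.0 / STUD_COUNT       # angular spacing between studs (deg)

def feature_angles(depth_mm, centre_xy, near_mm, far_mm):
    """Angular positions (deg) of pixels whose depth lies in the band that
    separates raised alignment features (studs / bolt holes) from the face."""
    ys, xs = np.nonzero((depth_mm > near_mm) & (depth_mm < far_mm))
    return np.degrees(np.arctan2(ys - centre_xy[1], xs - centre_xy[0])) % 360.0

def pattern_phase(angles_deg):
    """Circular-mean phase of the stud pattern, folded into one stud pitch,
    so the estimate does not depend on which stud each detection belongs to."""
    folded = np.exp(1j * np.radians(angles_deg) * STUD_COUNT)
    return (np.degrees(np.angle(np.mean(folded))) / STUD_COUNT) % STUD_PITCH

def rotation_command(hub_angles_deg, wheel_angles_deg):
    """Smallest signed rotation (deg) that aligns the wheel with the hub."""
    delta = pattern_phase(hub_angles_deg) - pattern_phase(wheel_angles_deg)
    return (delta + STUD_PITCH / 2.0) % STUD_PITCH - STUD_PITCH / 2.0
```

Folding the detected feature angles into a single stud pitch by their circular mean keeps the commanded rotation small and independent of which stud each detection corresponds to, which is one plausible way to exploit the rotational symmetry of the bolt pattern.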