Vision-Based Localization for Infrastructure Enabled Autonomy

Infrastructure Enabled Autonomy (IEA) is a new paradigm in autonomous vehicle research that aims at a distributed intelligence architecture by transferring the core functionalities of sensing and localization to the infrastructure. This paradigm is also promising for designing scalable systems that enable autonomous car platooning on highways. This paper gives a detailed description of the experimental realization of IEA and the techniques devised to localize a vehicle in such a setup. A reliable camera calibration technique for this experimental setup is discussed, followed by a technique to transform 2D image coordinates into 3D world coordinates. In this research, localization information is obtained from (1) on-board vehicle sensors such as GPS/IMU, (2) vehicle position data derived from deep learning and 2D-to-3D coordinate transformations applied to the real-time camera feeds, and (3) lane detection data from the infrastructure cameras. These data are fused using an Extended Kalman Filter (EKF) to obtain reliable estimates of the vehicle's position at 50 Hz. This position information is then used to control the vehicle with the objective of following a prescribed path. Extensive simulation and experimental results are presented to corroborate the performance of the proposed approach.
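The abstract does not spell out how the 2D-to-3D transformation is carried out. For a fixed, calibrated infrastructure camera, a common approach is to back-project a pixel ray onto a known road plane. The sketch below is illustrative only, not the paper's exact method: it assumes a pinhole model with intrinsics K and extrinsics (R, t) obtained from the calibration step, and a flat road at z = 0 in world coordinates.

```python
import numpy as np

def pixel_to_ground(u, v, K, R, t):
    """Back-project pixel (u, v) onto the z = 0 ground plane.

    K    : 3x3 intrinsic matrix from camera calibration
    R, t : extrinsics mapping world to camera coordinates (x_cam = R @ x_world + t)
    Returns a 3-vector: the world point where the pixel's viewing ray meets the road plane.
    """
    # Viewing ray through the pixel, expressed in camera coordinates
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Camera center and ray direction expressed in world coordinates
    cam_center = -R.T @ t
    ray_world = R.T @ ray_cam
    # Intersect cam_center + s * ray_world with the plane z = 0
    s = -cam_center[2] / ray_world[2]
    return cam_center + s * ray_world
```

Likewise, the abstract states only that the three position sources are fused with an EKF at 50 Hz. A minimal skeleton is sketched below under assumed models: a constant-velocity motion model and direct (linear) position measurements from each source, in which case the EKF update reduces to the standard Kalman form; with a nonlinear measurement model (e.g., the camera projection itself), the Jacobian of the measurement function would take the place of H. All state definitions and noise values here are placeholders, not the paper's tuned parameters.

```python
import numpy as np

class PositionEKF:
    """Illustrative filter skeleton fusing asynchronous position fixes
    from GPS/IMU, camera-based detections, and lane detection data."""

    def __init__(self, dt=0.02):                  # 0.02 s step -> 50 Hz estimates
        self.x = np.zeros(4)                      # state [px, py, vx, vy]
        self.P = np.eye(4)                        # state covariance
        self.F = np.eye(4)                        # constant-velocity transition
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = 1e-2 * np.eye(4)                 # process noise (tuning parameter)
        self.H = np.eye(2, 4)                     # each source reports [px, py]

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, R_meas):
        """Fuse one position measurement z with measurement covariance R_meas."""
        y = z - self.H @ self.x                               # innovation
        S = self.H @ self.P @ self.H.T + R_meas               # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)              # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

In this sketch, each measurement source would call update() with its own covariance R_meas whenever a new fix arrives, while predict() runs at the fixed 50 Hz rate mentioned in the abstract.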