AUV SLAM using forward/downward looking cameras and artificial landmarks

Autonomous underwater vehicles (AUVs) are usually equipped with one or more optical cameras to obtain visual data of underwater environments. The cameras can also be used to estimate the AUV's pose, along with navigation sensors such as an inertial measurement unit (IMU), a Doppler velocity log (DVL), and a depth sensor. In this paper, we propose a vision-based simultaneous localization and mapping (SLAM) method for AUVs, in which underwater artificial landmarks aid the visual sensing of forward and downward looking cameras. Three types of landmarks are introduced, and their detection algorithms are organized within a conventional extended Kalman filter (EKF) SLAM framework to estimate both robot and landmark states. The proposed method is validated by an experiment performed in an engineering basin. Since a DVL suffers from noise in real ocean environments, we generated synthetic noisy data based on the real sensor data. With this data, we verify that the proposed SLAM approach can recover from erroneous dead-reckoning positions.
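To illustrate the EKF SLAM framework the abstract refers to, the sketch below shows a minimal predict/update cycle with a joint robot-plus-landmark state. This is not the paper's implementation: it assumes a planar (2D) state, a DVL/IMU-style dead-reckoning input of surge speed and yaw rate, and a simplified linear observation model in which the camera yields the landmark position relative to the robot in the world frame (a real forward or downward looking camera would give body-frame bearing/range measurements requiring a nonlinear observation Jacobian).

```python
import numpy as np

def ekf_predict(x, P, u, dt, Q):
    """Dead-reckoning prediction step.
    x = [px, py, yaw, lx, ly] : robot pose + one landmark (hypothetical 2D state)
    u = [v, w] : surge speed and yaw rate, e.g. from DVL and IMU."""
    px, py, yaw = x[0], x[1], x[2]
    x_pred = x.copy()
    x_pred[0] = px + u[0] * dt * np.cos(yaw)
    x_pred[1] = py + u[0] * dt * np.sin(yaw)
    x_pred[2] = yaw + u[1] * dt          # landmarks are static, left unchanged
    # Jacobian of the motion model w.r.t. the full state
    F = np.eye(len(x))
    F[0, 2] = -u[0] * dt * np.sin(yaw)
    F[1, 2] = u[0] * dt * np.cos(yaw)
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Update with a landmark observation.
    z = world-frame landmark position minus robot position (simplifying assumption)."""
    H = np.zeros((2, len(x)))            # linear observation Jacobian
    H[0, 0] = -1.0; H[0, 3] = 1.0
    H[1, 1] = -1.0; H[1, 4] = 1.0
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P
```

Because the landmark coordinates sit in the same state vector and covariance as the robot pose, each camera observation corrects not only the landmark estimate but also the accumulated dead-reckoning error of the robot, which is the mechanism behind the recovery behavior described above.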