FEEL: Fast, Energy-Efficient Localization for Autonomous Indoor Vehicles

Autonomous vehicles are attracting strong interest in both indoor and outdoor applications. A prominent indoor use case is process automation in warehouses using Autonomous Indoor Vehicles (AIVs). These vehicles must localize themselves with centimeter-level accuracy and millisecond-level latency while remaining energy-efficient, which makes localization a challenging problem. In this paper, we propose FEEL, an indoor localization system that fuses three low-energy sensors: IMU, UWB, and radar. We describe the software and hardware architecture of FEEL in detail. We further propose the Adaptive Sensing Algorithm (ASA), which jointly optimizes localization accuracy and energy consumption by adapting the sensing rate to the dynamics of the physical environment in real time. Our extensive performance evaluation over diverse test settings shows that FEEL achieves sub-7 cm localization accuracy with an ultra-low latency of around 3 ms. In addition, ASA yields up to 20% energy savings with only a marginal trade-off in accuracy. Finally, we show that FEEL outperforms the state of the art in indoor localization.
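The abstract describes ASA only at a high level; the sketch below illustrates the general idea of adapting the sensing rate to the dynamics of the environment to save energy. All names, thresholds, and rates are assumptions introduced for illustration and do not come from the paper.

```python
# Hypothetical sketch of an adaptive-sensing loop in the spirit of ASA.
# Thresholds, rates, and the dynamics proxy are invented for illustration only.

from dataclasses import dataclass


@dataclass
class SensingConfig:
    rate_hz: float  # how often UWB/radar ranging is triggered


LOW_DYNAMICS = SensingConfig(rate_hz=5.0)    # vehicle nearly static: range rarely
HIGH_DYNAMICS = SensingConfig(rate_hz=50.0)  # vehicle maneuvering: range often


def choose_rate(imu_accel_variance: float, threshold: float = 0.2) -> SensingConfig:
    """Pick a sensing rate from a simple proxy for environment dynamics.

    When the vehicle moves slowly (low IMU acceleration variance), fewer
    UWB/radar samples are needed to keep the position error bounded, so the
    ranging rate is lowered to save energy; fast motion raises it again.
    """
    return HIGH_DYNAMICS if imu_accel_variance > threshold else LOW_DYNAMICS


# Example: a stream of IMU acceleration variances drives the ranging schedule.
for variance in [0.05, 0.08, 0.45, 0.30, 0.07]:
    cfg = choose_rate(variance)
    print(f"variance={variance:.2f} -> sample UWB/radar at {cfg.rate_hz:g} Hz")
```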
