Deep Learning-Based Fusion of Visible Light Positioning and IMU Sensors

Indoor positioning systems (IPS) enable precise control of robots and vehicles and improve location awareness for pedestrians in indoor areas where the Global Positioning System (GPS) fails. Many indoor positioning techniques leverage wireless signals such as WiFi and Bluetooth but suffer from low accuracy. Fusion techniques, such as combining wireless signals with an inertial measurement unit (IMU), have therefore been proposed to improve accuracy. In this paper, we propose a deep learning-based fusion system using an IMU and Visible Light Positioning (VLP). Unlike WiFi and Bluetooth systems, VLP can provide fast and highly accurate location service when several line-of-sight (LOS) light sources are available, owing to the propagation characteristics of light. Although VLP can fail under non-line-of-sight (NLOS) conditions, our proposed fusion system, built from several structured neural networks, selectively weights VLP signals to achieve accurate positioning even under poor LOS conditions. In our simulations, the system achieves an average positioning accuracy of 20 cm.
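The paper's exact architecture is not specified here, but a minimal sketch can illustrate the selective-fusion idea: an IMU branch encodes a window of inertial samples, a VLP branch encodes per-anchor light measurements, and a learned gate down-weights VLP channels that appear NLOS-corrupted before a regressor predicts position. The PyTorch sketch below is hypothetical; the class name, layer sizes, and the assumption that VLP arrives as received-signal-strength (RSS) values from a fixed set of LED anchors are ours, not the paper's.

```python
# Hypothetical sketch of selective VLP-IMU fusion (not the paper's actual model).
# Assumptions: IMU arrives as a window of 6-axis samples (accel + gyro); VLP as
# per-anchor RSS readings from a fixed number of LED anchors. The gate learns to
# suppress unreliable (e.g., NLOS) VLP channels before fusion.
import torch
import torch.nn as nn

class SelectiveVLPIMUFusion(nn.Module):
    def __init__(self, imu_dim=6, vlp_anchors=4, hidden=64):
        super().__init__()
        # IMU branch: an LSTM summarizes a short window of inertial samples.
        self.imu_encoder = nn.LSTM(imu_dim, hidden, batch_first=True)
        # VLP branch: an MLP embeds the per-anchor RSS vector.
        self.vlp_encoder = nn.Sequential(
            nn.Linear(vlp_anchors, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden),
        )
        # Gate: conditioned on both modalities, outputs a soft [0, 1] mask over
        # the VLP features so NLOS-corrupted measurements can be down-weighted.
        self.gate = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.Sigmoid(),
        )
        # Regressor: maps the fused feature to a 2-D position estimate.
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, imu_window, vlp_rss):
        # imu_window: (batch, T, 6); vlp_rss: (batch, vlp_anchors)
        _, (h, _) = self.imu_encoder(imu_window)
        imu_feat = h[-1]                      # (batch, hidden)
        vlp_feat = self.vlp_encoder(vlp_rss)  # (batch, hidden)
        mask = self.gate(torch.cat([imu_feat, vlp_feat], dim=-1))
        fused = torch.cat([imu_feat, mask * vlp_feat], dim=-1)
        return self.head(fused)               # (batch, 2) position estimate

# Example: a batch of 100-sample IMU windows and RSS from 4 LED anchors.
model = SelectiveVLPIMUFusion()
pos = model(torch.randn(8, 100, 6), torch.randn(8, 4))
print(pos.shape)  # torch.Size([8, 2])
```

Trained end to end against ground-truth positions, a gate of this kind lets the network fall back on inertial features when the VLP channels are unreliable, which is one plausible reading of the "carefully select VLP signals" behavior described above.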
