Perception-aware Path Planning for UAVs using Semantic Segmentation

In this work, we present a perception-aware path-planning pipeline for Unmanned Aerial Vehicles (UAVs) navigating challenging environments. The objective is to reach a given destination safely and accurately while relying on monocular camera-based state estimators, such as Keyframe-based Visual-Inertial Odometry (VIO) systems. Motivated by recent advances in semantic segmentation using deep learning, our path-planning architecture takes into account the semantic classes of those parts of the scene that are perceptually more informative than others. Using this semantic information to compute the next best action with respect to perception quality, the proposed planning strategy avoids both texture-less regions and problematic areas, such as lakes and oceans, that may cause large drift or failures in the robot’s pose estimation. We design a hierarchical planner, composed of an A∗ path-search step followed by B-Spline trajectory optimization. While the A∗ search steers the UAV towards informative areas, the optimizer keeps the most promising landmarks in the camera’s field of view. We extensively evaluate our approach in a set of photo-realistic simulations, showing a remarkable improvement over the state of the art in active perception.
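
As a rough illustration of the perception-aware search step, the sketch below runs A∗ over a 2D grid in which each cell carries a semantic label and perceptually poor classes (e.g., water or texture-less surfaces) add a penalty to the traversal cost. This is a minimal sketch under assumed conventions, not the authors' implementation: the class names, penalty weights, grid representation, and the `w_perc` trade-off parameter are all hypothetical.

```python
# Minimal sketch (assumed): A* over a 2D grid with a semantic perception penalty.
import heapq

# Hypothetical per-class penalties; higher = less informative for a VIO front-end.
PERCEPTION_PENALTY = {"building": 0.0, "road": 0.2, "vegetation": 0.5,
                      "water": 5.0, "sky": 5.0}

def astar_semantic(grid, semantics, start, goal, w_perc=1.0):
    """grid: set of free (x, y) cells; semantics: dict mapping cell -> class label."""
    def h(c):  # Euclidean heuristic to the goal
        return ((c[0] - goal[0]) ** 2 + (c[1] - goal[1]) ** 2) ** 0.5

    open_set = [(h(start), start)]          # priority queue of (f-cost, cell)
    g_cost = {start: 0.0}
    came_from = {start: None}
    closed = set()

    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur in closed:
            continue
        closed.add(cur)
        if cur == goal:                     # reconstruct and return the path
            path = [cur]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt not in grid or nxt in closed:
                continue
            # Unit motion cost plus a semantic penalty that steers the search
            # towards texture-rich, perceptually informative regions.
            step = 1.0 + w_perc * PERCEPTION_PENALTY.get(semantics.get(nxt), 1.0)
            new_g = g_cost[cur] + step
            if new_g < g_cost.get(nxt, float("inf")):
                g_cost[nxt] = new_g
                came_from[nxt] = cur
                heapq.heappush(open_set, (new_g + h(nxt), nxt))
    return None  # no path found
```

In this toy formulation, increasing `w_perc` trades path length for perception quality; the subsequent B-Spline optimization stage described in the abstract would then refine the resulting waypoints while keeping informative landmarks in the camera's field of view.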
