Going through a window and landing a quadrotor using optical flow

This paper considers the problem of controlling a quadrotor to fly through a window and land on a planar target using an image-based controller, with only a camera and an Inertial Measurement Unit (IMU) as sensors. The maneuver is divided into two stages: crossing the window and landing on the target plane. For the first stage, a control law is proposed that guarantees the vehicle does not collide with the wall containing the window and passes through the window with non-zero velocity along the direction orthogonal to the window, maintaining at all times a safety distance from the window edges. For the landing stage, the proposed control law ensures a smooth touchdown, keeping the height above the target plane positive at all times. For control purposes, the centroids of the images of a collection of landmarks (corners) on both the window and the target are used as position measurements, and the translational optical flow relative to the wall, window edges, and target plane is used as a velocity cue. To achieve the proposed objective, no direct measurements of position or velocity are used, and no explicit estimate of the height above the target plane or of the distance to the wall is required. Simulation results are provided to illustrate the performance of the proposed controller.
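The abstract does not reproduce the control laws themselves, but the two visual cues it relies on, the centroid of tracked landmark corners and the translational optical flow, are straightforward to compute from image points. The Python sketch below is a minimal illustration under assumed names and gains (image_centroid, translational_flow, centroid_flow_control, kp, and kd are all hypothetical): a proportional term on the centroid error, damped by the optical flow used as a velocity cue. It omits the rotation compensation from IMU gyro rates and the safety guarantees (no wall collision, positive height above the target) that the paper's actual control laws provide.

    import numpy as np

    def image_centroid(pixels):
        """Mean of the projected landmark corners (the position cue)."""
        return np.mean(np.asarray(pixels, dtype=float), axis=0)

    def translational_flow(prev_pixels, curr_pixels, dt):
        """Average pixel displacement per second over the tracked corners.

        Crude stand-in for the translational optical flow; the paper also
        removes the rotational flow component using IMU gyro rates."""
        prev = np.asarray(prev_pixels, dtype=float)
        curr = np.asarray(curr_pixels, dtype=float)
        return np.mean(curr - prev, axis=0) / dt

    def centroid_flow_control(centroid, centroid_ref, flow, kp=0.8, kd=0.4):
        """Proportional term on the centroid error, damped by the flow cue.

        Gains are placeholders; the actual laws additionally enforce the
        safety constraints described in the abstract."""
        error = centroid_ref - centroid
        return kp * error - kd * flow

    # Example: four window corners tracked over two frames of a 30 Hz camera.
    prev = [(310, 205), (410, 208), (408, 305), (312, 302)]
    curr = [(308, 203), (408, 206), (406, 303), (310, 300)]
    c = image_centroid(curr)
    f = translational_flow(prev, curr, dt=1.0 / 30.0)
    u = centroid_flow_control(c, centroid_ref=np.array([320.0, 240.0]), flow=f)

Here u is a two-dimensional image-plane command; mapping it to thrust and attitude references, and proving the stated safety properties, is the substance of the paper and is not captured by this sketch.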
