Towards evasive maneuvers with quadrotors using dynamic vision sensors

We present a method to predict collisions with objects thrown at a quadrotor using a pair of dynamic vision sensors (DVS). Due to the microsecond temporal resolution of these sensors and the sparsity of their output, the object's trajectory can be estimated with minimal latency. Unlike standard cameras, which send frames at a fixed rate, a DVS transmits only pixel-level brightness changes (“events”) at the time they occur. Our method tracks spherical objects on the image plane using probabilistic trackers that are updated with each incoming event. The object's trajectory is estimated with an Extended Kalman Filter whose mixed state space incorporates both the object's dynamics and the measurement noise in the image plane. Using error propagation, we predict a collision if the 3σ-ellipsoid along the predicted trajectory intersects a safety sphere around the quadrotor. We demonstrate experimentally that our method allows evasive maneuvers to be initiated early enough to avoid collisions.
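As a rough illustration of the collision-prediction step, the sketch below propagates a tracked object's state under a ballistic (gravity-only) model, carries the position covariance forward, and flags a collision when the 3σ position ellipsoid reaches a safety sphere around the quadrotor. The constant-acceleration model, the sample parameters (safety radius, horizon, process noise), and the conservative bounding-sphere approximation of the ellipsoid test are illustrative assumptions, not the exact formulation of the paper.

```python
import numpy as np

# Minimal sketch (not the authors' code): forward-simulate the object's state
# estimated by the EKF, propagate its covariance, and flag a collision if the
# 3-sigma position ellipsoid intersects a safety sphere around the quadrotor.

G = np.array([0.0, 0.0, -9.81])  # gravity in the world frame [m/s^2]

def predict_collision(p0, v0, P0, quad_pos, safety_radius=0.5,
                      horizon=1.0, dt=0.01, sigma_acc=0.5):
    """p0, v0: estimated object position/velocity (3,) from the tracker.
    P0: 6x6 covariance of [position, velocity].
    Returns (collides, time_to_collision)."""
    x = np.hstack([p0, v0])
    P = P0.copy()
    t = 0.0
    while t < horizon:
        # Constant-acceleration (gravity-only) motion model.
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)
        x = F @ x + np.hstack([0.5 * G * dt**2, G * dt])
        # Propagate covariance; process noise on the velocity block only
        # (simplified stand-in for acceleration uncertainty).
        Q = np.zeros((6, 6))
        Q[3:, 3:] = (sigma_acc * dt)**2 * np.eye(3)
        P = F @ P @ F.T + Q
        # Bound the 3-sigma position ellipsoid by a sphere of its largest
        # semi-axis (conservative), then test sphere-sphere intersection.
        pos_cov = P[:3, :3]
        max_semi_axis = 3.0 * np.sqrt(np.max(np.linalg.eigvalsh(pos_cov)))
        if np.linalg.norm(x[:3] - quad_pos) <= safety_radius + max_semi_axis:
            return True, t
        t += dt
    return False, None
```

A tighter check would intersect the actual ellipsoid with the sphere; the bounding-sphere approximation is deliberately conservative, so the evasive maneuver is triggered no later than the exact test would trigger it.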
