Event-driven ball detection and gaze fixation in clutter

The fast temporal dynamics and intrinsic motion segmentation of event-based cameras are beneficial for robotic tasks that require low-latency visual tracking and control, such as a robot catching a ball. When the event-driven iCub humanoid robot grasps an object, its head and torso move, inducing camera motion, and tracked objects can no longer be trivially segmented from the mass of background clutter. Current event-based tracking algorithms have mostly considered stationary cameras, which produce clean event streams with minimal clutter. This paper introduces novel methods that extend Hough-based circle detection with optical flow information readily extracted from the spatio-temporal event space. Results indicate that the proposed directed-Hough algorithm is more robust to other moving objects and to background event clutter. Finally, we demonstrate successful online robot control and gaze following on the iCub robot.
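
The directed-Hough idea can be illustrated with a small sketch. The snippet below is a hypothetical illustration, not the paper's implementation: it assumes the ball radius is known and that a per-event optical-flow angle has already been estimated (e.g. from the local spatio-temporal surface of events), and it restricts each event's circle-centre vote to a short arc along that flow direction instead of the full circle. All names and parameters (directed_hough_vote, arc_half_width, the event tuple layout) are illustrative assumptions.

```python
import numpy as np

# A minimal sketch of a flow-directed circular Hough vote, assuming the ball
# radius is known and each event already has an estimated flow angle.
# Event layout (x, y, timestamp, polarity) is an illustrative assumption.

def directed_hough_vote(events, flow_angles, radius, img_shape,
                        arc_half_width=np.pi / 6):
    """Accumulate circle-centre votes only along the local flow direction."""
    H, W = img_shape
    acc = np.zeros((H, W), dtype=np.float32)
    for (x, y, _, _), theta in zip(events, flow_angles):
        # Restrict the classic full-circle vote to short arcs centred on the
        # flow direction and its opposite (the sign of the flow is ambiguous).
        for base in (theta, theta + np.pi):
            for phi in np.linspace(base - arc_half_width,
                                   base + arc_half_width, 9):
                cx = int(round(x + radius * np.cos(phi)))
                cy = int(round(y + radius * np.sin(phi)))
                if 0 <= cx < W and 0 <= cy < H:
                    acc[cy, cx] += 1.0
    # Return the most-voted candidate centre as (row, col).
    return np.unravel_index(np.argmax(acc), acc.shape)
```

The intuition behind the robustness claim is visible here: background events whose flow directions do not converge on a consistent centre spread their (already reduced) votes across the accumulator, while events generated by the ball's contour reinforce a single peak.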
