Efficient neuromorphic optomotor heading regulation

“Neuromorphic” vision sensors are a recent development in sensing technology. They can be thought of as camera sensors whose output is a sequence of “retinal events” rather than frames: events are generated independently by each pixel as it detects a change in the light field. These sensors have low latency (<10 μs), high dynamic range (>120 dB), and very low power consumption, which makes them well suited to control applications where power is limited yet high performance is necessary. Existing computer vision algorithms that operate on frames cannot be directly adapted to process retinal events, so a new class of algorithms needs to be investigated. This paper considers the problem of designing a regulator for the heading of a vehicle based on the feedback from an on-board neuromorphic sensor. It is shown that a nonlinear function of the events' retinal positions, followed by retinal integration, followed by a linear filter is a simple design that is sufficient to guarantee stability. This shows that computationally simple controllers suffice for motion-control tasks even with feedback from noisy and ambiguous event data, and without computing explicit representations of the state.
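To make the proposed architecture concrete, the pipeline described above (per-event nonlinearity of retinal position, integration over the retina, then a linear filter producing the heading command) can be sketched as follows. This is an illustrative sketch only, not the paper's exact design: the nonlinearity `phi`, the filter coefficient `alpha`, the loop gain, and the event format `(x, polarity)` with normalized position `x ∈ [-1, 1]` are all assumptions made for illustration.

```python
import math


def phi(x):
    """Assumed odd nonlinearity of normalized retinal position x in [-1, 1]."""
    return math.tanh(3.0 * x)


class EventHeadingRegulator:
    """Sketch of a heading regulator driven by retinal events.

    Pipeline: nonlinearity per event -> retinal integration (sum)
    -> first-order linear (low-pass) filter -> steering command.
    """

    def __init__(self, alpha=0.1, gain=1.0):
        self.alpha = alpha   # low-pass filter coefficient (assumed value)
        self.gain = gain     # loop gain (assumed value)
        self.state = 0.0     # linear filter state

    def update(self, events):
        # Retinal integration: sum the nonlinearity over this batch of
        # events; each event is (x, polarity) with polarity in {-1, +1}.
        s = sum(pol * phi(x) for x, pol in events)
        # Linear filtering of the integrated signal.
        self.state += self.alpha * (s - self.state)
        # Negative feedback: command opposes the filtered retinal signal.
        return -self.gain * self.state


# Usage: events concentrated on the right half of the retina (x > 0)
# produce a command that turns the vehicle to reduce the imbalance.
reg = EventHeadingRegulator()
u = reg.update([(0.5, +1), (0.4, +1), (-0.1, -1)])
```

Note that no explicit state (e.g. heading angle or optic-flow field) is reconstructed anywhere: the command is a memoryless function of the events plus one scalar filter state, which is what makes the design computationally cheap.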
