Event-based Sensing for Space Situational Awareness

A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining popularity in the field of artificial vision systems. These devices are inspired by the biological retina and operate in a fundamentally different manner from traditional CCD-based imaging sensors. Whereas a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each generated when a pixel detects a change in log light intensity. The pixels operate asynchronously and independently of one another, producing an event-based output with high temporal resolution. Because there is no fixed exposure time, each pixel independently achieves a very high dynamic range. These devices also offer high-speed, low-power operation and a sparse spatio-temporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way from that of traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for Space Situational Awareness (SSA) applications are demonstrated through telescope field trials. The trial results confirm that these devices are capable of observing resident space objects (RSOs) from low Earth orbit (LEO) through to geosynchronous (GEO) orbital regimes. Significantly, observations of RSOs were made during both day-time and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper presents the field testing and validation of two different architectures of event-based imaging sensor. The asynchronous output of an event-based sensor has an intrinsically low data rate. In addition to their low-bandwidth communication requirements, their low weight, low power consumption, and high speed make them ideally suited to the demanding requirements of space-based SSA systems. The results from these experiments and the systems developed highlight the applicability of event-based sensors to both ground-based and space-based SSA tasks.
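To make the event-stream data model concrete, the sketch below is a minimal, frame-driven simulation of the pixel behaviour described above: each pixel keeps its own log-intensity reference and emits a timestamped, signed event whenever the intensity departs from that reference by more than a contrast threshold. The threshold value, the frame-based input, and all function names here are illustrative assumptions, not the sensors evaluated in this paper; a real event-based sensor performs this comparison continuously in analog hardware rather than on discrete frames.

```python
import numpy as np

# Illustrative contrast threshold on log intensity (assumed value).
THRESHOLD = 0.15

def frames_to_events(frames, timestamps, threshold=THRESHOLD):
    """Emit (t, x, y, polarity) events whenever a pixel's log intensity
    changes by more than `threshold` since that pixel's last event."""
    log_ref = np.log(frames[0] + 1e-6)  # per-pixel reference level
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_now = np.log(frame + 1e-6)
        diff = log_now - log_ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for x, y in zip(xs, ys):
            polarity = 1 if diff[y, x] > 0 else -1  # ON / OFF event
            events.append((t, x, y, polarity))
            # Each pixel resets its own reference independently,
            # mimicking the asynchronous, per-pixel operation.
            log_ref[y, x] = log_now[y, x]
    return events

# Example: a bright point source (e.g. a satellite streak) moving across
# an otherwise static field produces events only along its path.
frames = [np.full((8, 8), 10.0) for _ in range(5)]
for i, f in enumerate(frames):
    f[3, min(i + 1, 7)] += 40.0  # moving bright spot
print(frames_to_events(frames, timestamps=list(range(5)))[:5])
```

Even this crude model reproduces the properties highlighted above: a static background generates no events at all, so output bandwidth scales with scene activity rather than with resolution and frame rate, which is the basis of the low data-rate claim for SSA applications.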
