An embedded AER dynamic vision sensor for low-latency pole balancing

Balancing a small object such as an ordinary pencil on its tip requires rapid feedback control with latencies on the order of milliseconds. Here we describe how a pair of spike-based silicon-retina dynamic vision sensors (DVSs) is used to provide fast visual feedback for controlling an actuated table that balances an ordinary pencil on its tip. Two DVSs view the pencil from viewpoints at right angles to each other. Movements of the pencil cause spike address-events (AEs) to be emitted from the DVSs. These AEs are processed by a 32-bit fixed-point ARM7 microcontroller (64 MHz, 200 mW) on the back side of each embedded DVS board (eDVS). Each eDVS updates its estimate of the pencil's location and angle in 2D for every received spike (typically at a rate of 100 kHz) by applying a continuous tracking method based on spike-driven fitting to a model of the pencil's vertical, rod-like shape. Every 2 ms, each eDVS sends the pencil's tracked position to a third ARM7-based controller, which computes the pencil's location in 3D space and runs a linear PD controller that adjusts the X-Y position and velocity of the table to keep the pencil balanced upright. The actuated table is built from ordinary high-speed hobby servos. Our system can balance any small, thin object, such as a pencil, pen, chopstick, or rod, for minutes under a wide range of lighting conditions.
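
The two core computations, per-event tracking of the pencil's 2D line model on each eDVS and linear PD control of the table, can be illustrated with a short sketch. The C code below is an assumption-laden illustration rather than the authors' firmware: the mixing factor, controller gains, and the 128-pixel sensor height are placeholder values, and it uses floating point for readability even though the eDVS boards use 32-bit fixed-point arithmetic.

```c
/* Sketch of (1) an event-driven update of a 2D line model of the pencil
 * for each incoming DVS address-event, and (2) a linear PD controller
 * mapping the tracked pencil state to a table position command.
 * All constants are illustrative assumptions, not values from the paper. */
#include <stdio.h>

/* 2D line model of the pencil as seen by one eDVS:
 * base_x - horizontal position of the pencil tip (pixels)
 * slope  - horizontal deviation per pixel of height (dimensionless) */
typedef struct {
    float base_x;
    float slope;
} LineModel;

/* Event-driven tracker update: each spike at pixel (x, y) pulls the
 * current line model slightly toward that event. 'alpha' controls how
 * strongly a single event moves the estimate (assumed small, so that
 * ~100k events/s integrate into a smooth track). */
static void track_event(LineModel *m, float x, float y, float alpha)
{
    float predicted_x = m->base_x + m->slope * y;  /* model's x at this height */
    float error       = x - predicted_x;           /* horizontal residual */

    /* Distribute the correction: events near the base (small y) mostly
     * shift base_x, events near the top mostly tilt the line. */
    m->base_x += alpha * error;
    m->slope  += alpha * error * y / 128.0f;       /* 128 = assumed sensor height */
}

/* Linear PD controller for one axis: proportional and derivative terms
 * on both the pencil base position and its tilt. Gains are placeholders. */
static float pd_control(float base_x, float base_dx,
                        float slope,  float slope_d)
{
    const float kp_pos = 0.8f, kd_pos = 0.2f;   /* assumed position gains */
    const float kp_ang = 2.5f, kd_ang = 0.5f;   /* assumed tilt gains */
    return kp_pos * base_x + kd_pos * base_dx
         + kp_ang * slope  + kd_ang * slope_d;
}

int main(void)
{
    LineModel m = { 64.0f, 0.0f };   /* start centered and upright */

    /* A few synthetic events along a slightly tilted pencil. */
    float events[][2] = { {66, 10}, {67, 40}, {69, 80}, {71, 120} };
    for (int i = 0; i < 4; ++i)
        track_event(&m, events[i][0], events[i][1], 0.05f);

    /* Every control cycle (2 ms in the paper) the tracked state would be
     * sent to the central controller; here we just print one command. */
    float cmd = pd_control(m.base_x - 64.0f, 0.0f, m.slope, 0.0f);
    printf("base_x=%.2f slope=%.4f table_cmd=%.3f\n", m.base_x, m.slope, cmd);
    return 0;
}
```

In the real system this tracker would run once per address-event on each eDVS, while the PD step would run on the central controller at the 2 ms update rate, combining the two orthogonal 2D estimates into a 3D pencil state.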