Object Localization and Size Measurement Using Networked Address Event Representation Imagers

This letter presents a novel method for object localization and size measurement using networked address event representation (AER) cameras. The algorithm uses the circle of Apollonius to compute its results without evaluating trigonometric functions. Experimental results show that the localization and size measurement errors are under 5%. Because the algorithm has low communication and computation complexity, it is well suited to low-power applications such as identifying and tracking moving objects in the field.
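The letter's derivation is not reproduced here, but the trig-free property rests on a classical fact: for two fixed points A and B and a distance ratio k ≠ 1, the locus of points P with |PA| / |PB| = k is a circle whose center and radius follow from ordinary algebra. The sketch below is hypothetical (the function name and interface are not from the letter) and only illustrates that geometry:

```python
import math

def apollonius_circle(ax, ay, bx, by, k):
    """Circle of Apollonius for anchor points A=(ax, ay), B=(bx, by)
    and distance ratio k = |PA| / |PB|, with k != 1.

    Expanding |PA|^2 = k^2 * |PB|^2 and completing the square gives
    center C = (A - k^2 * B) / (1 - k^2) and radius
    r = k * |AB| / |1 - k^2| -- no trigonometric functions required.
    Returns (cx, cy, r). (Illustrative only; not the letter's code.)
    """
    k2 = k * k
    denom = 1.0 - k2
    cx = (ax - k2 * bx) / denom
    cy = (ay - k2 * by) / denom
    r = k * math.hypot(ax - bx, ay - by) / abs(denom)
    return cx, cy, r
```

In a networked-camera setting, each pair of sensors that observes the same object can contribute one such circle from a measured ratio; intersecting two circles then localizes the object, again without trigonometry.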
