High precision target tracking with a compound-eye image sensor

Rapidly moving objects are challenging to track with a camera system, particularly when the range varies from near field to far field against a complex background. In the present work, the problem is restricted to a well-defined small object that is readily distinguished from the background scene. The emphasis is on maintaining accurate trajectory estimation as the target moves from the near field to the far field (or vice versa) and traverses a large field of view. To address this and related scenarios, a prototype compound-eye image sensor, named "DragonflEYE", has been designed and fabricated. In essence, the compound-eye sensor uses a large number (10¹ to 10³) of identical "eyelets" to cover a large angular field of view. The degree of overlap of the eyelet fields of view is an important system parameter that strongly influences the tracking problem. Selected highlights of the design and implementation of DragonflEYE are described. For the precision-tracking function, the following factors need to be assessed: coverage overlap, update rate, frame timing, subpixel calibration of the multiple eyelet lenses, and processing algorithms and approaches. Investigations with conventional cameras indicate that the instantaneous location precision can be 0.01 to 0.1 pixel for the centroid of a suitable "spot object" in a favourable scenario. Calibrating the multiple eyelets of the compound eye to the same level of precision is expected to be important for precision tracking over a wide field of view. Preliminary experimental results for calibration for tracking applications are presented and evaluated.
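The subpixel "spot object" location mentioned above is conventionally obtained by intensity-weighted centroiding. The following is a minimal sketch of that standard technique, not the authors' implementation; the function name, threshold parameter, and background-suppression step are illustrative assumptions.

```python
import numpy as np

def spot_centroid(image, threshold=0.0):
    """Return the (row, col) intensity-weighted centroid of a spot.

    Pixels at or below `threshold` are suppressed, which is a simple
    (assumed) form of background rejection; for a well-isolated spot the
    weighted mean recovers the spot centre to a fraction of a pixel.
    """
    img = np.asarray(image, dtype=float)
    img = np.where(img > threshold, img - threshold, 0.0)  # remove background level
    total = img.sum()
    if total == 0.0:
        raise ValueError("no signal above threshold")
    rows, cols = np.indices(img.shape)                     # per-pixel coordinates
    return (rows * img).sum() / total, (cols * img).sum() / total
```

For a symmetric spot well contained in the frame (e.g. a Gaussian centred between pixel centres), the estimate lands far below one pixel of error, consistent with the 0.01 to 0.1 pixel figures quoted for favourable scenarios.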