Enhanced Omnidirectional Image Reconstruction Algorithm and Its Real-Time Hardware Implementation

Omnidirectional stereoscopy and depth estimation are complex image-processing problems to which the Panoptic camera offers a novel solution. The Panoptic camera is a biologically inspired vision sensor built from multiple cameras. It is a polydioptric system that mimics the eyes of flying insects: multiple imagers, each with a distinct focal point, are distributed over a hemisphere. An omnidirectional image reconstruction algorithm (OIR) and its real-time hardware implementation were recently proposed for the Panoptic camera. This paper presents an enhanced omnidirectional image reconstruction algorithm (EOIR) and its real-time implementation. The EOIR algorithm produces more realistic omnidirectional images and lower residuals than the OIR. The EOIR processing core consumes 57% of the available slice resources of a Virtex-5 FPGA. The proposed platform provides the high bandwidth required to process data originating from 40 cameras simultaneously and to reconstruct omnidirectional images of 256×1024 pixels at 25 fps. The proposed hardware and algorithmic enhancements enable advanced real-time applications including omnidirectional image reconstruction, 3D model construction, and depth estimation.
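
To make the abstract's throughput figures concrete, the short Python sketch below (not from the paper) maps each pixel of a 256×1024 omnidirectional frame to a viewing direction on an assumed equi-angular hemispherical grid and estimates the raw data rates involved. The grid parametrization, the 3-byte output pixels, the function names, and the 352×288 per-camera resolution are illustrative assumptions, not values taken from the paper.

    import numpy as np

    # Output omnidirectional frame size and rate quoted in the abstract.
    OMNI_ROWS, OMNI_COLS, FPS = 256, 1024, 25
    NUM_CAMERAS = 40

    def omni_pixel_directions(rows=OMNI_ROWS, cols=OMNI_COLS):
        """Unit viewing direction for every pixel of the omnidirectional image.

        Assumes an equi-angular grid over the observed hemisphere: rows sweep
        the polar angle theta in [0, pi/2), columns sweep the azimuth phi in
        [0, 2*pi). The grid actually used by the Panoptic camera may differ;
        this only illustrates the pixel-to-direction idea behind OIR/EOIR.
        """
        theta = (np.arange(rows) + 0.5) / rows * (np.pi / 2)   # polar angle
        phi = (np.arange(cols) + 0.5) / cols * (2 * np.pi)     # azimuth
        t, p = np.meshgrid(theta, phi, indexing="ij")
        return np.stack([np.sin(t) * np.cos(p),
                         np.sin(t) * np.sin(p),
                         np.cos(t)], axis=-1)                  # (rows, cols, 3)

    def output_rate_bytes_per_s(bytes_per_pixel=3):
        """Rough output-side data rate of the reconstructed stream."""
        return OMNI_ROWS * OMNI_COLS * bytes_per_pixel * FPS

    def input_rate_bytes_per_s(cam_w, cam_h, bytes_per_pixel=1):
        """Rough aggregate input rate from all imagers; the per-camera
        resolution is a parameter, since the abstract does not state it."""
        return NUM_CAMERAS * cam_w * cam_h * bytes_per_pixel * FPS

    if __name__ == "__main__":
        dirs = omni_pixel_directions()
        print("direction grid:", dirs.shape)                   # (256, 1024, 3)
        print("output rate : %.1f MB/s" % (output_rate_bytes_per_s() / 1e6))
        print("input rate  : %.1f MB/s (hypothetical 352x288 imagers)"
              % (input_rate_bytes_per_s(352, 288) / 1e6))

With these assumptions the reconstructed 256×1024 stream alone amounts to roughly 20 MB/s, and the aggregate input from 40 cameras is several times larger, which is why a dedicated FPGA pipeline rather than a general-purpose processor is used to sustain 25 fps.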
