Omnidirectional video stabilisation on a virtual camera using sensor fusion

This paper presents a method for robustly stabilising omnidirectional video in the presence of significant rotations and translations by creating a virtual camera and combining sensor fusion with scene tracking. Real-time rotational movements of the camera are measured by an Inertial Measurement Unit (IMU), which provides an initial estimate of the ego-motion of the camera platform. Image registration is then used to refine this estimate. The refined ego-motion is used to adjust an extract of the omnidirectional video, forming a virtual camera that remains focused on the scene. Experiments show that the technique is effective under challenging ego-motions and overcomes the deficiencies associated with unimodal approaches, making it robust and suitable for many surveillance applications.
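The core idea of forming a stabilised virtual camera can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the omnidirectional video is stored as an equirectangular panorama and that the fused IMU/vision pipeline has already produced an ego-rotation estimate; the function names, Euler-angle convention, and nearest-neighbour sampling are illustrative choices.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    # ZYX Euler angles to a 3x3 rotation matrix (convention is illustrative)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def virtual_camera_view(equirect, R, fov_deg=90.0, out_w=320, out_h=240):
    """Sample a perspective 'virtual camera' view from an equirectangular
    panorama, rotating the viewing rays by R so the extract can counter
    the measured ego-motion of the platform."""
    H, W = equirect.shape[:2]
    # Focal length from the horizontal field of view
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2)
    # Pixel grid of the virtual camera, principal point at the centre
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2,
                         np.arange(out_h) - out_h / 2)
    rays = np.stack([xs, ys, np.full_like(xs, f)], axis=-1).astype(float)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)
    # Rotate each unit ray by the estimated rotation
    rays = rays @ R.T
    # Ray direction -> longitude/latitude on the panorama sphere
    lon = np.arctan2(rays[..., 0], rays[..., 2])   # [-pi, pi]
    lat = np.arcsin(np.clip(rays[..., 1], -1, 1))  # [-pi/2, pi/2]
    # Spherical coordinates -> equirectangular pixel indices (nearest neighbour)
    u = ((lon / (2 * np.pi) + 0.5) * (W - 1)).astype(int)
    v = ((lat / np.pi + 0.5) * (H - 1)).astype(int)
    return equirect[v % H, u % W]
```

To stabilise, the rotation passed in would be the inverse of the fused IMU-plus-registration ego-rotation for each frame, so the virtual camera stays locked on the scene while the physical platform moves.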
