Image-based navigation in real environments using panoramas

We present a system for virtual navigation in real environments using image-based panorama rendering. Multiple overlapping images are captured at each location with a Point Grey Ladybug camera and combined into a single cube-aligned panorama image per capture location. Panorama locations are connected in a graph topology and registered with a 2D map for navigation. A real-time image-based viewer renders individual 360-degree panoramas using graphics hardware acceleration. Navigation through the environment is performed by traversing the graph and loading the panorama image at each new location. The system provides a user-friendly interface and supports either standard input and display devices or a head-mounted display with an inertial tracking device.
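
To make the navigation structure concrete, the following is a minimal sketch (not the authors' code) of the panorama graph described above: each node holds the six cube-face images for one capture location plus its registered 2D map position, and moving through the environment is a graph traversal that loads the next location's panorama. The class names, file names, and the load_texture stand-in are hypothetical, assuming a cube-map texture per location.

```python
from dataclasses import dataclass, field

@dataclass
class PanoramaNode:
    cube_faces: list[str]          # paths to the +X,-X,+Y,-Y,+Z,-Z face images
    map_xy: tuple[float, float]    # position registered on the 2D map
    neighbours: set[int] = field(default_factory=set)

class PanoramaGraph:
    def __init__(self):
        self.nodes: dict[int, PanoramaNode] = {}

    def add(self, node_id: int, node: PanoramaNode) -> None:
        self.nodes[node_id] = node

    def link(self, a: int, b: int) -> None:
        """Connect two capture locations with an undirected edge."""
        self.nodes[a].neighbours.add(b)
        self.nodes[b].neighbours.add(a)

    def move_to(self, current: int, target: int) -> bool:
        """Traverse one edge and load the target location's cube faces."""
        if target not in self.nodes[current].neighbours:
            return False
        for face in self.nodes[target].cube_faces:
            load_texture(face)     # stand-in for uploading a face to the GPU
        return True

def load_texture(path: str) -> None:
    print(f"loading cube face {path}")   # placeholder for a real texture upload

# Usage: two capture locations along a hallway, connected in the graph.
faces = "px nx py ny pz nz".split()
graph = PanoramaGraph()
graph.add(0, PanoramaNode([f"loc0_{f}.png" for f in faces], (0.0, 0.0)))
graph.add(1, PanoramaNode([f"loc1_{f}.png" for f in faces], (2.5, 0.0)))
graph.link(0, 1)
graph.move_to(0, 1)   # step forward: the viewer swaps in location 1's panorama
```

In this sketch the viewer only ever holds the current node's six faces in memory; a real implementation would likely prefetch neighbouring panoramas to hide load latency during traversal.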
