OMNIDIRECTIONAL IMAGING

Prof. George Wolberg
Dept. of Computer Science
City College of New York / CUNY
wolberg@cs.ccny.cuny.edu



Introduction

There are three chief benefits of the omnidirectional imaging model proposed here:

Commercial Benefits


Statement of the Problem


Related Work


Cylindrical Mapping

The first source of omnidirectional images is the use of multiple images. An example is shown in the figure above. The top row of the figure shows a 360-degree panoramic (mosaic) image produced by stitching together several pictures acquired from a camera pivoting on a tripod. The images have slight overlap (approximately 10 percent) to enable the mosaic software to automatically stitch them together into a seamless band of images.

We may interpret that image as a texture to be mapped onto a cylinder. Rendering any view of the scene around us is simply a matter of projecting the cylindrical image onto a viewplane representing the camera film plane. The leftmost image in the bottom row of the figure depicts one such view. The rightmost image in that row depicts the virtual camera (yellow icon) that is centered in the cylinder and oriented towards the associated cylindrical region. Our software permits the user to rotate his head in any direction, move the camera toward/through the cylinder wall, and change the field of view. This yields a totally immersive sensation as long as the viewer looks exclusively at the cylinder and not at the holes above and below the cylinder wall.
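The projection from the cylindrical texture to the virtual film plane can be summarized by a short sketch. The Python/NumPy fragment below is illustrative only, not our implementation: the panorama array, the assumed vertical scale of the panorama (cylinder radius of W/2*pi pixels), and the camera parameters (field of view, yaw, nearest-neighbor sampling) are assumptions.

import numpy as np

def render_from_cylinder(pano, out_w, out_h, fov_deg, yaw_deg=0.0):
    """Project a 360-degree cylindrical panorama onto a virtual film plane."""
    H, W = pano.shape[:2]
    f = (out_w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)   # virtual focal length in pixels

    # Pixel grid on the film plane, centered on the optical axis.
    x = np.arange(out_w) - out_w / 2.0
    y = np.arange(out_h) - out_h / 2.0
    xx, yy = np.meshgrid(x, y)

    # Ray through each film-plane pixel: azimuth and height on a unit-radius cylinder.
    theta = np.arctan2(xx, f) + np.radians(yaw_deg)          # angle around the cylinder axis
    h = yy / np.sqrt(xx**2 + f**2)                           # height per unit radius

    # Map (theta, h) to panorama coordinates (nearest-neighbor lookup).
    s = W / (2.0 * np.pi)                                    # assumed cylinder radius in panorama pixels
    u = (np.mod(theta, 2.0 * np.pi) / (2.0 * np.pi) * W).astype(int) % W
    v = np.clip((H / 2.0 + h * s).astype(int), 0, H - 1)
    return pano[v, u]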


Parabolic Mapping

The second source of omnidirectional images is catadioptric sensors, which incorporate reflecting surfaces (mirrors) into a conventional imaging system that uses lenses. The field of view of a catadioptric system can be varied by changing the shape of the mirror it uses. However, the entire imaging system must have a single effective viewpoint to permit the generation of pure perspective images from a sensed image. A popular approach is to use a parabolic mirror, whereby the single viewpoint lies at the focus of the parabola. The top row of the figure above depicts an image taken with such a system. We may interpret that image as a texture to be mapped onto a parabolic mirror using an orthographic projection. Rendering any view of the scene around us is simply a matter of projecting the parabolic image onto a viewplane representing the camera film plane. The leftmost image in the bottom row of the figure depicts one such view. The rightmost image in that row depicts the virtual camera (yellow icon) that is centered in the parabolic mirror and oriented towards the associated parabolic region.

All points on the catadioptric image are formed as the result of an orthographic projection of the scene reflecting off the parabolic mirror. All the image data there is to know about the scene is contained in the warped catadioptric image.

In order to compute a proper perspective view from the catadioptric image, the viewer, who is considered to lie at the focus of the parabola, is allowed to look out onto the scene along some viewing direction. A virtual film plane is centered along that direction and the values of all pixels on the plane are computed. This involves extending a ray from the focus towards the 3D point that constitutes the position of the film plane pixel. In order to determine the color of that point, the ray is extended from the focus until it intersects the parabolic mirror at position z = [r*r - (x*x + y*y)] / (2r), where r is the radius of the parabola, i.e., the half-width of the catadioptric image. The scene value associated with that line of sight has been reflected straight up into the camera at location (x,y) in the catadioptric image. Pixel (x,y) from that image is read and assigned to the corresponding pixel on the virtual film plane to form the true perspective (unwarped) output image, as shown in the lower left image above. As in the case of the cylindrical images, our software permits the user to rotate his head in any direction, move the camera toward/through the parabola, and change the field of view.
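The ray construction described above can be sketched directly in code. The fragment below is a minimal illustration, not the actual software: the rotation matrix, field of view, and nearest-neighbor sampling are assumptions; only the mirror equation z = [r*r - (x*x + y*y)] / (2r) and the straight-up reflection follow the text.

import numpy as np

def unwarp_parabolic(cata, out_w, out_h, fov_deg, R=np.eye(3)):
    """Render a perspective view from a parabolic catadioptric image.

    cata : square catadioptric image; its half-width equals the mirror radius r in pixels.
    R    : 3x3 rotation giving the viewing direction of the virtual camera (assumed).
    """
    H, W = cata.shape[:2]
    r = W / 2.0                                   # mirror radius = image half-width
    f = (out_w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)

    # Ray from the focus (origin) through each film-plane pixel.
    x = np.arange(out_w) - out_w / 2.0
    y = np.arange(out_h) - out_h / 2.0
    xx, yy = np.meshgrid(x, y)
    rays = np.stack([xx, yy, np.full_like(xx, f)], axis=-1)
    rays = rays @ R.T                             # rotate rays into the mirror frame
    dx, dy, dz = rays[..., 0], rays[..., 1], rays[..., 2]

    # Intersect t*(dx,dy,dz) with the mirror z = (r^2 - (x^2 + y^2)) / (2r):
    # t^2*(dx^2 + dy^2) + 2*r*t*dz - r^2 = 0, take the positive root.
    a = dx**2 + dy**2
    b = 2.0 * r * dz
    t = np.where(a > 1e-12,
                 (-b + np.sqrt(b**2 + 4.0 * a * r**2)) / (2.0 * a + 1e-12),
                 r**2 / (b + 1e-12))              # degenerate case: looking along the axis

    # The reflected ray travels straight up, so the mirror point's (x, y)
    # is also its pixel location in the orthographic catadioptric image.
    u = np.clip((t * dx + W / 2.0).astype(int), 0, W - 1)
    v = np.clip((t * dy + H / 2.0).astype(int), 0, H - 1)
    return cata[v, u]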


Hybrid Mapping

Cylindrical and parabolic mappings each yield a totally immersive sensation as long as the viewer looks exclusively at the cylinder or parabolic dish. Once the geometric boundaries of the cylinder and parabola are reached, there is no data to display to the viewer and one's sense of immersion is compromised. We demonstrate a hybrid approach in which we cap the cylinder with a parabola to eliminate any boundaries in the geometry, thereby recreating a fully immersive sensation.

We simulate this hybrid mapping by modeling an aircraft carrier and planes. The top and side views of the simulated carrier are shown below.

We simulate the views taken from the use of multiple cameras as well as catadioptric sensors. The figure below shows six nonoverlapping images of the aircraft carrier taken for the purpose of cylindrical mapping.

A cylindrical mapping of these six images is shown below.

In order to cap both ends of the cylinder, catadioptric sensors are placed above and below the multi-camera assembly to obtain the following two images: one looking downward below the hybrid sensor and one looking upward above it (pointing to the sky).

The catadioptric output can be unwarped by performing the parabolic mapping. The parabolic mirrors used in the catadioptric sensors are approximated with piecewise planar facets. These facets are shown overlaid on the parabolic mirror in the right side of the figure below. The left side of the figure shows the view taken from our synthetic camera looking onto the parabolic mirror. Notice that the planar projection has eliminated the warping effect of the curved mirror.
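As an illustration of the facet approximation, the sketch below tessellates the paraboloid into a triangle mesh over a polar grid. The ring/sector resolution and the vertex/face layout are assumptions made for this sketch; only the mirror equation comes from the text.

import numpy as np

def parabolic_facets(r, n_rings=16, n_sectors=32):
    """Tessellate the mirror z = (r^2 - (x^2 + y^2)) / (2r) into planar triangles.

    Returns (vertices, faces): vertices is N x 3, faces is M x 3 vertex indices.
    Texture coordinates follow from the orthographic projection: (u, v) = (x, y)
    scaled to the catadioptric image.
    """
    rho = np.linspace(0.0, r, n_rings + 1)                   # radial samples
    theta = np.linspace(0.0, 2 * np.pi, n_sectors, endpoint=False)

    verts = []
    for p in rho:
        for t in theta:
            x, y = p * np.cos(t), p * np.sin(t)
            z = (r**2 - (x**2 + y**2)) / (2.0 * r)           # point on the mirror
            verts.append((x, y, z))
    verts = np.array(verts)

    faces = []
    for i in range(n_rings):
        for j in range(n_sectors):
            a = i * n_sectors + j
            b = i * n_sectors + (j + 1) % n_sectors
            c = a + n_sectors
            d = b + n_sectors
            faces.extend([(a, b, c), (b, d, c)])             # two triangles per quad
    return verts, np.array(faces)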

In this manner, the viewer can orient their line of sight in any direction and the corresponding view is displayed in real-time. Since the cylindrical data is stitched seamlessly with the catadioptric (parabolic) data, there is a smooth continuity of display in all orientations.
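A per-ray lookup conveys how the cylindrical band and the parabolic caps could be combined. The sketch below is hypothetical: the cylinder's vertical coverage angle, the panorama's vertical scale, and the image-row conventions are assumptions, not the actual stitching used in our system; the cap lookup reuses the mirror intersection from the earlier sketch.

import numpy as np

def hybrid_lookup(ray, pano, cap_up, cap_down, cyl_half_angle_deg=45.0):
    """Return the pixel seen along a unit ray from the common viewpoint."""
    dx, dy, dz = ray
    elevation = np.degrees(np.arcsin(np.clip(dz, -1.0, 1.0)))

    if abs(elevation) <= cyl_half_angle_deg:
        # Cylinder wall: column from azimuth, row from height on the cylinder.
        H, W = pano.shape[:2]
        theta = np.arctan2(dy, dx) % (2 * np.pi)
        u = int(theta / (2 * np.pi) * W) % W
        h = dz / max(np.hypot(dx, dy), 1e-12)                # tan(elevation)
        v = int(np.clip(H / 2.0 - h * (W / (2 * np.pi)), 0, H - 1))  # rows increase downward
        return pano[v, u]

    # Caps: intersect the ray with the parabolic mirror above or below,
    # exactly as in the single-mirror case (lower cap folded into the upper frame).
    cap = cap_up if elevation > 0 else cap_down
    H, W = cap.shape[:2]
    r = W / 2.0
    d = np.array([dx, dy, abs(dz)])
    a = d[0]**2 + d[1]**2
    b = 2.0 * r * d[2]
    t = (-b + np.sqrt(b**2 + 4.0 * a * r**2)) / (2.0 * a) if a > 1e-12 else r**2 / b
    u = int(np.clip(t * d[0] + W / 2.0, 0, W - 1))
    v = int(np.clip(t * d[1] + H / 2.0, 0, H - 1))
    return cap[v, u]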


Future Work

