CCNY Spring 2003  3D Computer Vision and Video Computing  Course Project

Omnidirectional Stereo (OmniStereo) Vision Simulation 
-  Comparison of Fixed Baseline, Circular and Dynamic Omnistereo Systems -

Students: Chengwu Chai and Feng Jin
Instructor: Professor Zhigang Zhu

(Scroll down to the end of the page for a description of how to use it)




In the following Java simulation, we compare the depth errors of three omnistereo configurations when tracking a moving target over 360 degrees: the fixed-viewpoint omnistereo (case 1), the circular-projection omnistereo (asymmetric and symmetric, cases 2 and 3), and the dynamic omnistereo (case 4). Note the differences in the depth errors both across viewing directions and with distance.

In case 1 (the fixed-viewpoint omnistereo with a fixed baseline length B), the depth error becomes very large when the target aligns with the two viewpoints (indicated by the out-of-range dark red error bars). This is improved when the second camera moves around the first camera, as in case 2 (the asymmetric circular omnistereo), where the radius of the viewing circle is B: the distance error becomes isotropic, even though it is still proportional to the square of the distance. Furthermore, if the two cameras can both move along the same viewing circle while viewing in opposite directions (case 3, the symmetric circular omnistereo), the depth errors are halved. In all three cases, the depth error is proportional to the square of the distance.

Finally, in case 4 (the dynamic omnistereo), the error map is drawn under the assumption that the diameter of the cylindrical calibration body of each robot (camera) is 2B and that the second camera moves so as to obtain the best triangulation configuration. Note that in this simulation the cylindrical body is used to calibrate the moving camera. The comparison is fair, since 2B is also the baseline length in the symmetric circular omnistereo simulation. In the dynamic omnistereo case, the viewpoints of the second camera lie on an object-dependent curve, with the rays of the moving camera tangent to that viewing curve. Both the baseline length and the viewpoint locations adapt to the distance of the target.
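The contrast between the anisotropic error of case 1 and the isotropic error of case 2 can be sketched with a simple first-order triangulation model (an illustrative approximation, not the simulator's actual code): with angular localization error d_theta, the depth uncertainty of a fixed baseline B is roughly d_theta * D^2 / (B * sin(phi)), where phi is the target's bearing measured from the baseline direction, so it diverges as the target aligns with the two viewpoints; a camera moving on a circle of radius B can keep the baseline perpendicular to the line of sight, making the error direction-independent.

```python
import math

# Hedged first-order sketch (constants are illustrative, not the simulator's).

def err_fixed_baseline(D, phi, B, d_theta):
    """Case 1: fixed baseline B; the error depends on the bearing phi and
    diverges as the target aligns with the two viewpoints (phi -> 0)."""
    return d_theta * D * D / (B * math.sin(phi))

def err_circular(D, B, d_theta):
    """Case 2: the second camera moves on a circle of radius B, keeping the
    baseline perpendicular to the line of sight, so the error is isotropic."""
    return d_theta * D * D / B

B, D, d_theta = 1.0, 10.0, math.radians(0.2)
for deg in (90, 30, 5):
    print("bearing %2d deg: fixed-baseline error %.2f"
          % (deg, err_fixed_baseline(D, math.radians(deg), B, d_theta)))
print("circular (any bearing): %.2f" % err_circular(D, B, d_theta))
```

In both models the error still grows with the square of the distance; the circular configuration only removes the directional dependence.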
Because the distance error in case 4 grows only with the 1.5th power of the target's distance instead of its square, the errors in case 4 are smaller than in case 3 when the target is far from the first camera. In this comparison, the turn-point is at a distance of 8 times the radius of the calibration cylinder in case 4 (which also equals the radius of the viewing circle in case 3).
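The turn-point can be illustrated numerically with a small hedged sketch: the case 3 error is modeled as d_theta * D^2 / (2B) for the effective baseline 2B, and the case 4 error as k * D^1.5, with k chosen (as an assumption, not a value from the simulator) so that the two curves cross at D = 8B, the turn-point stated above.

```python
import math

# Hedged numerical sketch; the proportionality constants are illustrative.

def err_symmetric_circular(D, B, d_theta):
    """Case 3: effective baseline 2B, error proportional to D^2."""
    return d_theta * D * D / (2.0 * B)

def err_dynamic(D, B, d_theta):
    """Case 4: error proportional to D^1.5; the constant is fixed so the
    curve matches case 3 at the turn-point D = 8B."""
    turn = 8.0 * B
    k = err_symmetric_circular(turn, B, d_theta) / turn ** 1.5
    return k * D ** 1.5

B, d_theta = 1.0, math.radians(0.2)
for D in (4.0 * B, 8.0 * B, 16.0 * B):
    c, d = err_symmetric_circular(D, B, d_theta), err_dynamic(D, B, d_theta)
    print("D = %4.1f  circular: %6.3f  dynamic: %6.3f" % (D, c, d))
```

Beyond D = 8B the D^1.5 curve stays below the D^2 curve, which is why the dynamic configuration wins for distant targets.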

How to use it:
(1) Input the measurement error (in degrees). It represents the localization error of the target in an omnidirectional image, for both calibration and triangulation. Press Enter after you fill in the number.

(2) Select the number of graphs. You can choose to show one to four of the cases.

(3) In the top-left sub-window (case 1), hold the left mouse button and drag the mouse; a trajectory will be drawn in this window, and the same trajectory will be drawn in all the other sub-windows.

(4) Click the CLEAR button to clear all the drawings.

