Computer Science - The City College of New York
CSC I6716 - Fall 2010  3D Computer Vision

Assignment 4. 
Stereo and Motion (Deadline: Nov 29, before class)
(Those marked with * are optional for extra credits)


Note: Turn in a printed document containing the written part of the assignment, a listing of your .m files, preferably images showing the results of your experiments, and definitely an analysis of the results. All the written material must be a hard copy in print. You also need to turn in a "soft" copy of your assignment as email attachments. Send your source code ONLY - please do not send your images or executables (if you use C++). You are responsible for the loss of your submission if you do not write "CSC I6716 Computer Vision Assignment 4" in the subject of your email. Write your names and IDs (last four digits) in both your hard-copy and soft-copy submissions.


1. (Stereo - 15 points) Estimate the accuracy of the simple stereo system (Figure 7.4 in Trucco & Verri's book), assuming that the only source of noise is the localization of corresponding points in the two images. Discuss how the error in depth estimation depends on the baseline width and the focal length.

Hint: Take the partial derivatives of Z with respect to x, T, and f, respectively.
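As a sketch of where the hint leads (the symbols below follow the usual notation for the simple stereo model and should be checked against Figure 7.4): with baseline T, focal length f, and disparity d = x_l - x_r, the depth and a first-order error propagation read

% Simple stereo model: depth from disparity (T = baseline, f = focal length,
% d = x_l - x_r = disparity). The localization noise enters through \Delta d.
Z = \frac{fT}{d} = \frac{fT}{x_l - x_r},
\qquad
|\Delta Z| \;\approx\; \left|\frac{\partial Z}{\partial d}\right| \, |\Delta d|

Carrying out this derivative (and the analogous ones with respect to T and f) gives the dependence on baseline and focal length that you are asked to discuss.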

2. (Motion - 20 points) Can you obtain 3D information about a scene by viewing it with a camera rotating around its optical center? Show why or why not. What about moving the camera along its optical axis?

3. (Motion- 15 points) Show that the aperture problem can be solved if a corner is visible through the aperture.

4. (Stereo Programming - 50 basic points + 20 extra points) Use the image pair (Image 1, Image 2) for the following exercises.

(1). Fundamental Matrix. - Design and implement a program that, given a stereo pair, determines at least eight point matches, then recovers the fundamental matrix (15 points) and the location of the epipoles (5 points). Check the accuracy of the result by measuring the distance between the estimated epipolar lines and image points not used by the matrix estimation (5 points). Also, overlay the epipolar lines of the control points and the test points on one of the images (say Image 1 - I already did this in the starting code below). Control points are the correspondences (matches) used in computing the fundamental matrix, and test points are those used to check the accuracy of the computation.
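For reference, here is a minimal Matlab sketch of the (normalized) eight-point algorithm and the epipole computation. It is only one possibility, not the required implementation; the function and variable names (eight_point, p1, p2, etc.) are mine, and p1, p2 are assumed to be N-by-2 arrays of pixel coordinates of your N >= 8 control-point matches.

function [F, e1, e2] = eight_point(p1, p2)
% EIGHT_POINT  Fundamental matrix and epipoles from N >= 8 point matches.
% p1, p2 : N-by-2 corresponding pixel coordinates in image 1 and image 2.
% F      : 3x3 rank-2 fundamental matrix satisfying [p2 1] * F * [p1 1]' = 0.
% e1, e2 : epipoles in image 1 and image 2 (homogeneous, last entry = 1).
  N = size(p1, 1);
  % Hartley normalization: zero mean, average distance sqrt(2) from origin.
  [T1, x1] = normalize_pts(p1);
  [T2, x2] = normalize_pts(p2);
  % Each match contributes one row of the homogeneous system A*f = 0.
  A = [x2(:,1).*x1(:,1), x2(:,1).*x1(:,2), x2(:,1), ...
       x2(:,2).*x1(:,1), x2(:,2).*x1(:,2), x2(:,2), ...
       x1(:,1),          x1(:,2),          ones(N,1)];
  [~, ~, V] = svd(A);
  F = reshape(V(:, end), 3, 3)';       % least-squares solution for F
  [U, D, V] = svd(F);                  % enforce rank 2
  D(3,3) = 0;
  F = U * D * V';
  F = T2' * F * T1;                    % undo the normalization
  [U, ~, V] = svd(F);                  % epipoles = null vectors of F
  e1 = V(:,3) / V(3,3);                % F  * e1 = 0 : epipole in image 1
  e2 = U(:,3) / U(3,3);                % F' * e2 = 0 : epipole in image 2
end

function [T, xn] = normalize_pts(p)
% Similarity transform: zero-mean points with mean norm sqrt(2).
  c = mean(p, 1);
  d = sqrt(sum(bsxfun(@minus, p, c).^2, 2));
  s = sqrt(2) / mean(d);
  T = [s 0 -s*c(1); 0 s -s*c(2); 0 0 1];
  xh = (T * [p, ones(size(p,1),1)]')';
  xn = xh(:, 1:2);
end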

Hint: As a first step, you can pick the matches of both the control points and the test points manually. You may use my Matlab code (FmatGUI.m) as a starting point - it provides an interface for picking point matches by mouse clicks. The epipolar lines should be (almost) parallel in this stereo pair; if not, something is wrong either with your code or with the point matches. Make sure this is achieved before you move to the second step* - trying to search for point matches automatically with your program. However, the second step is optional (for 5 extra points).
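One way to do the accuracy check and the overlay (again only a sketch; q1, q2, I1, and F are hypothetical variable names for the M-by-2 test matches, the first image, and the estimated fundamental matrix):

% Epipolar lines in image 1 induced by the test points of image 2, and the
% point-to-line distance used as the accuracy measure (test matches q1, q2
% were NOT used when estimating F).
M   = size(q1, 1);
q1h = [q1, ones(M,1)]';                   % 3-by-M homogeneous points, image 1
q2h = [q2, ones(M,1)]';                   % 3-by-M homogeneous points, image 2
L1  = F' * q2h;                           % column [a; b; c] is the line a*x + b*y + c = 0
d   = abs(sum(L1 .* q1h, 1)) ./ sqrt(L1(1,:).^2 + L1(2,:).^2);
fprintf('mean epipolar distance on test points: %.3f pixels\n', mean(d));

% Overlay one epipolar line and its test point on image 1 (assumes I1 is loaded).
figure; imshow(I1); hold on;
a = L1(1,1); b = L1(2,1); c = L1(3,1);
x = [1, size(I1, 2)];
plot(x, -(a*x + c)/b, 'g-');              % assumes the line is not vertical (b ~= 0)
plot(q1(1,1), q1(1,2), 'r+', 'MarkerSize', 10);

If the lines drawn this way come out roughly parallel (as the hint says they should for this pair) and the mean distance is small, the estimate is plausible.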

(2). Feature-based matching. - Design a stereo vision system to do "feature-based matching" and explain your algorithm in writing (10 points). The system should have a user interface that allows a user to select a point on the first image, say by a mouse click (5 points). The system should then find and highlight the corresponding point on the second image, say using a cross hair (5 points). Try to use the epipolar geometry derived from (1) to search for correspondences along the epipolar lines (5 points).

Hint: You may use an interface similar to the one I provided for question (1). You may also reuse the point-match searching algorithm from (1) (if you implemented it), but this time you need to constrain your search windows to lie along the epipolar lines.
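A sketch of one possible search along the epipolar line, using normalized cross-correlation over a fixed window (the window size w, the variable names, and the use of grayscale images I1, I2 are all assumptions; (x1, y1) is the clicked pixel in image 1, rounded to integers, and F is the matrix from part (1)):

% Correlation-based search for the match of (x1, y1) along its epipolar line.
w = 7;                                        % half-window size (assumption)
l = F * [x1; y1; 1];                          % epipolar line a*x + b*y + c = 0 in image 2
patch1 = double(I1(y1-w:y1+w, x1-w:x1+w));
patch1 = patch1 - mean(patch1(:));
best_score = -Inf; best_xy = [NaN, NaN];
for x2 = 1+w : size(I2,2)-w                   % walk along the line, one column at a time
    y2 = round(-(l(1)*x2 + l(3)) / l(2));     % y on the line (assumes it is not vertical)
    if y2 <= w || y2 > size(I2,1)-w, continue; end
    patch2 = double(I2(y2-w:y2+w, x2-w:x2+w));
    patch2 = patch2 - mean(patch2(:));
    score  = sum(patch1(:).*patch2(:)) / (norm(patch1(:))*norm(patch2(:)) + eps);
    if score > best_score
        best_score = score; best_xy = [x2, y2];
    end
end
% best_xy is the estimated match; draw the cross hair on image 2, e.g.:
% figure; imshow(I2); hold on; plot(best_xy(1), best_xy(2), 'r+', 'MarkerSize', 12);

In practice you would restrict the loop to a disparity range around x1 rather than the whole row, and handle clicks near the image border where the window falls outside the image.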

(3) *Discussions. Show your results on points with different properties, such as points on corners, edges, smooth regions, textured regions, and occluded regions that are visible in only one of the images (5 points). Discuss, for each case, why your vision system succeeds or fails in finding the correct matches (5 points). Compare the performance of your system against a human user (e.g. yourself) who marks the corresponding matches on the second image by a mouse click (5 points).