Project 2: Assembling Panoramic Images
Jaclyn Beck
Introduction
Panoramic imaging is useful for capturing scenes that are larger than
the normal field of view of the camera. A panoramic scene can be
assembled by taking multiple pictures of the scene and stitching them
together. The more the camera moves, however, the more the outer images
must be distorted to stitch the scene together in a planar fashion. The
solution to this is to warp each image in the scene to a cylinder,
stitch the warped images together, and 'unwrap' the cylinder. This
distorts each image slightly, but not to the extent that planar
stitching would.
Methods
I used a Canon Powershot SX100 IS, a tripod, and a Kaidan mount to
gather my images. I took 18 images by rotating the camera, covering a
full 360 degrees. The images were named in numerical order, starting
with '1.jpeg' and ending with '18.jpeg'. The first image was copied and
the copy was named '19.jpeg', so that the program would append it to the
last image in the sequence for me. Each image was then warped using
inverse cylindrical warping:
x' = f*tan((x-xCenter)/f) + xCenter;
y' = (y-yCenter)/cos((x-xCenter)/f) + yCenter;
where x and y are coordinates in the warped image, (xCenter, yCenter) is
the center of the image, and f is the focal length of the camera in
pixels. The pixel color of the warped image at
coordinates (x,y) is the color of the pixel at (x',y') in the original
image.
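As a rough sketch, the warp loop looks something like this in Matlab
(nearest-neighbor sampling and a single-channel image are assumed here,
and the function name is only illustrative):

function warped = cylWarpSketch(img, f)
    % Inverse cylindrical warp with nearest-neighbor sampling;
    % img is a single-channel image, f is the focal length in pixels.
    [h, w] = size(img);
    xCenter = w / 2;
    yCenter = h / 2;
    warped = zeros(h, w, class(img));
    for y = 1:h
        for x = 1:w
            theta = (x - xCenter) / f;
            xp = round(f * tan(theta) + xCenter);             % x' in the original image
            yp = round((y - yCenter) / cos(theta) + yCenter); % y' in the original image
            if xp >= 1 && xp <= w && yp >= 1 && yp <= h
                warped(y, x) = img(yp, xp);
            end
        end
    end
end

Bilinear interpolation would give slightly smoother results, but
nearest-neighbor sampling keeps the sketch simple.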
After each image is warped, I use SIFT to extract its feature points.
SIFT comes with a Matlab script that lets you call the SIFT
program directly from Matlab, so I call the script every time an image
is warped. The feature descriptors and locations are saved into two text
files, numbered to match the image they describe. My program then
iterates through each image, matching its features with the next image
in the sequence, and stitches those two images together. I decided to
use the same matching method that SIFT uses: I decide whether two
features match by taking the dot products of one image's descriptors
with the other image's descriptors and comparing the best two results
against each other. I was originally going to use Euclidean distance,
but that turned out to be far too slow, while SIFT's method was fast and
accurate.
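A rough sketch of that matching step, assuming the descriptors are unit
vectors stored one per row (the function name and the 0.6 ratio
threshold are only illustrative):

function matches = matchSketch(desc1, desc2)
    % Accept a match only when the best candidate is clearly better
    % than the second best, measured by the angle between descriptors.
    distRatio = 0.6;                  % illustrative ratio threshold
    matches = [];                     % each row: [index in image 1, index in image 2]
    dotProds = desc1 * desc2';        % cosine similarity of every descriptor pair
    for i = 1:size(desc1, 1)
        cosines = min(max(dotProds(i, :), -1), 1);    % clamp for acos
        [angles, idx] = sort(acos(cosines));          % smallest angle = best candidate
        if angles(1) < distRatio * angles(2)
            matches = [matches; i, idx(1)];           % keep only unambiguous pairings
        end
    end
end

Each row of matches pairs a feature in one image with its match in the
next image, which is all the translation step needs.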
To get the average translation vector between one image and another, I
used the RANSAC method. A translation between a pair of matched features
was randomly selected, and every translation vector within two pixels of
it was counted as an inlier. The largest group of inliers was then
averaged to get the translation vector from one image to the next. After
this vector was calculated, the second
image was placed on top of the first image, using the vector to
translate it to the proper position. The two images were blended
together with a simple linear transition between the first image and the
second. This process was repeated for each pair of images in the
sequence until the final image was stitched into the panorama. I ran
this program on both my set of images and the test images.
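A rough sketch of the RANSAC step described above, where shifts is an
N-by-2 matrix holding one candidate (dx, dy) translation per feature
match (the 500 iterations and the function name are only illustrative):

function bestShift = ransacShiftSketch(shifts)
    bestInliers = [];
    for iter = 1:500
        candidate = shifts(randi(size(shifts, 1)), :);    % pick one match at random
        dists = sqrt(sum((shifts - repmat(candidate, size(shifts, 1), 1)) .^ 2, 2));
        inliers = find(dists < 2);                        % within two pixels of the candidate
        if numel(inliers) > numel(bestInliers)
            bestInliers = inliers;
        end
    end
    bestShift = mean(shifts(bestInliers, :), 1);          % average the largest inlier set
end

The linear transition can likewise be sketched as a simple weight ramp
across the overlap, where left and right stand for the overlapping
strips of the two images:

ov = size(left, 2);                                    % width of the overlap in columns
alpha = repmat(linspace(1, 0, ov), size(left, 1), 1);  % weight ramps from 1 down to 0
blended = alpha .* double(left) + (1 - alpha) .* double(right);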
Finishing Touches and Results
There was a good amount of drift in my set of images and the test
images, so the panoramas assembled by my program still needed cropping
and adjusting. I cropped both panoramas in GIMP so that the first image
was split down the middle, with one half at the left edge of the
panorama and the other at the right edge; that way the panorama is
seamless if its ends are joined. I then wrote a function to calculate
the slope of the drift
and adjust the panorama's y values accordingly. Finally, I cropped each
panorama again to cut out the curved edges from the cylindrical warp, so
that the images showed no signs of seaming.
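The drift adjustment boils down to shifting each column of the panorama
vertically in proportion to how far across the panorama it sits. A
sketch of that idea, where driftPix is the total vertical drift in
pixels (the function name is illustrative, and the sign of the shift
depends on which way the panorama drifts):

function adjusted = removeDriftSketch(pano, driftPix)
    w = size(pano, 2);
    adjusted = pano;
    for x = 1:w
        shift = round(driftPix * (x - 1) / (w - 1));   % 0 at the left edge, driftPix at the right
        adjusted(:, x, :) = circshift(pano(:, x, :), [-shift 0 0]);   % wrapped rows are cropped away later
    end
end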
Here is the panorama from the test images:

Here is the panorama from my set of images:
I think both panoramas turned out very nicely, although there is a
mysterious ghost tree on the left of the second one, where SIFT had
trouble finding feature matches. I think this is
because the sun was shining directly into the camera and it was a little
bit windy, making the two pictures of the tree just different enough
that they couldn't be matched. That being said, the rest of the blending
and matching between images looks really good, and the second panorama
(from my set of images) is my favorite of the two.
Running my Program
To run my program, you simply need to open Matlab in the folder with my
program files and run makePanorama(photos, outputDir,
numPhotos), where photos is the directory where the photos
are located, outputDir is where you want all of the warped
images, descriptor files, and panoramas to go, and numPhotos is
the number of photos in the sequence (not counting the duplicated first
photo at the end). This will output several temporary panorama images as
they are stitched together, but the very last panorama made will be the
one you want. You can then crop it in GIMP as described above, save it,
and adjust it by running adjustPanorama(filename,
outputFilename) in Matlab.
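For example, a run might look like this (the directory and file names
are only placeholders):

makePanorama('photos/', 'output/', 18);
% crop the final panorama in GIMP, save it, and then:
adjustPanorama('croppedPanorama.jpg', 'adjustedPanorama.jpg');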