CS766 Project 1: High Dynamic Range Imaging

Project Report

Aubrey Barnard

Thursday, September 25, 2008

Description

For this project I chose to work with Matlab. This had the advantage of making it easy to construct and solve the systems of linear equations involved in recovering the imaging system response curve. It had the disadvantage of making everything else harder, partly because of my unfamiliarity with it, and partly because of concrete issues such as saving the HDR image once it was created.

Otherwise, my implementation is straightforward. The inputs are a series of images, the shutter speeds corresponding to those images, and a set of points used to sample the images. The output is an HDR image which can then be tone-mapped by some other software.

How to Run My Project

Since my project is implemented in Matlab, you have to run it from within Matlab. (There is no executable.) Change the Matlab working directory to the one containing my Matlab code files. From there, load the images into a cell array, put the shutter speeds into a matrix, and make a list of points that the program will use to reconstruct the imaging response curves. Finally, pass those arguments plus a smoothing factor to my program and wait a long time. When it finishes, hopefully you will have an HDR image.

The following Matlab code illustrates this process:

imgnums = 201:-3:163;  % image file numbers for the thirteen exposures
for i = 1:13
    name = ['../images/bridge/IMG_0' int2str(imgnums(i)) '.JPG'];
    images{i} = imread(name);
end
shutterSpeeds = [1/500 1/250 1/125 1/58.8235 1/30.303 1/15 1/8 1/4 1/2 1 2 4 8];
points = createSamplePoints(images{1}, 10, 8);
hdr = createHDRImage(images, shutterSpeeds, points, 1);  % last argument is the smoothing factor

The result is saved to a file called 'image.hdr' in the working directory.

Third-Party Software

The results of this project would not have been possible without third-party software for working with HDR images. I chose pfstools (pfstools.sourceforge.net) for writing HDR images and for HDR tone mapping. (Actually, pfstmo does the tone mapping; it is a sub-project of pfstools and can be found along with it at sourceforge.net/projects/pfstools/.) One of the great benefits of pfstools is that it integrates with Matlab and Octave, so I could save HDR images directly from Matlab. Granted, there were a few setup headaches at first.

Indeed, I consider this a contribution to the knowledge base of this class: if future students choose to work with Matlab, they can be assured of having software equivalent to gil.

Another feature of the pfstools family of tools is the extraction of image parameters from the EXIF information contained in the image. There is a utility that reads the EXIF information in a set of images and reports it. Each line of the report contains information about one image in the format '<image-file-name> <reciprocal-exposure-time> <aperture-size> <iso-speed> 0'. This could enable automatic input of shutter speeds, although I did not implement this functionality. EXIF information for all my images accompanies each set of images in a file named *.exifinfo.
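Parsing such a report into shutter speeds would be simple. The sketch below (in Python rather than Matlab, with a function name of my own invention) shows one way it could work, assuming the line format quoted above; the second field is the reciprocal of the exposure time, so a 2-second exposure is reported as 0.5:

```python
def read_shutter_speeds(path):
    """Parse an EXIF report where each line has the form
    '<image-file-name> <reciprocal-exposure-time> <aperture-size> <iso-speed> 0'.
    Return a dict mapping image file name -> shutter speed in seconds."""
    speeds = {}
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 2:
                continue  # skip blank or malformed lines
            name, inv_time = fields[0], float(fields[1])
            speeds[name] = 1.0 / inv_time  # invert to get seconds
    return speeds
```

The resulting dictionary could then be ordered to match the image list before being passed to the HDR program.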

Results

My program appears to successfully create HDR images from a series of regular images. Unfortunately, it takes a really long time and there are some artifacts. There are some unexpected colors due to the motion of the earth. (Shadows move, stars move, the moon moves.) These are especially apparent in the night images with longer exposure times. There is some slight blurring due to not registering the images, but it is not nearly as severe as in the example HDR images used to demonstrate why registration should be done. Due to the high resolution of my images, the blur is not noticeable when the images are downsampled. There are some cool artifacts due to moving objects. These objects appear like ghosts, only partially visible. Surprisingly, plants and leaves moving in the breeze do not introduce much blur. This is probably because we perceive leaves as blobs anyway.

The images I want to submit for the image contest are 'bridge.hdr' and 'bridge.png'.

Implementation Details

The only detail of my implementation that deserves mention is that, while I based my imaging curve recovery code on that of Debevec, I reworked the whole procedure using self-documenting variable names and comments to demonstrate my understanding. This actually took several hours because I had to infer the structure of the equations in the linear system from the Debevec paper.
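For the curious, the structure of that linear system can be sketched compactly. The following Python/NumPy version illustrates Debevec and Malik's formulation (it is not my Matlab code; `gsolve` is the name used in their paper, while `weight` and the variable names are my own): one row per pixel sample relating the response curve g to the log irradiance and log shutter speed, one row fixing the curve's scale, and one smoothness row per interior pixel value.

```python
import numpy as np

def weight(z, z_min=0, z_max=255):
    # Hat weighting function: favor mid-range pixel values,
    # give zero weight to under- and over-exposed pixels
    mid = (z_min + z_max) / 2
    return np.where(z <= mid, z - z_min, z_max - z)

def gsolve(Z, B, lam):
    """Recover the log response curve g and log irradiances.
    Z:   (N, P) integer pixel values at N sample points over P exposures
    B:   (P,) log shutter speeds
    lam: smoothness factor
    Returns (g, lnE) with g over pixel values 0..255 and lnE per point."""
    n = 256
    N, P = Z.shape
    A = np.zeros((N * P + n - 1, n + N))
    b = np.zeros(A.shape[0])
    k = 0
    # Data rows: w(Z_ij) * (g(Z_ij) - lnE_i) = w(Z_ij) * B_j
    for i in range(N):
        for j in range(P):
            w = weight(Z[i, j])
            A[k, Z[i, j]] = w
            A[k, n + i] = -w
            b[k] = w * B[j]
            k += 1
    # Fix the curve's scale: g(127) = 0
    A[k, 127] = 1
    k += 1
    # Smoothness rows: lam * w(z) * (g(z-1) - 2 g(z) + g(z+1)) = 0
    for z in range(1, n - 1):
        w = weight(z)
        A[k, z - 1] = lam * w
        A[k, z] = -2 * lam * w
        A[k, z + 1] = lam * w
        k += 1
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    return x[:n], x[n:]
```

Each data row contributes one equation, so even a modest grid of sample points over a dozen exposures overdetermines the 256 + N unknowns, which is why least squares is the natural solution method.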

Extensions

I did not implement any extensions. I had enough headaches with Matlab and getting third-party software running.

Effort

I worked alone on this project.

Perhaps the grader and/or instructor for this project would be interested in a measure of how much effort the project took to complete. For the interested, this project took me 37.75 hours to complete. While there are many things that influence this amount, one of the main factors was finding and incorporating third-party software. Hence, I was not left with enough time to create extensions.

Significant Learning Outcomes

I have learned several things from completing this project. I have learned a few things about basic photography, including the relationship of f-stop, aperture, and exposure time. I have increased my familiarity with Matlab, and have reviewed basic linear algebra concepts. I have practiced compiling (and troubleshooting) third-party software. I have also practiced translating mathematical abstractions in papers into working code. These outcomes are valuable in their own right, but, unfortunately, all this has little to do with what I would consider important ideas, concepts, and algorithms relating to computer vision.