View interpolation is the process of creating a sequence of synthetic images that, taken together, represent a smooth transition from one view of a scene to another. Two themes have dominated my work in this area: (1) using only two original views of the scene to generate the sequence, and (2) allowing for scenes that contain motion (called dynamic scenes).
Typically, view interpolation algorithms also allow for extrapolation (i.e., continuing to generate synthetic images beyond the end of the interpolation sequence); this is true of all of my algorithms.
My first view-interpolation research extended Steve Seitz's view morphing technique to work with certain kinds of dynamic scenes; the work was called dynamic view morphing (see Russell A. Manning and Charles R. Dyer, Interpolating view and scene motion by dynamic view morphing, Proc. Computer Vision and Pattern Recognition, Fort Collins, Colorado, June 1999, pages I:388-394).
The Problem: Given two views of a dynamic scene, taken by different cameras in different locations at different times, create a sequence of views showing a smooth transition between the two camera views that also shows the dynamic scene changing smoothly.
Example: The left and right images below are photographs of the desk in my office, taken from two different positions and with two different focal lengths. All of the objects in the scene (the books, papers, computer monitor, etc.) remained stationary EXCEPT for the orange-colored box in the foreground. The box is photographed in two different positions, although in both positions it is flush with the edge of the table. From the original reference views it is impossible to know exactly how the box moved during the missing time. However, a reasonable guess is that the box slid in a straight line along the edge of the table. If the object had been a car driving across a bridge, then it would have been crucial to portray the car traveling in a straight line.
| Input View 1 | Synthesized View | Input View 2 |
|---|---|---|
Entire sequence is available as a movie: [0.4M mpeg] [9.9M gzipped QuickTime]
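The straight-line guess described above amounts to linearly interpolating the object's position between its two observed placements; the same formula with a parameter outside [0, 1] gives the extrapolation mentioned earlier. A minimal sketch of that idea, where the function name and the table-edge coordinates are my own illustration, not code from the paper:

```python
import numpy as np

def interpolate_position(p1, p2, s):
    """Linearly interpolate an object's position between two reference views.

    s = 0 gives the position in view 1, s = 1 the position in view 2;
    values of s outside [0, 1] extrapolate beyond the reference views.
    """
    p1 = np.asarray(p1, dtype=float)
    p2 = np.asarray(p2, dtype=float)
    return (1.0 - s) * p1 + s * p2

# Hypothetical coordinates for the box along the table edge in the two
# views: halfway through the sequence the box sits midway along the edge.
print(interpolate_position((0.0, 0.0), (3.0, 0.0), 0.5))  # [1.5 0. ]
```

With s = 1.5, the same call places the box beyond its position in view 2, which is exactly the extrapolation behavior noted at the top of this page.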
Because of the robustness of my approach, it is possible to "paste" a synthetic object onto the two reference views so that it looks approximately correct for the scene, and then to interpolate between the augmented views. In the example below, a synthetic truck is added so that it appears to rest on the tabletop in both reference views. The truck changes position between the reference views, and the interpolation sequence shows the motion of the synthetic object.
| Input View 1 | Synthesized View | Input View 2 |
|---|---|---|
Entire sequence is available as a movie: [0.1M mpeg] [1.5M SGI Cosmo] [6.3M QuickTime]
As originally published, the view morphing algorithm did not allow for camera motion into or out of the viewing window (i.e., the epipole between the two views could not be contained in either view). Because dynamic scenes often contain objects moving towards or away from the camera, and because allowing camera motion towards or away from the view is desirable in general, I created the environment map morphing technique. Put simply, an environment map is a panoramic or wide-angle view; in the extreme, an environment map can cover the entire sphere around the camera. Environment map morphing makes it possible to perform view interpolation between two arbitrary, uncalibrated environment maps while preserving the scanline property.
| First Frame | Synthesized Middle Frame | Last Frame |
|---|---|---|
Entire sequence is available as a movie: [0.5M mpeg] [4.8M SGI Cosmo] [13M QuickTime]
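The scanline property mentioned above means that, after the views have been rectified (the prewarp step of view morphing), corresponding points lie on the same image row, so blending only slides each point horizontally along its scanline. A minimal sketch of that per-scanline blending step, assuming rectified point correspondences are already available (the function name and coordinates are my own illustration, not code from the papers):

```python
import numpy as np

def morph_rectified_points(pts1, pts2, s):
    """Blend corresponding points from two rectified (prewarped) views.

    The scanline property guarantees that corresponding points share the
    same row (y coordinate), so blending moves each point only
    horizontally along its scanline. s in [0, 1] interpolates; values
    outside that range extrapolate.
    """
    pts1 = np.asarray(pts1, dtype=float)
    pts2 = np.asarray(pts2, dtype=float)
    # Sanity check: rectification should have aligned the rows.
    assert np.allclose(pts1[:, 1], pts2[:, 1]), "views are not rectified"
    return (1.0 - s) * pts1 + s * pts2

# Two corresponding points on scanline y = 5 (hypothetical coordinates):
# the blended point stays on the same row at every value of s.
print(morph_rectified_points([[10.0, 5.0]], [[20.0, 5.0]], 0.5))
```

Keeping the blend confined to scanlines is what makes the interpolation efficient and geometrically consistent; the full pipeline also includes the prewarp and postwarp steps, which are not sketched here.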