High-quality Video Denoising for Motion-based Exposure Control

Travis Portz Li Zhang Hongrui Jiang
Figure: (a) Constant Exposure; (b) Motion-based Auto Exposure; (c) Denoised Result of (b)

Traditional cameras set exposure time based on scene brightness; camera or subject motion then leads to motion blur (a). Newer cameras (e.g., the Canon SD1100 and Nikon COOLPIX S8100) use motion detection to shorten the exposure time for photo capture; large motion results in noisy images due to the short exposure and high ISO. We build a prototype camera system that applies this idea to video capture (b) and propose a new method for video denoising in the context of motion-based exposure control (c). Our method outperforms previous methods by exploiting high-quality frames in a video to enhance low-quality frames and by selectively operating in whichever regime state-of-the-art spatial or temporal denoising methods work best.
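To make the exposure-control idea concrete, below is a minimal sketch (not the authors' implementation) of how a motion-based auto-exposure rule might trade exposure time against ISO: as estimated inter-frame motion grows, the exposure is shortened to bound motion blur, and the ISO is raised to keep brightness roughly constant. All function names, parameters, and thresholds are illustrative assumptions.

```python
# Hedged sketch of motion-based exposure control (illustrative only).
# Exposure time shrinks as estimated motion grows; ISO rises to compensate
# for the lost light, which is why fast-motion frames come out noisier
# and benefit from the proposed denoising.

def motion_based_exposure(motion_px_per_frame, base_exposure_ms=33.0,
                          base_iso=100, max_blur_px=1.0, max_iso=3200):
    """Pick an exposure time and ISO given an estimate of scene/camera motion.

    motion_px_per_frame: estimated motion magnitude in pixels per frame.
    max_blur_px: largest motion blur (in pixels) we are willing to accept.
    (All defaults are hypothetical values for illustration.)
    """
    if motion_px_per_frame <= 0:
        return base_exposure_ms, base_iso

    # Shorten the exposure so the motion accumulated during the exposure
    # stays under max_blur_px.
    scale = min(1.0, max_blur_px / motion_px_per_frame)
    exposure_ms = base_exposure_ms * scale

    # Raise ISO to compensate for the shorter exposure, capped at the
    # sensor's maximum gain.
    iso = min(max_iso, base_iso / scale)
    return exposure_ms, iso


# Example: with 5 px/frame of motion, the exposure drops to ~6.6 ms and the
# ISO rises to ~500, trading motion blur for noise.
print(motion_based_exposure(5.0))
```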

Related project: Optical Flow in the Presence of Spatially-Varying Motion Blur

Paper

Travis Portz, Li Zhang, Hongrui Jiang. High-quality Video Denoising for Motion-based Exposure Control. IEEE International Workshop on Mobile Vision, November 2011. [PDF 2.8MB] Best Paper Honorable Mention.

Presentation

Slides with video [ZIP 12MB]
Slides only [PPTX 5.8MB]

Acknowledgments

This work is supported in part by NSF EFRI-0937847, NSF IIS-0845916, NSF IIS-0916441, a Sloan Research Fellowship, and a Packard Fellowship for Science and Engineering. Travis Portz is also supported by a graduate fellowship from the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison.

IWMV 2011 Additional Results


Synthetic Videos

Real Videos

When optical flow is involved in comparing denoising performance, we use our flow as input both for our denoising method and for the others (e.g., Liu and Freeman [13]) to ensure a fair comparison, unless otherwise specified.