Removing blur due to camera shake from images
Transcript of "Removing blur due to camera shake from images."
William T. Freeman
Joint work with Rob Fergus, Anat Levin, Yair Weiss, Frédo Durand, Aaron Hertzmann, Sam Roweis, Barun Singh
Massachusetts Institute of Technology
Overview
[figure: Original | Our algorithm]
Close-up:
[figure: Original | Naïve sharpening | Our algorithm]
Let’s take a photo
Blurry result
Slow-motion replay
Motion of camera
Image formation process
Blurry image = Sharp image ⊗ Blur kernel
Input to algorithm: the blurry image. Desired output: the sharp image and the blur kernel.
⊗ is the convolution operator; the model is an approximation.
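The formation model above is just a convolution plus noise and can be sketched in a few lines. This is an illustrative NumPy version, not the talk's code: the kernel, noise level, and circular-boundary FFT convolution are all simplifying assumptions.

```python
# Illustrative sketch of the formation model: blurry = sharp (conv) kernel + noise.
import numpy as np

def convolve2d_same(img, kernel):
    """Same-size 2-D convolution via FFT (circular boundaries, for simplicity)."""
    kh, kw = kernel.shape
    padded = np.zeros_like(img)
    padded[:kh, :kw] = kernel
    # Shift so the kernel is centred and the output is not translated.
    padded = np.roll(padded, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(padded)))

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))              # stand-in for the unknown sharp image
kernel = np.zeros((7, 7))
kernel[3, :] = 1.0                        # a horizontal motion-blur streak
kernel /= kernel.sum()                    # blur kernels are normalized to sum to 1
noise = 0.01 * rng.standard_normal(sharp.shape)
blurry = convolve2d_same(sharp, kernel) + noise   # the input to the algorithm
```

Given only `blurry`, the algorithm must recover both `sharp` and `kernel`.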
Why is this hard?
Simple analogy: 11 is the product of two numbers. What are they?
No unique solution: 11 = 1 × 11, 11 = 2 × 5.5, 11 = 3 × 3.667, etc.
Need more information!
Multiple possible solutions
Blurry image = Sharp image ⊗ Blur kernel
(the same blurry image can be explained by many different sharp image / blur kernel pairs)
Is each of the following images sharp or blurred?
Another blurry one
Natural image statistics
Histogram of image gradients
Characteristic distribution with heavy tails
Blurry images have different statistics
Histogram of image gradients
Parametric distribution
Histogram of image gradients
Use parametric model of sharp image statistics
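The claim above, that sharp images have heavy-tailed gradient histograms while blurry ones do not, can be checked numerically. This is an illustrative sketch, not the talk's experiment: a blocky stand-in for a sharp image is blurred with a box filter, and kurtosis of the gradients serves as a heavy-tail summary.

```python
# Compare gradient statistics of a sharp-like image and its blurred version.
import numpy as np

rng = np.random.default_rng(1)
# Piecewise-constant "sharp" image: mostly zero gradients plus a few large ones.
sharp = np.repeat(rng.random((32, 8)), 8, axis=1)        # 32 x 64, blocky
grad_sharp = np.diff(sharp, axis=1).ravel()

# Horizontal box blur: the large, rare gradients get smeared into many small ones.
k = np.ones(9) / 9.0
blurry = np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, sharp)
grad_blur = np.diff(blurry, axis=1).ravel()

def kurtosis(g):
    """Fourth moment over squared variance: large values indicate heavy tails."""
    g = g - g.mean()
    return (g**4).mean() / (g**2).mean() ** 2

k_sharp, k_blur = kurtosis(grad_sharp), kurtosis(grad_blur)  # sharp > blurred
```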
Uses of natural image statistics
• Denoising [Roth and Black 2005]
• Superresolution [Tappen et al. 2005]
• Intrinsic images [Weiss 2001]
• Inpainting [Levin et al. 2003]
• Reflections [Levin and Weiss 2004]
• Video matting [Apostoloff & Fitzgibbon 2005]
Corruption process assumed known
Existing work on image deblurring
Software algorithms:
– Extensive literature in the signal processing community
– Mainly Fourier- and/or wavelet-based
– Strong assumptions about blur, not true for camera shake
– Image constraints are frequency-domain power laws
Assumed forms of blur kernels
A focus on image constraints, not image priors
Some image constraints/priors
Toy example: observed "image"
[figure, shown over three slides: a 1-D signal with values between 0.0 and 1.0]
Three sources of information
1. Reconstruction constraint:
   Input blurry image = Estimated sharp image ⊗ Estimated blur kernel
2. Image prior:
   Distribution of gradients
3. Blur prior:
   Positive & sparse
Prior on image gradients (mixture of Gaussians giving a Laplacian-like distribution)
Distribution of gradients (log scale); the green curve is our mixture-of-Gaussians fit.
Prior on blur kernel pixels (mixture of exponentials)
[plot: P(b) vs. b]
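A mixture-of-Gaussians gradient prior of the kind fitted above can be evaluated directly. The weights and variances below are illustrative placeholders, not the talk's fitted values.

```python
# Zero-mean mixture-of-Gaussians prior on image gradients: narrow components
# dominate near zero, broad components supply the heavy tails, so the mixture
# approximates the Laplacian-like gradient histogram.
import numpy as np

weights = np.array([0.6, 0.3, 0.1])       # mixture weights (sum to 1) - illustrative
variances = np.array([0.001, 0.01, 0.1])  # narrow ... broad components - illustrative

def log_prior(grad):
    """log P(gradient) under the zero-mean mixture of Gaussians."""
    g = np.asarray(grad, dtype=float)[..., None]
    comp = weights / np.sqrt(2 * np.pi * variances) * np.exp(-g**2 / (2 * variances))
    return np.log(comp.sum(axis=-1))

# Small gradients are far more probable than large ones:
assert float(log_prior(0.0)) > float(log_prior(0.5))
```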
How do we use this information?
Obvious thing to do:
– Combine the 3 terms into an objective function
– Run conjugate gradient descent
– This is Maximum a-Posteriori (MAP)
Maximum a-Posteriori
y – observed blurry image, x – unobserved sharp image, b – blur kernel
i – image patch index, f – derivative filter
argmax over x, b of P(x, b | y) ∝ P(y | x, b) × P(x) × P(b)
(likelihood) × (latent image prior) × (blur prior)
Assumption: all pixels independent of one another
Sparse and positive
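Written as code, the combined objective looks roughly like this in 1-D. It is a simplified stand-in, not the talk's actual objective: an L1 penalty on gradients replaces the mixture-of-Gaussians image prior, and an L1 penalty on the kernel stands in for the positive-and-sparse blur prior.

```python
# Toy 1-D negative log-posterior: data term + image prior + blur prior.
import numpy as np

def neg_log_posterior(x, b, y, noise_var=1e-2, lam_img=1.0, lam_blur=1.0):
    """Smaller is better; MAP estimation minimizes this jointly over x and b."""
    residual = np.convolve(x, b, mode="same") - y        # reconstruction constraint
    data = (residual**2).sum() / (2 * noise_var)         # likelihood term
    img_prior = lam_img * np.abs(np.diff(x)).sum()       # heavy-tailed gradient prior (L1 stand-in)
    blur_prior = lam_blur * np.abs(b).sum()              # sparse, positive kernel prior (L1 stand-in)
    return data + img_prior + blur_prior
```

Running (conjugate) gradient descent on this joint objective is the MAP approach whose results the next slides examine.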
Results from MAP estimation
Maximum a-Posteriori (MAP) vs. our method: Variational Bayes
Input blurry image
Variational Bayes
http://citeseer.ist.psu.edu/cache/papers/cs/16537/http:zSzzSzwol.ra.phy.cam.ac.ukzSzjwm1003zSzspringer_chapter8.pdf/miskin00ensemble.pdf
Miskin and MacKay, 2000
Setup of variational approach
Need likelihood and prior in the same space, so use gradients:
Likelihood
Prior on latent image gradients – mixture of Gaussians
Prior on blur elements – mixture of exponentials
i – image pixel, j – blur pixel
We use C=4, D=4
Also have Gamma hyperpriors on the inverse noise variance
Variational inference
• Approximate the posterior P(x, b | y) with a factored distribution q(x) q(b)
• Assume q(x) is Gaussian on each pixel and q(b) is rectified Gaussian on each pixel
• Cost function: KL divergence between q(x) q(b) and the true posterior
• Use gradient descent, alternating between updating q(x) while marginalizing out over b, and vice versa
• Adapted code from Miskin & MacKay 2000
Variational Bayesian method
Based on work of Miskin & MacKay 2000
Keeps track of uncertainty in estimates of image and blur by using a distribution instead of a single estimate
Helps avoid local maxima and over-fitting
Objective function for a single variable
[plot: score vs. pixel intensity, comparing the Variational Bayesian method with Maximum a-Posteriori (MAP)]
MAP vs Variational
[result comparison: MAP, Variational, and MAP using variational initialization]
Blurry synthetic image
Inference – initial scale
Inference – scale 2
Inference – scale 3
Inference – scale 4
Inference – scale 5
Inference – scale 6
Inference – final scale
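The scale-by-scale slides above follow a coarse-to-fine schedule, which can be sketched as a scaffold. Here `estimate_kernel_at_scale` is a hypothetical stand-in for the variational inference step, and the box down-sampling and nearest-neighbour kernel up-sampling are simplifications.

```python
# Coarse-to-fine scaffold for multi-scale kernel estimation:
# estimate at a coarse scale, upsample the kernel, refine at the next scale.
import numpy as np

def downsample(img, factor):
    """Crude box downsampling by an integer factor."""
    h = (img.shape[0] // factor) * factor
    w = (img.shape[1] // factor) * factor
    return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def upsample_kernel(k):
    """Double the kernel resolution (nearest neighbour), then renormalize."""
    k2 = np.repeat(np.repeat(k, 2, axis=0), 2, axis=1)
    return k2 / k2.sum()

def multiscale_estimate(blurry, n_scales, estimate_kernel_at_scale):
    kernel = np.ones((3, 3)) / 9.0                 # small initial kernel
    for s in range(n_scales, 0, -1):
        coarse = downsample(blurry, 2 ** (s - 1))  # coarse to fine
        kernel = estimate_kernel_at_scale(coarse, kernel)
        if s > 1:
            kernel = upsample_kernel(kernel)       # seed the next, finer scale
    return kernel
```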
Our output
Ground truth
MATLAB's deconvblind
True kernel Estimated kernel
We tried the same algorithm on an image with real camera blur and a very similar blur kernel
Failure!
Whiteboard scene:
Does camera shake give a stationary blur kernel?
8 different people, handholding camera, using 1 second exposure
View of the dots at each corner in photos taken by 4 people
[grid of crops: top-left, top-right, bottom-left, bottom-right corners, for Person 1 through Person 4]
Tone scale: output from camera
A linear response to light intensities looks like this
Overview of algorithm
Input image
1. Pre-processing
2. Kernel estimation – multi-scale approach
3. Image reconstruction – standard non-blind deconvolution routine
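The transcript only says "standard non-blind deconvolution routine". A common choice for such a step (an assumption here, not confirmed by the slides) is Richardson-Lucy iteration, which uses the estimated kernel and no longer has to infer it:

```python
# Minimal Richardson-Lucy non-blind deconvolution: the kernel is known,
# and the sharp image is refined by multiplicative updates.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurry, kernel, n_iter=30, eps=1e-12):
    estimate = np.full_like(blurry, 0.5)          # flat initial guess
    flipped = kernel[::-1, ::-1]                  # adjoint of the convolution
    for _ in range(n_iter):
        denom = fftconvolve(estimate, kernel, mode="same")
        ratio = blurry / np.maximum(denom, eps)   # where the prediction is off
        estimate = estimate * fftconvolve(ratio, flipped, mode="same")
    return estimate
```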
Preprocessing
Convert to grayscale
Input image
Remove gamma correction
User selects patch from image
Bayesian inference too slow to run on whole image
Infer kernel from this patch
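The preprocessing box above (grayscale conversion, gamma removal, patch selection) can be sketched as below. The pure power-law tone curve with gamma = 2.2 is an assumption; as the tone-scale slide notes, a real camera's output is not linear in light intensity, and its exact curve differs per camera.

```python
# Sketch of preprocessing: undo gamma correction, then convert to grayscale.
# The power-law curve with gamma = 2.2 is an assumption; real cameras differ.
import numpy as np

def preprocess(rgb, gamma=2.2):
    """rgb: float array in [0, 1], shape (H, W, 3) -> linear-intensity grayscale."""
    linear = np.clip(rgb, 0.0, 1.0) ** gamma    # invert the gamma correction
    return linear.mean(axis=2)                  # simple grayscale conversion

# A user-selected patch would then be cut from this grayscale image
# before kernel inference is run on it.
patch = preprocess(np.random.default_rng(2).random((16, 16, 3)))
```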
Initialization
Input image → convert to grayscale → remove gamma correction → user selects patch from image
Initialize 3×3 blur kernel
[figure: blurry patch | initial blur kernel | initial image estimate]