Uncertainty Quantification in Bayesian Inversion
Jonas Latz
Technische Universität München, Mathematics Faculty, Chair of Numerical Mathematics (M2)
November 17, 2016


Outline
• Brief introduction to Bayesian Statistics and Bayesian Inverse Problems
• Andrew M. Stuart - Uncertainty Quantification in Bayesian Inversion


Bayesian Statistics

Setting:
• Observation: data point y ∈ Rⁿ,
• Parameterized distribution of y: P(y ∈ ·|u), given by a Lebesgue density

    dP(y ∈ ·|u)/dλⁿ(x) =: L(x|u),

• Parameter u ∈ X.

Task: Identify the parameter u based on the observed data point y.

Classical Inference: Maximum likelihood estimate: u :∈ argmax_{u∈X} [log L(y|u)].

Bayesian Inference: The unknown parameter u is now modelled as a random variable, reflecting knowledge about u.

Prior: µ_prior := P(u ∈ ·) (knowledge/distribution before seeing the data)
Posterior: µ_post := P(u ∈ ·|y) (knowledge/distribution after seeing the data)
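
To make the contrast concrete, here is a minimal Python sketch (not from the slides; the model y | u ∼ N(u, σ²) with one data point and a Gaussian prior are assumptions chosen for illustration) comparing the maximum likelihood estimate with the resulting posterior.

    # Minimal sketch, assuming y | u ~ N(u, sigma^2) and prior u ~ N(m0, s0^2);
    # these choices are illustrative, not part of the slides.
    import numpy as np

    sigma = 0.5                   # known observation noise std (assumed)
    m0, s0 = 0.0, 1.0             # prior mean and std (assumed)

    rng = np.random.default_rng(0)
    u_truth = 0.8
    y = u_truth + sigma * rng.standard_normal()   # one observed data point

    # Classical inference: the MLE of u maximizes log L(y|u), which is y here.
    u_mle = y

    # Bayesian inference: with this conjugate Gaussian prior the posterior is
    # again Gaussian, with combined precision and precision-weighted mean.
    prec_post = 1.0 / s0**2 + 1.0 / sigma**2
    s_post = np.sqrt(1.0 / prec_post)
    m_post = s_post**2 * (m0 / s0**2 + y / sigma**2)

    print(f"MLE: {u_mle:.3f}, posterior: N({m_post:.3f}, {s_post:.3f}^2)")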


Bayesian Statistics: How to derive the posterior?

Under further assumptions, the posterior can be derived using Bayes' formula:

    dµ_post/dµ_prior(u) = L(y|u) / ∫ L(y|u) dµ_prior(u) =: L(y|u) / Z(y).

Assume u ∈ X = Rᵏ and µ_prior has a pdf f_prior; then

    f_post(u) = L(y|u) f_prior(u) / ∫ L(y|u) dµ_prior = L(y|u) f_prior(u) / Z(y).
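
As a small numerical illustration of this formula (the one-dimensional u, standard normal prior, and Gaussian likelihood below are assumptions made for the sketch, not taken from the slides), the posterior pdf can be approximated on a grid by normalizing L(y|u) f_prior(u) with an estimate of Z(y):

    # Minimal sketch: grid approximation of f_post(u) = L(y|u) f_prior(u) / Z(y).
    # The likelihood and prior below are toy choices, not taken from the slides.
    import numpy as np

    y = 1.2                                    # observed data point (made up)
    sigma = 0.5                                # noise std of the toy likelihood

    def likelihood(y, u):
        # L(y|u): density of N(u, sigma^2) evaluated at y
        return np.exp(-0.5 * ((y - u) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    def prior_pdf(u):
        # f_prior: standard normal density
        return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

    u_grid = np.linspace(-4.0, 4.0, 2001)
    du = u_grid[1] - u_grid[0]
    unnormalized = likelihood(y, u_grid) * prior_pdf(u_grid)
    Z = unnormalized.sum() * du                # Z(y) ≈ ∫ L(y|u) f_prior(u) du
    f_post = unnormalized / Z                  # posterior pdf on the grid

    post_mean = (u_grid * f_post).sum() * du
    print(f"Z(y) ≈ {Z:.4f}, posterior mean ≈ {post_mean:.4f}")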


Bayesian Statistics: How to derive the posterior?

Analytical: Only possible for specific prior + likelihood pairs (conjugate priors).

Computational: Produce (weighted) samples of the posterior to approximate it empirically (a small importance-sampling sketch follows the list).
• Importance Sampling (requires estimating Z(y))
• Markov Chain Monte Carlo (MCMC; does not yield independent samples of µ_post)
• Sequential Monte Carlo (SMC; efficient even if the posterior is multimodal or concentrated; requires estimating Z(y))
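
A minimal importance-sampling sketch, assuming a one-dimensional toy model with standard normal prior (used as the proposal) and Gaussian likelihood; the weights w_i = L(y|u_i) estimate Z(y) through their mean, and their normalized values give a weighted-sample approximation of µ_post:

    # Minimal sketch of importance sampling with the prior as proposal.
    # Model choices are illustrative assumptions, not from the slides.
    import numpy as np

    rng = np.random.default_rng(1)
    y = 1.2          # observed data point (made up)
    sigma = 0.5      # noise std of the toy likelihood

    def likelihood(y, u):
        # L(y|u): density of N(u, sigma^2) evaluated at y
        return np.exp(-0.5 * ((y - u) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    N = 100_000
    u_samples = rng.standard_normal(N)        # u_i ~ µ_prior = N(0, 1)
    w = likelihood(y, u_samples)              # importance weights w_i = L(y|u_i)

    Z_hat = w.mean()                          # Monte Carlo estimate of Z(y)
    w_norm = w / w.sum()                      # self-normalized weights
    post_mean = (w_norm * u_samples).sum()    # estimate of E[u | y]

    print(f"Z(y) ≈ {Z_hat:.4f}, posterior mean ≈ {post_mean:.4f}")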


Inverse Problem
• G : X → Y is the forward response operator,
• η ∼ N(0, Γ) is noise,
• u_truth ∈ X is the true model parameter,
• y ∈ Y is (noisy) observed data of the model, i.e. given by y := G(u_truth) + η (illustrated in the sketch below).

Task: Identify the parameter u_truth based on the data y.

Figure: log-permeability of an oil reservoir. (a) True parameter, (b) Estimation.
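
A minimal sketch of this data-generating model, assuming a linear toy forward operator in place of the reservoir simulator behind the figure:

    # Minimal sketch: synthetic data y = G(u_truth) + η for a linear toy
    # forward operator G(u) = A u and Gaussian noise η ~ N(0, Γ).
    # All dimensions and values are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(2)

    k, n = 3, 5                               # dim(X) = k, dim(Y) = n
    A = rng.standard_normal((n, k))           # toy forward response operator
    Gamma = 0.05 * np.eye(n)                  # noise covariance Γ

    u_truth = np.array([1.0, -0.5, 2.0])      # true model parameter
    eta = rng.multivariate_normal(np.zeros(n), Gamma)   # noise η ~ N(0, Γ)
    y = A @ u_truth + eta                     # observed data y := G(u_truth) + η

    print("observed data y:", np.round(y, 3))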


Bayesian Inverse Problem

Let u ∼ µ_0 := µ_prior and (u, η) be independent. Then

    G(u) + η = y  ⇔  η = y − G(u),

and therefore

    y − G(u) ∼ N(0, Γ).

The likelihood is then given by

    L(y|u) := φ_{0,Γ}(y − G(u)) := exp(−½ ‖Γ^{-1/2}(y − G(u))‖₂²).

The posterior µ^y := µ_post is then given by Bayes' formula:

    µ^y = (1/Z(y)) exp(−½ ‖Γ^{-1/2}(y − G(·))‖₂²) µ_0.
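
A minimal sketch of the resulting unnormalized log-posterior for a toy linear forward model with a standard normal prior (both assumptions for illustration); evaluating this quantity is all that samplers such as MCMC or SMC need in order to target µ^y:

    # Minimal sketch: log of the unnormalized posterior density for G(u) = A u,
    # prior µ_0 = N(0, I) and noise covariance Γ (illustrative assumptions).
    import numpy as np

    rng = np.random.default_rng(3)
    k, n = 3, 5
    A = rng.standard_normal((n, k))           # toy forward operator
    Gamma = 0.05 * np.eye(n)
    Gamma_inv = np.linalg.inv(Gamma)

    u_truth = np.array([1.0, -0.5, 2.0])
    y = A @ u_truth + rng.multivariate_normal(np.zeros(n), Gamma)

    def log_post_unnormalized(u):
        # -1/2 ||Γ^{-1/2}(y - G(u))||₂² + log prior density (up to constants)
        residual = y - A @ u
        misfit = 0.5 * residual @ Gamma_inv @ residual
        log_prior = -0.5 * u @ u
        return -misfit + log_prior

    print(log_post_unnormalized(u_truth), log_post_unnormalized(np.zeros(k)))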


References
[1] Dashti, M. and Stuart, A.M. (2015). The Bayesian Approach to Inverse Problems. Handbook of Uncertainty Quantification, Springer.
[2] Iglesias, M.A., Law, K.J.H. and Stuart, A.M. (2013). Ensemble Kalman methods for inverse problems. IOPscience.
[3] Latz, J. (2016). Bayes Linear Methods for Inverse Problems. Master's thesis, University of Warwick.
[4] Law, K.J.H., Stuart, A.M. and Zygalakis, K.C. Data Assimilation: A Mathematical Introduction. Springer.
[5] Schillings, C. and Stuart, A.M. (2016). Analysis of the ensemble Kalman filter for inverse problems. Preprint.
[6] Stuart, A.M. (2010). Inverse problems: a Bayesian perspective. Acta Numerica 19.
[7] Sullivan, T.J. (2015). Introduction to Uncertainty Quantification. Springer.
