Summer School on Data Assimilation and Inverse Problems
Lecture 3: Variational Data Assimilation I: 3dVar and Stability

Roland Potthast
Deutscher Wetterdienst / University of Reading
Reading, UK, July 22-26, 2013
Contents Lecture 3

Intro
    Setup of State Space, Model, Measurements
Recall: Regularization Approaches
    Operator Inversion Stabilized
    Recall: A Minimization Approach
    Spectral Inversion Methods
Variational Assimilation
    Variational Data Assimilation
    Linear and Nonlinear 3dVar
    Bayes Formula and Data Assimilation
    Error Analysis for Cycled Assimilation
Setup I: States and Model Dynamics

We want to calculate the state ϕ of a dynamical system. The state space is a Hilbert space X. The model is denoted by M, where

    M_k : ϕ_k ↦ ϕ_{k+1}

maps the state ϕ_k at time t_k onto the state ϕ_{k+1} at time t_{k+1} for k = 0, 1, 2, ....
Notation

We will write ϕ for the states, usually ϕ_k for the state at time t_k. In discretized form it is a vector in R^n, i.e. we have

    ϕ_k = (ϕ_{k,1}, ϕ_{k,2}, ..., ϕ_{k,n})^T.

For the model, M[t_k, t_ξ] maps a state at time t_k onto its evolution from time t_k to time t_ξ. The model has the property that

    M[t_k, t_ξ] = M_{ξ−1} ∘ M_{ξ−2} ∘ ... ∘ M_k    (1)

for ξ > k.
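The composition property (1) can be sketched numerically. This is a minimal illustration; the linear one-step map A and the two-dimensional state are illustrative assumptions, not part of the lecture.

```python
import numpy as np

def propagate(phi, models):
    """Apply M[t_k, t_xi] = M_{xi-1} o ... o M_k to a state phi."""
    for M_step in models:
        phi = M_step(phi)
    return phi

# Illustrative linear one-step model M_k: phi -> A @ phi (an assumption).
A = np.array([[0.9, 0.1],
              [0.0, 0.95]])
M = lambda phi: A @ phi

phi0 = np.array([1.0, 2.0])        # state at t_0
phi3 = propagate(phi0, [M, M, M])  # state at t_3 via M_2 o M_1 o M_0
```

For a time-independent linear model, three applications of M coincide with applying A³ once.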
Notation

Recall some notation. For a state ϕ_k ∈ R^n at time t_k we write

    ‖ϕ_k‖² := ∑_{j=1}^{n} |ϕ_{k,j}|²    (2)

for the Euclidean norm, with metric d(ϕ, ψ) = ‖ϕ − ψ‖, and

    〈ϕ_k, ψ_k〉 := ∑_{j=1}^{n} ϕ_{k,j} ψ_{k,j}    (3)

for the scalar product. The scalar product can be written equivalently in engineering-style or mathematical notation:

    〈ϕ, ψ〉 = ϕ^T ψ = ϕ · ψ = ϕ^T ∘ ψ.    (4)
Setup II: Observation Operator

We assume that the observations are given in some observation space Y. The observation operator H : X → Y maps states into observations,

    H : ϕ ↦ f = H(ϕ),

where H = H_k might depend on time via the time index k. H might be nonlinear.
Notation and Setup III

In discretized form the observation f_k at time t_k is a vector in R^m, i.e. we have

    f_k = (f_{k,1}, f_{k,2}, ..., f_{k,m})^T.

Let the observation space Y be a Hilbert space with scalar product 〈·, ·〉.

Measurement Assumption: We assume that our measurements take place exactly at the times t_k.
Basic Approach

Let H be the operator mapping the state ϕ onto the measurements f. Then we need to find ϕ by solving the equation

    Hϕ = f.    (5)

Simple Case. Let us assume that H is linear and invertible, and that we are at some point in time t_0 where the above equation needs to be solved. Assume that an initial guess ϕ^(b) is given.

With the initial guess ϕ^(b), we transform equation (5) into

    H(ϕ − ϕ^(b)) = f − Hϕ^(b)    (6)

and update

    ϕ = ϕ^(b) + H^{−1}(f − Hϕ^(b)).    (7)
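For an invertible H, update formula (7) recovers the exact state in one step. A minimal sketch; the randomly generated well-conditioned H and exact (noise-free) data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
H = 3.0 * np.eye(n) + rng.standard_normal((n, n))  # invertible, well-conditioned
phi_true = rng.standard_normal(n)
f = H @ phi_true                   # exact data

phi_b = np.zeros(n)                # initial guess phi^(b)
# Update (7): phi = phi^(b) + H^{-1}(f - H phi^(b)), via a linear solve
phi = phi_b + np.linalg.solve(H, f - H @ phi_b)
```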
Recall Regularization 1: Add regularizing small Identity Matrix

Consider an equation

    Hϕ = f    (8)

where H^{−1} is unstable or unbounded. The normal equation

    Hϕ = f  ⇒  H*Hϕ = H*f

is regularized by

    (αI + H*H)ϕ = H*f.    (9)

Tikhonov Regularization: Replace H^{−1} by the stable operator

    R_α := (αI + H*H)^{−1} H*    (10)

with regularization parameter α > 0.
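A sketch of the stabilized inverse (10). The nearly singular diagonal H is an illustrative assumption chosen so that plain inversion amplifies the data enormously while R_α stays bounded.

```python
import numpy as np

def tikhonov_inverse(H, alpha):
    """R_alpha = (alpha I + H^T H)^{-1} H^T, the stable replacement for H^{-1}."""
    n = H.shape[1]
    return np.linalg.solve(alpha * np.eye(n) + H.T @ H, H.T)

# Nearly singular H: plain inversion amplifies data errors enormously.
H = np.diag([1.0, 0.5, 1e-8])
f = np.array([1.0, 1.0, 1.0])

phi_alpha = tikhonov_inverse(H, 1e-4) @ f   # stays bounded for alpha > 0
```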
Recall Regularization 2: Least Squares

Tikhonov regularization is equivalent to the minimization of

    J(ϕ) := α‖ϕ‖² + ‖Hϕ − f‖².    (11)

The normal equations are obtained from the first-order optimality condition

    ∇_ϕ J = dJ(ϕ)/dϕ != 0.    (12)

Differentiation leads to

    0 = 2αϕ + 2H*(Hϕ − f)
    ⇒ 0 = (αI + H*H)ϕ − H*f,    (13)

which is our well-known Tikhonov equation

    (αI + H*H)ϕ = H*f.
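The equivalence of (11) and (13) can be checked numerically: the solution of the Tikhonov equation minimizes J, so any small perturbation must increase the functional. Dimensions, data, and α below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
H = rng.standard_normal((5, 3))
f = rng.standard_normal(5)
alpha = 0.1

# Solve the Tikhonov (normal) equation (alpha I + H^T H) phi = H^T f
phi = np.linalg.solve(alpha * np.eye(3) + H.T @ H, H.T @ f)

def J(x):
    """Functional (11): alpha ||x||^2 + ||Hx - f||^2."""
    return alpha * x @ x + np.sum((H @ x - f) ** 2)
```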
Recall Regularization 3: Spectral Methods

A singular system of an operator H : X → Y, written as

    (µ_n, ψ_n, g_n),    (14)

is a set of singular values µ_n and pairs of orthonormal basis functions ψ_n, g_n such that

    Hψ_n = µ_n g_n,
    H*g_n = µ_n ψ_n.    (15)

We have

    ϕ = ∑_{n=1}^{∞} γ_n ψ_n  =⇒  Hϕ = ∑_{n=1}^{∞} µ_n γ_n g_n.    (16)

In the spectral basis the operator H is a multiplication operator!
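In finite dimensions the singular system (14)-(15) is delivered by the SVD. A sketch verifying (16) with an illustrative random matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
H = rng.standard_normal((4, 4))

# SVD: columns of G are g_n, columns of Psi are psi_n, mu the singular values
G, mu, Psi_T = np.linalg.svd(H)
Psi = Psi_T.T

phi = rng.standard_normal(4)
gamma = Psi.T @ phi            # spectral coefficients gamma_n = <phi, psi_n>

# (16): H phi = sum_n mu_n gamma_n g_n -- multiplication in the spectral basis
H_phi_spectral = G @ (mu * gamma)
```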
Recall Regularization 3: Spectral Methods

In spectral terms we obtain

    H*Hψ_n = µ_n² ψ_n,
    αIψ_n = αψ_n,

thus

    (αI + H*H)ψ_n = (α + µ_n²)ψ_n,  n ∈ N.    (17)

Consider

    f = ∑_{n=1}^{∞} β_n g_n ∈ Y.    (18)

Tikhonov regularization (αI + H*H)ϕ = H*f is equivalent to the spectral damping scheme

    γ_{α,n} = µ_n/(α + µ_n²) · β_n,  n ∈ N.    (19)
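The damping scheme (19) can be checked against a direct solve of the Tikhonov equation; matrix, data, and α are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
H = rng.standard_normal((4, 4))
f = rng.standard_normal(4)
alpha = 1e-2

G, mu, Psi_T = np.linalg.svd(H)
beta = G.T @ f                              # f = sum_n beta_n g_n
gamma_alpha = mu / (alpha + mu**2) * beta   # spectral damping (19)
phi_spectral = Psi_T.T @ gamma_alpha

# direct Tikhonov solve of (alpha I + H^T H) phi = H^T f
phi_direct = np.linalg.solve(alpha * np.eye(4) + H.T @ H, H.T @ f)
```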
Regularization 3: Spectral Methods

True Inverse:

    γ_n = (1/µ_n) β_n.    (20)

This inversion is unstable if µ_n → 0, n → ∞!

Tikhonov Inverse (stable if α > 0):

    γ_{α,n} = µ_n/(α + µ_n²) · β_n,  n ∈ N.    (21)

Tikhonov shifts the eigenvalues of H*H by α.
Variational Data Assimilation

We now use the regularized inversion to solve the data equation Hϕ = f in increment form in each step, i.e.

    H(ϕ − ϕ^(b)) = f − Hϕ^(b).    (22)

Variational Data Assimilation. Let H_k be the operator mapping the state ϕ_k onto the measurements f_k at time t_k, k ∈ N, and let ϕ_0^(b) be some given initial state. Further, we are given a model operator M_k mapping ϕ_k at time t_k onto ϕ_{k+1} at time t_{k+1}. Then, a variational data assimilation scheme is defined by calculating

    ϕ_k^(a) = ϕ_k^(b) + (αI + H_k* H_k)^{−1} H_k* (f_k − H_k ϕ_k^(b)),    (23)
    ϕ_{k+1}^(b) = M_k ϕ_k^(a)  for k = 1, 2, 3, ....    (24)
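The cycle (23)-(24) as a sketch for linear M and H; the constant-identity dynamics, direct observations, and repeated identical data in the usage example are illustrative assumptions (they anticipate the constant-dynamics system studied later).

```python
import numpy as np

def cycled_assimilation(phi_b0, M, H, observations, alpha):
    """Analysis step (23) followed by forecast step (24) for each f_k."""
    n = H.shape[1]
    R_alpha = np.linalg.solve(alpha * np.eye(n) + H.T @ H, H.T)
    phi_b, analyses = phi_b0, []
    for f_k in observations:
        phi_a = phi_b + R_alpha @ (f_k - H @ phi_b)   # analysis (23)
        analyses.append(phi_a)
        phi_b = M @ phi_a                             # forecast (24)
    return analyses

# constant dynamics, direct observations, repeated identical data
M, H = np.eye(2), np.eye(2)
f = np.array([1.0, 2.0])
out = cycled_assimilation(np.zeros(2), M, H, [f] * 20, alpha=0.1)
```

With M = I and H = I the analyses contract toward the (constant) data f at rate q = α/(1 + α) per cycle.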
Variational Data Assimilation

We note that by

    (αI + H*H)H* = H*(αI + HH*)

we have

    (αI + H*H)^{−1} H* = H*(αI + HH*)^{−1},

where for the term on the right the inversion takes place in the m-dimensional space Y, which for n ≫ m can be much more efficient than the inversion in X = R^n on the left.

Thus, the variational data assimilation scheme can be written as

    ϕ_k^(a) = ϕ_k^(b) + H_k*(αI + H_k H_k*)^{−1} (f_k − H_k ϕ_k^(b)),    (25)
    ϕ_{k+1}^(b) = M_k ϕ_k^(a)  for k = 1, 2, 3, ....    (26)
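The two forms of the increment in (23) and (25) give identical results, but for n ≫ m the second needs only an m×m solve. A sketch with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 200, 5                        # large state, few observations
H = rng.standard_normal((m, n))
d = rng.standard_normal(m)           # innovation f_k - H phi_k^(b)
alpha = 0.1

# state-space form: an n x n solve
incr_state = np.linalg.solve(alpha * np.eye(n) + H.T @ H, H.T @ d)
# observation-space form: only an m x m solve
incr_obs = H.T @ np.linalg.solve(alpha * np.eye(m) + H @ H.T, d)
```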
The need for spatial correlations

Working with standard L²-norms in the state space X = R^n leads to crucial difficulties, reflected by the following example.

Example. Assume that X = R^n, Y = R^1 and H = (1, 0, ..., 0). This means that we measure the first variable only. The variational scheme calculates the increment

    δϕ_k = ϕ_k^(a) − ϕ_k^(b) = H*(αI + HH*)^{−1} (f_k − Hϕ_k^(b)).

This means that only the first component is updated; the other components remain unchanged. But it is highly unusual that the first variable will not influence other variables at the same or neighboring points.

• Standard L²-norms do not take into account correlations between different variables and quantities in a spatial neighborhood.
• Standard L²-norms lead to highly unphysical and unrealistic increments.
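The example in code: with H = (1, 0, ..., 0) the unweighted increment touches only the observed component. The values of n, α, and f are illustrative assumptions.

```python
import numpy as np

n, alpha = 5, 0.1
H = np.zeros((1, n)); H[0, 0] = 1.0      # observe the first variable only
phi_b = np.zeros(n)
f = np.array([1.0])

# increment H^T (alpha I + H H^T)^{-1} (f - H phi_b)
incr = H.T @ np.linalg.solve(alpha * np.eye(1) + H @ H.T, f - H @ phi_b)
# only incr[0] is nonzero; all neighbouring components stay exactly zero
```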
Weighted Norms

Given an invertible (symmetric positive definite) matrix B ∈ R^{n×n}, define a weighted scalar product with weight B^{−1} by

    〈ϕ, ψ〉 := 〈ϕ, B^{−1}ψ〉_{L²}

for ϕ, ψ ∈ X = R^n. A weighted norm is then obtained by

    ‖ϕ‖² := 〈ϕ, ϕ〉.

We also use the notation ‖·‖_{B^{−1}} when we want to indicate the particular weight B^{−1} which is used.

Analogously, we employ weighted norms in the observation space Y = R^m with some invertible (symmetric positive definite) matrix R.
Transformation Formulas for the Adjoints

With the weighted scalar products

    〈ϕ, ψ〉 := 〈ϕ, B^{−1}ψ〉_{L²},   〈f, g〉 := 〈f, R^{−1}g〉_{L²},    (27)

the adjoint with respect to the weighted scalar product is denoted by H*. Then

    〈f, Hϕ〉 = 〈f, R^{−1}Hϕ〉_{L²} = 〈R^{−1}f, Hϕ〉_{L²} = 〈H′R^{−1}f, ϕ〉_{L²}
             = 〈H′R^{−1}f, BB^{−1}ϕ〉_{L²} = 〈BH′R^{−1}f, B^{−1}ϕ〉_{L²} = 〈BH′R^{−1}f, ϕ〉 = 〈H*f, ϕ〉.

Transformation Formula:

    H* = BH′R^{−1}    (28)

    (I + H*H)^{−1} H* = H*(I + HH*)^{−1}
                      = BH′R^{−1}(I + HBH′R^{−1})^{−1} = BH′(R + HBH′)^{−1}.
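The transformation formula (28) and the identity below it can be verified numerically. The correlation-type B and diagonal R are illustrative SPD weight matrices, not part of the lecture.

```python
import numpy as np

rng = np.random.default_rng(5)
n, m = 6, 3
H = rng.standard_normal((m, n))
# SPD weights (illustrative): exponential correlation B, diagonal R
B = np.fromfunction(lambda i, j: 0.8 ** np.abs(i - j), (n, n))
R = 0.2 * np.eye(m)

H_star = B @ H.T @ np.linalg.inv(R)   # transformation formula (28): H* = B H' R^{-1}

# adjoint property: <f, H phi>_{R^{-1}} = <H* f, phi>_{B^{-1}}
phi = rng.standard_normal(n)
f = rng.standard_normal(m)
lhs = f @ np.linalg.solve(R, H @ phi)
rhs = (H_star @ f) @ np.linalg.solve(B, phi)

# (I + H*H)^{-1} H* = B H' (R + H B H')^{-1}
left = np.linalg.solve(np.eye(n) + H_star @ H, H_star)
right = B @ H.T @ np.linalg.inv(R + H @ B @ H.T)
```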
The variational assimilation of 3dVar

The minimization functional with weighted norms is defined by

    J(ϕ) := ‖ϕ − ϕ^(b)‖²_{B^{−1}} + ‖Hϕ − f‖²_{R^{−1}}.    (29)

Three-dimensional Variational Data Assimilation (3dVar). The variational update formula of 3dVar is

    ϕ_k^(a) = ϕ_k^(b) + (B^{−1} + H′R^{−1}H)^{−1} H′R^{−1} (f_k − Hϕ_k^(b))
            = ϕ_k^(b) + BH′(R + HBH′)^{−1} (f_k − Hϕ_k^(b)),    (30)
    ϕ_{k+1}^(b) = M_k ϕ_k^(a),  k = 1, 2, 3, ....    (31)
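Update formula (30) as a function; a minimal sketch, with illustrative B and R (the equivalence of the two lines of (30) is what makes the function below work).

```python
import numpy as np

def threedvar_update(phi_b, f, H, B, R):
    """3dVar analysis (30): phi_a = phi_b + B H' (R + H B H')^{-1} (f - H phi_b)."""
    K = B @ H.T @ np.linalg.inv(R + H @ B @ H.T)   # gain matrix
    return phi_b + K @ (f - H @ phi_b)

rng = np.random.default_rng(6)
n, m = 4, 2
H = rng.standard_normal((m, n))
B = np.eye(n) + 0.3 * np.ones((n, n))   # SPD background weight with coupling
R = 0.5 * np.eye(m)
phi_b = rng.standard_normal(n)
f = rng.standard_normal(m)

phi_a = threedvar_update(phi_b, f, H, B, R)
```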
Nonlinear Observation Operators

In the case of a nonlinear observation operator ℋ we employ the linearization

    ℋ(ϕ^(b) + δϕ) = ℋ(ϕ^(b)) + Hδϕ + O(‖δϕ‖²),

with H being the linearization of ℋ at ϕ^(b). Then, we need to minimize

    J(δϕ) = ‖δϕ‖² + ‖(f − ℋ(ϕ^(b))) − Hδϕ‖².

Nonlinear three-dimensional Variational Data Assimilation (3dVar). The variational update formula of 3dVar is

    ϕ_k^(a) = ϕ_k^(b) + (B^{−1} + H′R^{−1}H)^{−1} H′R^{−1} (f_k − ℋ(ϕ_k^(b)))
            = ϕ_k^(b) + BH′(R + HBH′)^{−1} (f_k − ℋ(ϕ_k^(b))),    (32)
    ϕ_{k+1}^(b) = M_k ϕ_k^(a),  k = 1, 2, 3, ....    (33)
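One nonlinear 3dVar step (32) as a sketch. The forward-difference Jacobian and the quadratic toy operator are assumptions of this illustration; operational systems use hand-coded tangent-linear operators rather than finite differences.

```python
import numpy as np

def nonlinear_3dvar_step(phi_b, f, Hcal, B, R, eps=1e-6):
    """One step of (32): linearize Hcal at phi_b, then apply the linear update.
    The forward-difference Jacobian is an assumption of this sketch."""
    n = phi_b.size
    Hx = Hcal(phi_b)
    H = np.empty((Hx.size, n))
    for j in range(n):                 # column-wise finite-difference Jacobian
        e = np.zeros(n); e[j] = eps
        H[:, j] = (Hcal(phi_b + e) - Hx) / eps
    K = B @ H.T @ np.linalg.inv(R + H @ B @ H.T)
    return phi_b + K @ (f - Hx)

# quadratic toy observation operator (illustrative)
Hcal = lambda x: np.array([x[0] ** 2 + x[1], x[1]])
B, R = np.eye(2), 0.01 * np.eye(2)
phi_b = np.array([1.0, 0.5])
f = np.array([1.6, 0.5])
phi_a = nonlinear_3dvar_step(phi_b, f, Hcal, B, R)
```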
Regularization 4: Bayesian Methods

Conditional probability:

    P(A|B) := P(A ∩ B) / P(B)    (34)

for sets A, B in a probability space. Conditional probability density:

    p(x|y) := p(x, y) / p(y),   (x, y) ∈ X × Y.    (35)

From

    p(x, y) = p(x|y) · p(y) = p(y|x) · p(x)

we obtain Bayes' formula

    p(x|y) = p(x) p(y|x) / p(y),   x ∈ X, y ∈ Y.    (36)

Here p(y) can be considered as a normalization constant!
Regularization 4: Bayesian Methods

Bayes' formula for a measurement y and an unknown state x:

    p(x|y)  =  (1/p(y))  ·  p(x)  ·  p(y|x),

where p(x|y) is the posterior probability, 1/p(y) the normalization, p(x) the prior probability, and p(y|x) the measurement probability.
Regularization 4: Bayesian Methods

Gaussian case:

    p(x) = e^{−½ x^T B^{−1} x},   x ∈ R^n,

with prior covariance matrix B, and

    p(y|x) = e^{−½ (y − Hx)^T R^{−1} (y − Hx)},   y ∈ Y,

with measurement covariance matrix R, leads to the posterior density

    p(x|y) = const · e^{−½ (x^T B^{−1} x + (y − Hx)^T R^{−1} (y − Hx))}.
Regularization 4: Bayesian Methods

Maximum Likelihood Estimator (ML)

ML: "Find the value x ∈ X for which p(x|y) is maximal."

Maximizing

    e^{−½ (x^T B^{−1} x + (y − Hx)^T R^{−1} (y − Hx))}

is equivalent to minimizing

    J(x) = x^T B^{−1} x + (y − Hx)^T R^{−1} (y − Hx),

which for B = α^{−1}I and R = I is given by

    J(x) = α‖x‖² + ‖Hx − y‖².

The minimum is calculated by the Tikhonov operator.
Bayes Data Assimilation: Uncertainty Quantification

• Bayes data assimilation employs distributions of the background state, of the errors, and of the analysis state.
• Then, the full analysis distribution needs to be propagated to the next assimilation time.
• If we only propagate the mean of the analysis, we are in the range of classical variational data assimilation methodology.
Error Types in DA Algorithms

    ϕ_{k+1}^(b) := M ϕ_k^(a),  k = 0, 1, 2, ....    (37)

Update formula:

    ϕ_{k+1}^(a) = ϕ_{k+1}^(b) + R_α (f_{k+1} − Hϕ_{k+1}^(b))    (38)

with

    R_α = (αI + H*H)^{−1} H*.    (39)

• Error in the measurement data f_{k+1}
• Error in the observation operator H
• Error in the model dynamics M
• Error by the reconstruction operator R_α ≠ H^{−1}
• Cumulated errors from previous iterations/cycling
Full deterministic error dynamics I

    ϕ_{k+1}^(a) = ϕ_{k+1}^(b) + R_α (f_{k+1} − Hϕ_{k+1}^(b))

With the analysis error e_{k+1} := ϕ_{k+1}^(a) − ϕ_{k+1}^(true) we obtain

    e_{k+1} = ϕ_{k+1}^(b) − ϕ_{k+1}^(true) + R_α (f_{k+1} − f_{k+1}^(true)) + R_α (f_{k+1}^(true) − Hϕ_{k+1}^(b))

            = Mϕ_k^(a) − M^(true)ϕ_k^(true) + R_α (f_{k+1}^(δ)) + R_α (H^(true)ϕ_{k+1}^(true) − Hϕ_{k+1}^(b))

            = M(ϕ_k^(a) − ϕ_k^(true)) + (M − M^(true))ϕ_k^(true) + R_α (f_{k+1}^(δ))
              + R_α ((H^(true) − H)ϕ_{k+1}^(true) + H(ϕ_{k+1}^(true) − ϕ_{k+1}^(b))).
Full deterministic error dynamics II

Update formula:

    ϕ_{k+1}^(a) = ϕ_{k+1}^(b) + R_α (f_{k+1} − Hϕ_{k+1}^(b))

Error contributions:

    e_{k+1} = (I − R_α H) { M e_k + (M − M^(true)) ϕ_k^(true) }    [reconstruction error applied to the propagation of the previous error]
              + R_α (f_{k+1}^(δ))                                  [data error influence]
              + R_α ((H^(true) − H) ϕ_{k+1}^(true))                [observation operator error]
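The decomposition can be checked in a minimal synthetic cycle. The linear model, the small model perturbation, and the choice H^(true) = H (so the observation-operator term vanishes) are illustrative assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 3
M = 0.9 * np.eye(n)                               # model used in the cycle
M_true = M + 0.01 * rng.standard_normal((n, n))   # slightly different true dynamics
H = np.eye(n)                                     # here H^(true) = H
alpha = 0.05
R_alpha = np.linalg.solve(alpha * np.eye(n) + H.T @ H, H.T)
I = np.eye(n)

phi_true = rng.standard_normal(n)
phi_a = phi_true + 0.1 * rng.standard_normal(n)   # initial analysis with error

for k in range(5):
    e_k = phi_a - phi_true                        # current analysis error
    phi_true_next = M_true @ phi_true
    delta = 0.01 * rng.standard_normal(n)         # data error f^(delta)
    f = H @ phi_true_next + delta
    phi_b = M @ phi_a                             # forecast
    phi_a = phi_b + R_alpha @ (f - H @ phi_b)     # analysis
    # predicted error from the decomposition (obs-operator term vanishes here)
    e_pred = (I - R_alpha @ H) @ (M @ e_k + (M - M_true) @ phi_true) + R_alpha @ delta
    phi_true = phi_true_next
```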
A system with constant dynamics

As a simple model system for study we use constant dynamics M = Identity, i.e.

    ϕ_{k+1}^(b) = ϕ_k^(a),  k = 1, 2, 3, ...,    (40)

for 3dVar. Also, we employ identical measurements f_k ≡ f, k ∈ N.

Then, 3dVar is given by the iteration

    ϕ_k = ϕ_{k−1} + (αI + H′H)^{−1} H′ (f − Hϕ_{k−1}),  k = 1, 2, 3, ....    (41)

(This coincides with work of Engl on 'iterated Tikhonov regularization'!)

For the spectral coefficients γ_{n,k} of ϕ_k this leads to the iteration

    γ_{n,k} = γ_{n,k−1} + µ_n/(α + µ_n²) · (f_n − µ_n γ_{n,k−1}).    (42)
Spectral Formula I

We employ

    f = Hϕ^(true) + δ,   f_n = µ_n γ_n^(true) + δ_n,    (43)

and obtain

    γ_{n,k} = γ_{n,k−1} + µ_n²/(α + µ_n²) · (γ_n^(true) − γ_{n,k−1}) + µ_n/(α + µ_n²) · δ_n.

Theorem (Spectral Formula I). The 3dVar cycling for a constant dynamics with identical measurements f = Hϕ^(true) + δ leads to the spectral update formula

    γ_{n,k} = (1 − q_n) γ_n^(true) + q_n γ_{n,k−1} + ((1 − q_n)/µ_n) δ_n    (44)

using

    q_n = α/(α + µ_n²) = 1 − µ_n²/(α + µ_n²).    (45)
Spectral Formula II

Theorem (Spectral Formula II). The 3dVar cycling for a constant dynamics with identical measurements f = Hϕ^(true) + δ can be carried out explicitly. The development of its spectral coefficients is given by

    γ_{n,k} = (1 − q_n^k) γ_n^(true) + q_n^k γ_{n,0} + ((1 − q_n^k)/µ_n) δ_n    (46)

using

    q_n = α/(α + µ_n²) = 1 − µ_n²/(α + µ_n²).    (47)

Proof. Induction over k. □
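The closed form (46) can be checked against the iteration (42); singular values, true coefficients, and data errors below are illustrative assumptions.

```python
import numpy as np

alpha = 0.1
mu = np.array([1.0, 0.3, 0.05])           # singular values mu_n (illustrative)
gamma_true = np.array([1.0, -2.0, 0.5])
delta = np.array([0.01, 0.01, 0.01])      # spectral data errors delta_n
f_spec = mu * gamma_true + delta          # (43): f_n = mu_n gamma_n^(true) + delta_n
q = alpha / (alpha + mu**2)               # (47)

K = 50
gamma = np.zeros_like(mu)                 # gamma_{n,0} = 0
for _ in range(K):
    gamma = gamma + mu / (alpha + mu**2) * (f_spec - mu * gamma)   # iteration (42)

# closed form (46) with gamma_{n,0} = 0
gamma_closed = (1 - q**K) * gamma_true + (1 - q**K) / mu * delta
```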
Convergence for f ∈ R(H)

Theorem (Convergence for f ∈ R(H)). Cycled 3dVar for a constant dynamics and identical measurements f = f^(true) + δ ∈ R(H) tends to the true solution ϕ^(true) + σ with Hσ = δ for k → ∞.

Proof. We study

    γ_{n,k} = (1 − q_n^k) γ_n^(true) + q_n^k γ_{n,0} + ((1 − q_n^k)/µ_n) δ_n    (48)

for k → ∞. Since 0 < q_n < 1, we have

    q_n^k → 0,   (1 − q_n^k) → 1,   k → ∞.    (49)

Since δ = Hσ, the element σ with spectral coefficients δ_n/µ_n is in X, and cycled 3dVar converges towards ϕ^(true) + σ. □
Divergence for f ∉ R(H)

Theorem (Divergence for f ∉ R(H)). For a constant dynamics and identical measurements f = f^(true) + δ ∉ R(H), cycled 3dVar diverges for k → ∞.

Proof. We study

    γ_{n,k} = (1 − q_n^k) γ_n^(true) + q_n^k γ_{n,0} + ((1 − q_n^k)/µ_n) δ_n    (50)

for k → ∞. Let σ_k ∈ X denote the element with spectral coefficients

    σ_{n,k} = ((1 − q_n^k)/µ_n) δ_n,   k, n ∈ N,

which is well defined since for every fixed k ∈ N

    |(1 − q_n^k)/µ_n| = |((α + µ_n²)^k − α^k) / ((α + µ_n²)^k µ_n)|    (51)

is bounded uniformly for n ∈ N.
Divergence for f ∉ R(H)

Since δ ∉ R(H) we know that

    S_L := ∑_{n=1}^{L} |δ_n/µ_n|² → ∞,   L → ∞.    (52)

Given C > 0 we can choose L such that S_L > 2C. Then

    ‖σ_k‖² ≥ ∑_{n=1}^{L} |(1 − q_n^k) δ_n/µ_n|² > C    (53)

for k ∈ N sufficiently large, which proves

    ‖σ_k‖ → ∞,   k → ∞,    (54)

and the proof is complete. □
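The divergence mechanism can be illustrated in a truncated setting. The choices µ_n = 1/n² and δ_n = 1/n are illustrative assumptions making δ_n/µ_n = n not square-summable, i.e. δ ∉ R(H); within the truncation, ‖σ_k‖ then grows steadily with k.

```python
import numpy as np

# Truncated illustration of the divergence mechanism (illustrative choices)
N = 2000
n = np.arange(1, N + 1)
mu = 1.0 / n**2          # singular values decaying to zero
delta = 1.0 / n          # delta_n / mu_n = n, so sum |delta_n/mu_n|^2 diverges
alpha = 0.1
q = alpha / (alpha + mu**2)

# ||sigma_k|| for increasing cycle counts k
norms = [np.linalg.norm((1 - q**k) / mu * delta) for k in (1, 10, 100, 1000)]
```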
Numerical Example: Dynamic Magnetic Tomography
Literature

• Marx and Potthast: On Instabilities for Data Assimilation Algorithms. GEM - International Journal on Geomathematics, November 2012, Volume 3, Issue 2, pp. 253-278.
• Moodey, Lawless, Potthast and van Leeuwen: Nonlinear error dynamics for cycled data assimilation methods. Inverse Problems 29 (2013) 025002.
• Freitag and Potthast: Synergy of Inverse Problems and Data Assimilation Techniques. In press, in "Large Scale Inverse Problems - Computational Methods and Applications in the Earth Sciences", edited by Cullen, Mike / Freitag, Melina A. / Kindermann, Stefan / Scheichl, Robert. http://www.degruyter.com/view/product/182025
• Marx and Potthast: Data Assimilation Algorithms for Dynamic Magnetic Tomography and Parameter Reconstruction. Preprint.