Multilevel stochastic collocations with dimensionality reduction
Ionut Farcas
TUM, Chair of Scientific Computing in Computer Science (I5)
27.01.2017
Outline

1 Motivation
2 Theoretical background
   - Uncertainty modeling
   - Sparse grids
   - Generalized polynomial chaos and sparse grids
   - Multilevel collocation methods
   - Stochastic dimensionality reduction
3 Test scenario
4 Discussion
Motivation

- problem: quantification of uncertainty in complex phenomena
  - multiphysics (e.g. fluid-structure interaction)
  - plasma physics
  - ...
- main challenge: "curse of dimensionality" → "curse of resources"
- solution 1.1: delay the "curse of dimensionality" → sparse grids
- solution 1.2: try reducing the dimensionality → sensitivity analysis
Uncertainty modeling

- probabilistic modeling: probability space $(\Omega, \mathcal{F}, P)$
- $\theta = (\theta_1, \theta_2, \dots, \theta_d)$ vector of continuous i.i.d. random variables
- $\mathrm{supp}(\theta_i) = \Gamma_i$, $\mathrm{supp}(\theta) = \Gamma_1 \times \Gamma_2 \times \dots \times \Gamma_d = \Gamma$
Generalized polynomial chaos approximation

- idea: represent an arbitrary random variable (of interest) as a function of another random variable with a given distribution
- how: use a series expansion of orthogonal polynomials
- let $p = (p_1, \dots, p_d) \in \mathbb{N}^d$ with $\sum_{i=1}^{d} p_i < P$
- consider $d$-variate orthogonal polynomials
  $$\Phi_p(\theta) := \Phi_{p_1}(\theta_1) \cdots \Phi_{p_d}(\theta_d)$$
- for simplicity, drop the multi-index subscript $p$ and use instead a scalar index $n = 1, \dots, N$, with $N = \binom{d+P}{d}$
- orthogonality means
  $$\mathbb{E}[\Phi_n(\theta)\Phi_m(\theta)] = \int_\Gamma \Phi_n(\theta)\Phi_m(\theta)\rho(\theta)\,\mathrm{d}\theta = \gamma_n \delta_{nm}, \quad \gamma_n \in \mathbb{R}$$
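For uniform inputs the orthogonal family is the Legendre one, and the orthogonality relation can be checked numerically. A minimal sketch using NumPy's Legendre module and Gauss-Legendre quadrature (the quadrature order 20 is an arbitrary choice):

```python
import numpy as np
from numpy.polynomial import legendre

# theta ~ U(-1, 1) with density rho = 1/2; quadrature nodes/weights on [-1, 1]
nodes, weights = legendre.leggauss(20)

def phi(n, x):
    # n-th (non-normalized) Legendre polynomial evaluated at x
    coef = np.zeros(n + 1)
    coef[n] = 1.0
    return legendre.legval(x, coef)

def inner(n, m):
    # E[Phi_n(theta) Phi_m(theta)]; should equal gamma_n * delta_nm,
    # with gamma_n = 1/(2n+1) for this density
    return 0.5 * np.sum(weights * phi(n, nodes) * phi(m, nodes))

print(inner(2, 3))  # ~ 0 (orthogonality)
print(inner(3, 3))  # ~ 1/7 = gamma_3
```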
Generalized polynomial chaos

- let $x$ denote the deterministic inputs, $\theta$ the stochastic inputs, $f$ the model
- the gPC approximation of order $N$ reads
  $$f(x,\theta) \approx f_N(x,\theta) = \sum_{n=0}^{N-1} c_n(x)\Phi_n(\theta)$$
- gPC coefficients via projection:
  $$c_n(x) = \int_\Gamma f(x,\theta)\Phi_n(\theta)\rho(\theta)\,\mathrm{d}\theta = \mathbb{E}[f(x,\theta)\Phi_n(\theta)]$$
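As a sketch of the projection: for the toy model $f(\theta) = \theta^3$ with $\theta \sim \mathcal{U}(-1,1)$, the coefficients follow from Gauss-Legendre quadrature. Since the Legendre basis used here is non-normalized, the projection is divided by $\gamma_n = \mathbb{E}[\Phi_n^2]$ (the formula above absorbs this factor when the basis is orthonormal):

```python
import numpy as np
from numpy.polynomial import legendre

nodes, weights = legendre.leggauss(10)  # exact for the polynomials below

def phi(n, x):
    coef = np.zeros(n + 1)
    coef[n] = 1.0
    return legendre.legval(x, coef)

f = lambda theta: theta**3
coeffs = []
for n in range(4):
    gamma_n = 1.0 / (2 * n + 1)  # E[Phi_n^2] for rho = 1/2 on [-1, 1]
    c_n = 0.5 * np.sum(weights * f(nodes) * phi(n, nodes)) / gamma_n
    coeffs.append(c_n)

print(np.round(coeffs, 6))  # theta^3 = 0.6*P_1(theta) + 0.4*P_3(theta)
```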
Post-processing

- expectation:
  $$\mathbb{E}[f(x,\theta)] = c_0(x)$$
- variance:
  $$\mathrm{Var}[f(x,\theta)] = \sum_{n=1}^{N-1} c_n^2(x)$$
- total Sobol' indices:
  $$S_i^T(x) = \frac{\mathrm{Var}_p[f(x,\theta)]}{\mathrm{Var}[f(x,\theta)]} = \frac{\sum_{k \in A_p} c_k^2(x)}{\mathrm{Var}[f(x,\theta)]}, \quad A_p = \{p \in \mathbb{N}^d : p_i \neq 0\}$$
- $\sum_{i=1}^{d} S_i^T(x) \geq 1$, with equality only in the absence of interactions
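The moments and total Sobol' indices come directly from the coefficients. A sketch for a two-variable expansion in an orthonormal basis; the multi-index keys and coefficient values are hypothetical, chosen only for illustration:

```python
# gPC coefficients indexed by multi-index p = (p1, p2); values are made up
coeffs = {
    (0, 0): 2.0,   # constant term -> mean
    (1, 0): 0.5,
    (0, 1): 0.3,
    (1, 1): 0.1,   # interaction term
}

mean = coeffs[(0, 0)]
variance = sum(c**2 for p, c in coeffs.items() if p != (0, 0))

def total_sobol(i):
    # S_i^T: variance share of every term in which input i appears (p_i != 0)
    return sum(c**2 for p, c in coeffs.items() if p[i] != 0) / variance

print(mean, variance)                   # mean and total variance
print(total_sobol(0) + total_sobol(1))  # > 1: the interaction counts twice
```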
Sparse grid idea

- problem: discretize a tensor product space efficiently
- standard approach: full grid → $O(N^d)$ dof, if $N$ dof in one direction → "curse of dimensionality"
- idea: delay the curse of dimensionality
- use sparse grids: weaken the assumed coupling between the input dimensions
- $O(N^d) \to O(N(\log N)^{d-1})$ dof
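The gain is easy to quantify by counting degrees of freedom. A sketch comparing the exact point counts of a full grid and a standard sparse grid, assuming the boundaryless hierarchical construction in which increment level $l$ contributes $\prod_k 2^{l_k - 1}$ points:

```python
from itertools import product

def full_grid_dof(L, d):
    # N = 2^L - 1 points per direction -> (2^L - 1)^d in total
    return (2**L - 1)**d

def sparse_grid_dof(L, d):
    # sum over levels l with |l|_1 <= L + d - 1, l_k >= 1;
    # increment space W_l contributes prod_k 2^(l_k - 1) points
    total = 0
    for l in product(range(1, L + 1), repeat=d):
        if sum(l) <= L + d - 1:
            total += 2**(sum(l) - d)
    return total

for d in (1, 2, 3):
    print(d, full_grid_dof(5, d), sparse_grid_dof(5, d))
```

In one dimension both counts coincide; the gap widens rapidly with $d$.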
Hierarchical sparse grids ingredients

- grid level $l = (l_1, \dots, l_d) \in \mathbb{N}^d$
- spatial position $i = (i_1, \dots, i_d) \in \mathbb{N}^d$
- generic grid point $u_{l,i} = (u_{l_1,i_1}, \dots, u_{l_d,i_d})$
- equidistant grid with mesh size $h_{l_i} = 2^{-l_i}$, $i = 1, \dots, d$
- basis functions $\varphi_{l,i}$ with support $[u_{l,i} - h_l, u_{l,i} + h_l]$:
  $$\varphi_{l,i}(u) = \varphi\!\left(\frac{u - i h_l}{h_l}\right)$$
- in $d$ dimensions,
  $$\varphi_{l,i}(u) = \prod_{j=1}^{d} \varphi_{l_j,i_j}(u_j)$$
Hierarchical sparse grids preliminaries

- $H_l = \mathrm{span}\{\varphi_{l,i} : 1 \leq i \leq 2^l - 1\}$ — nodal set
- $W_l = \mathrm{span}\{\varphi_{l,i} : i \in I_l\}$ — hierarchical increment set
- $I_l = \{i \in \mathbb{N}^d : 1 \leq i_k \leq 2^{l_k} - 1,\ i_k \ \text{odd},\ k = 1, \dots, d\}$
- $H_l = \bigoplus_{k \leq l} W_k$
Hierarchical sparse grids preliminaries

- given the hierarchical increment spaces $W_l$ and a level $L$, we can create further spaces $V_L$:
  $$V_L = \bigoplus_{k \in \mathcal{J}} W_k, \quad \text{for some multi-index set } \mathcal{J}$$
- if $\mathcal{J} = \{l \in \mathbb{N}^d : |l|_\infty \leq L\}$ — full grid space
- if $\mathcal{J} = \{l \in \mathbb{N}^d : |l|_1 \leq L + d - 1\}$ — standard sparse grid space
Hierarchical sparse grids example
L = 5
Interpolation on hierarchical sparse grids

- consider $g : [0,1]^d \to \mathbb{R}$
- the sparse grid interpolant $g_I(u)$ of $g(u)$ is
  $$g_I(u) = \sum_{l \in \mathcal{J}, i \in I_l} \alpha_{l,i}\varphi_{l,i}(u) \quad (1)$$
- $\alpha_{l,i}$ are the so-called hierarchical surpluses
- assume $g \in H^{\mathrm{mix}}_2([0,1]^d) = \{f : [0,1]^d \to \mathbb{R} : D^l f \in L^2([0,1]^d),\ |l|_\infty \leq 2\}$, where $D^l f = \partial^{|l|_1} f / \partial x_1^{l_1} \cdots \partial x_d^{l_d}$
- if full grid: $\|g(u) - g_I(u)\|_{L^2} \in O(h_L^2)$
- if sparse grid: $\|g(u) - g_I(u)\|_{L^2} \in O(h_L^2 L^{d-1})$
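In 1D the interpolant and its surpluses can be sketched in a few lines: each surplus is the function value at a new hierarchical point minus the prediction of all coarser levels (standard hat functions assumed, homogeneous boundary):

```python
def hat(l, i, u):
    # hat function phi_{l,i} with support [(i-1)*2^-l, (i+1)*2^-l]
    return max(0.0, 1.0 - abs(u * 2**l - i))

def surpluses(g, L):
    # hierarchical surpluses alpha_{l,i} of g on [0, 1] up to level L
    alpha = {}
    for l in range(1, L + 1):
        for i in range(1, 2**l, 2):          # odd spatial indices
            u = i * 2.0**(-l)
            coarser = sum(a * hat(ll, ii, u) for (ll, ii), a in alpha.items())
            alpha[(l, i)] = g(u) - coarser
    return alpha

def interpolant(alpha, u):
    return sum(a * hat(l, i, u) for (l, i), a in alpha.items())

g = lambda u: u * (1.0 - u)
alpha = surpluses(g, 6)
err = abs(g(0.3) - interpolant(alpha, 0.3))
print(err)  # small: O(h_L^2) for this smooth g
```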
Piecewise linear basis functions
Piecewise polynomial basis functions
Spatial refinement

- due to the hierarchical construction → local refinement possible
- $\alpha_{l,i}$ — a good measure of the interpolation error
- the absolute value of $\alpha_{l,i}$ — a good refinement criterion
- select the grid points with the largest surplus values
- add their hierarchical descendants to $\mathcal{J}$; if not all hierarchical parents exist, add them
- multiple grid points can be refined in one step
Spatial refinement: Franke's function

$$\begin{aligned}
f(x_1, x_2) ={}& 0.75\exp\left(-\frac{(9x_1-2)^2}{4} - \frac{(9x_2-2)^2}{4}\right) \\
&+ 0.75\exp\left(-\frac{(9x_1+1)^2}{49} - \frac{9x_2+1}{10}\right) \\
&+ 0.5\exp\left(-\frac{(9x_1-7)^2}{4} - \frac{(9x_2-3)^2}{4}\right) \\
&- 0.2\exp\left(-(9x_1-4)^2 - (9x_2-7)^2\right)
\end{aligned}$$
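For reference, a direct NumPy implementation of Franke's function; where the scan is ambiguous, the standard definition is assumed (linear term $(9x_2+1)/10$ in the second exponential, squared last term):

```python
import numpy as np

def franke(x1, x2):
    # standard Franke test function on [0, 1]^2
    return (0.75 * np.exp(-(9*x1 - 2)**2 / 4 - (9*x2 - 2)**2 / 4)
          + 0.75 * np.exp(-(9*x1 + 1)**2 / 49 - (9*x2 + 1) / 10)
          + 0.50 * np.exp(-(9*x1 - 7)**2 / 4 - (9*x2 - 3)**2 / 4)
          - 0.20 * np.exp(-(9*x1 - 4)**2 - (9*x2 - 7)**2))

print(franke(0.5, 0.5))
```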
Franke’s function
Franke's function refinement, part 1: L = 5, refine 20% of the grid points
Franke's function refinement, part 2: L = 6, refine 20% of the grid points
gPC coefficients computation

- remember:
  $$c_n(x) = \int_\Gamma f(x,\theta)\Phi_n(\theta)\rho(\theta)\,\mathrm{d}\theta$$
- how can we use sparse grids? let $T : [0,1]^d \to \Gamma$; then
  $$c_n(x) = \int_{[0,1]^d} f(x,T(u))\Phi_n(T(u))\,|\det J_T(u)|\,\rho(T(u))\,\mathrm{d}u$$
- intuition: $\Phi_n(T(u))$ has a tensor structure; if $f(x,T(u))$ also had a tensor structure ...
gPC coefficients computation

- if
  $$f(x,T(u)) \approx \tilde{f}(x,T(u)) = \sum_{l \in \mathcal{J}, i \in I_l} \alpha_{l,i}\varphi_{l,i}(u)$$
- with $T(u) := (F_1^{-1}(u_1), \dots, F_d^{-1}(u_d))$, $F_i$ the cdf of $\theta_i$ (so that $|\det J_T(u)|\,\rho(T(u)) = 1$)
- then, writing $(n_1, \dots, n_d)$ for the multi-index of $n$,
  $$\begin{aligned}
  c_n(x) &= \int_{[0,1]^d} \tilde{f}(x,T(u))\Phi_n(T(u))\,\mathrm{d}u \\
  &= \int_{[0,1]^d} \left(\sum_{l \in \mathcal{J}, i \in I_l} \alpha_{l,i}(x)\varphi_{l,i}(u)\right)\Phi_n(T(u))\,\mathrm{d}u \\
  &= \sum_{l \in \mathcal{J}, i \in I_l} \alpha_{l,i}(x)\prod_{j=1}^{d}\int_{[0,1]} \Phi_{n_j}(F_j^{-1}(u_j))\,\varphi_{l_j,i_j}(u_j)\,\mathrm{d}u_j
  \end{aligned}$$
Multilevel approaches

- "monolevel approach"
- can we further reduce the computational cost?
- use multilevel approaches
Multilevel stochastic collocation: no refinement
Multilevel stochastic collocation: with refinement
Multilevel gPC coefficients

- let $M_h$ denote the level of the deterministic domain discretization
- let $L_l$ denote the sparse grid level
- let $c_n^{M_h,L_l}(x)$ denote the gPC coefficient computed using a deterministic grid of level $M_h$ and a sparse grid of level $L_l$
- then, for $K+1$ levels:
  $$\begin{aligned}
  c_n^{M_K,L_K}(x) ={}& c_n^{M_0,L_K}(x) \\
  &+ \left(c_n^{M_1,L_{K-1}}(x) - c_n^{M_0,L_{K-1}}(x)\right) \\
  &\;\;\vdots \\
  &+ \left(c_n^{M_K,L_0}(x) - c_n^{M_{K-1},L_0}(x)\right)
  \end{aligned}$$
- if the sparse grids are nested, the grid for $L_{l-1}$ is contained in the grid for $L_l$, so the evaluations needed for $c_n^{M_{K-l},L_{l-1}}(x)$ can be reused for $c_n^{M_{K-l},L_l}(x)$
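The telescoping combination is straightforward to code once a per-level coefficient routine exists. A sketch with a hypothetical `c(m, l)` standing in for a solve at deterministic level $M_m$ and sparse grid level $L_l$; for a toy model whose error splits additively across the two levels, the combination reproduces the finest result exactly:

```python
def multilevel_coefficient(c, K):
    # c_n^{M_K, L_K} ~ c(0, K) + sum_{k=1}^{K} [c(k, K-k) - c(k-1, K-k)]
    total = c(0, K)
    for k in range(1, K + 1):
        total += c(k, K - k) - c(k - 1, K - k)
    return total

# toy "coefficient" with additive level errors: telescoping is exact here
c = lambda m, l: 1.0 - 2.0**(-m) - 3.0**(-l)
print(multilevel_coefficient(c, 2), c(2, 2))
```

In practice the savings come from pairing the coarse deterministic levels with the fine stochastic levels and vice versa.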
Stochastic dimensionality reduction

- each uncertain input has a different contribution to the output uncertainty
- some inputs contribute very little → they can be "ignored" (taken as deterministic)
- use sensitivity information to determine each input's contribution
- in the multilevel scheme, given $K_c < K$ and $\tau \in [0,1]$ (e.g. $\tau = 5\%$):
  - if $S_i^T(x) \leq \tau$, "ignore" input $i$
  - determine the new stochastic dimensionality
  - "project" the computed result onto the new (sparse) grid
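The selection step is a one-liner given the total Sobol' indices. A sketch with hypothetical index values (not results from the talk), keeping an input stochastic only if $S_i^T > \tau$:

```python
tau = 0.05  # threshold, e.g. 5%

# hypothetical total Sobol' indices for five inputs
total_sobol = {"c": 0.55, "k": 0.30, "f": 0.18, "y0": 0.04, "y1": 0.01}

kept = [name for name, s in total_sobol.items() if s > tau]
dropped = [name for name, s in total_sobol.items() if s <= tau]
print(kept, dropped)  # new stochastic dimensionality = len(kept)
```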
Sparse grid projection

- if input $k$, $1 \leq k \leq d$, is "ignored" and $u_k$ is the corresponding deterministic value:
  $$\begin{aligned}
  f(x,T(u)) &= \sum_{l \in \mathcal{J}, i \in I_l} \alpha_{l,i}\varphi_{l,i}(u) \\
  &= \sum_{l \in \mathcal{J}, i \in I_l} \alpha_{l,i}\prod_{j=1}^{d}\varphi_{l_j,i_j}(u_j) \\
  &= \sum_{l \in \mathcal{J}, i \in I_l} \alpha_{l,i}\,\varphi_{l_k,i_k}(u_k)\prod_{\substack{j=1 \\ j \neq k}}^{d}\varphi_{l_j,i_j}(u_j) \\
  &= \sum_{l \in \mathcal{J}', i \in I'_l} \alpha'_{l,i}\varphi'_{l,i}(u)
  \end{aligned}$$
Test scenario

$$\frac{\mathrm{d}^2 y}{\mathrm{d}t^2}(t) + c\,\frac{\mathrm{d}y}{\mathrm{d}t}(t) + k\,y(t) = f\cos(wt), \quad y(0) = y_0, \quad \frac{\mathrm{d}y}{\mathrm{d}t}(0) = y_1$$

- $t \in [0, 20]$, $w = 1.05$
- five uncertain inputs:
  - damping coefficient $c \sim \mathcal{U}(0.08, 0.12)$
  - spring constant $k \sim \mathcal{U}(0.03, 0.04)$
  - forcing amplitude $f \sim \mathcal{U}(0.08, 0.12)$
  - initial position $y_0 \sim \mathcal{U}(0.45, 0.55)$
  - initial velocity $y_1 \sim \mathcal{U}(-0.05, 0.05)$
- underdamped regime
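One deterministic solve of this oscillator can be sketched with classical RK4 at the midpoint parameter values (the RK4 scheme and step size are assumptions for illustration, not the finite-difference discretization used in the talk):

```python
import math

# y'' + c y' + k y = f cos(w t),  y(0) = y0,  y'(0) = y1
c, k, f, w = 0.10, 0.035, 0.10, 1.05   # midpoints of the uniform ranges
y, v = 0.50, 0.00                      # initial position and velocity
h, n = 1e-3, 10_000                    # step size; n*h = t_interest = 10

def rhs(t, y, v):
    # first-order form: (y, v)' = (v, f cos(w t) - c v - k y)
    return v, f * math.cos(w * t) - c * v - k * y

for s in range(n):
    t = s * h
    k1y, k1v = rhs(t, y, v)
    k2y, k2v = rhs(t + h/2, y + h/2*k1y, v + h/2*k1v)
    k3y, k3v = rhs(t + h/2, y + h/2*k2y, v + h/2*k2v)
    k4y, k4v = rhs(t + h, y + h*k3y, v + h*k3v)
    y += h/6 * (k1y + 2*k2y + 2*k3y + k4y)
    v += h/6 * (k1v + 2*k2v + 2*k3v + k4v)

print(y)  # position y(10) at the quantity-of-interest time
```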
![Page 54: Multilevel stochastic collocations with dimensionality reduction - … · 2017. 1. 27. · TUM, Chair of Scientific Computing in Computer Science (I5) 27.01.2017. Outline 1 Motivation](https://reader036.fdocuments.net/reader036/viewer/2022071107/5fe2199c08b91c3b94210435/html5/thumbnails/54.jpg)
Tests setup

sparse grid functionality: SG++¹
finite difference discretization
uniform inputs → Legendre polynomials
modified polynomial basis functions of deg 2
t_interest = 10
reference results with 32768 Gauss-Legendre nodes
multilevel approach with K = 2

c_n^{M2,L2}(x) = c_n^{M0,L2}(x) + (c_n^{M1,L1}(x) − c_n^{M0,L1}(x)) + (c_n^{M2,L0}(x) − c_n^{M1,L0}(x))

when using refinement, L_i, i = 1, 2 means L0 with i refinement steps

¹http://sgpp.sparsegrids.org/
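The K = 2 telescoping combination of coefficient estimates can be sketched generically. This is an illustrative helper, not the SG++ implementation: it assumes `c` is a dict mapping a pair (sample-budget level m for M0 < M1 < M2, stochastic grid level l) to a coefficient vector, so the formula above reads c[(0,2)] + (c[(1,1)] − c[(0,1)]) + (c[(2,0)] − c[(1,0)]).

```python
def multilevel_coefficients(c, K=2):
    """Multilevel combination of gPC coefficient vectors.

    c[(m, l)]: coefficients estimated with sample budget level m
    and stochastic grid level l. Returns the combined estimate
    c[(0, K)] + sum_{k=1..K} (c[(k, K-k)] - c[(k-1, K-k)]).
    """
    result = list(c[(0, K)])  # cheapest budget on the finest grid
    for k in range(1, K + 1):
        fine = c[(k, K - k)]      # richer budget, coarser grid
        coarse = c[(k - 1, K - k)]
        result = [r + (f - g) for r, f, g in zip(result, fine, coarse)]
    return result
```

The pairing of the richest sample budget (M2) with the coarsest grid (L0) is what makes the multilevel estimator cheaper than a single-level one at M2, L2.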
Tests setup

M0 = 500, M1 = 2000, M2 = 8000

reference results:
E_ref[y(10)] = −0.155165
Var_ref[y(10)] = 0.0002267

error measurement:
err = |(qoi_ref − qoi) / qoi_ref|
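The error measure is the plain relative error of a quantity of interest against the reference value; a one-line sketch (function name is illustrative):

```python
def rel_err(qoi_ref, qoi):
    """Relative error |(qoi_ref - qoi) / qoi_ref| of a quantity of
    interest (e.g. the mean or variance of y(10))."""
    return abs((qoi_ref - qoi) / qoi_ref)
```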
Test case 1: no dimensionality reduction

L0   L1    L2     ref %   err exp      err var
11   71    351    -       4.2012e-06   1.3147e-04
71   351   1471   -       7.7759e-07   1.5950e-05
11   31    67     20%     4.0983e-04   7.8192e-04
71   191   423    20%     1.4499e-06   2.8220e-05
71   230   655    30%     7.8551e-07   1.4199e-05
Test case 2: dimensionality reduction

compute Sobol' indices for c_n^{M1,L1}(x), i.e. start with K = 1
τ = 5%
err expectation ∈ O(10^−3)

L0   L1    L2    ref %   S^T_1   S^T_2   S^T_3   S^T_4   S^T_5
5    71    49    -       4.1%    1.2%    56.5%   4.8%    34.0%
17   351   129   -       4.1%    1.2%    56.5%   4.8%    34.0%
5    31    13    20%     4.1%    0.65%   56.7%   4.8%    33.9%
17   191   45    20%     4.1%    1.2%    56.5%   4.8%    34.0%
17   230   67    30%     4.0%    1.2%    56.6%   4.8%    34.0%
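Total Sobol' indices like the S^T_i above come almost for free from gPC coefficients: assuming an orthonormal basis, the variance contribution of each multi-index is its squared coefficient, and S^T_i sums all terms whose multi-index involves dimension i. A minimal sketch (function name and data layout are illustrative, not the SG++ API); inputs with S^T_i below the threshold τ are candidates for removal.

```python
def total_sobol_indices(coeffs, multi_indices, d):
    """Total Sobol' indices from gPC coefficients in an orthonormal
    basis. multi_indices[n] is a length-d tuple of polynomial
    degrees for coeffs[n]; the all-zero tuple is the mean term."""
    # Total variance: sum of squared coefficients, mean term excluded.
    variance = sum(a * a for a, m in zip(coeffs, multi_indices) if any(m))
    st = []
    for i in range(d):
        # All terms whose basis polynomial depends on dimension i.
        num = sum(a * a for a, m in zip(coeffs, multi_indices) if m[i] > 0)
        st.append(num / variance)
    return st
```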
Test case 3: dimensionality reduction

compute Sobol' indices for c_n^{M0,L2}(x) + (c_n^{M1,L1}(x) − c_n^{M0,L1}(x))
τ = 5%
err expectation ∈ O(10^−4)

L0   L1    L2     ref %   S^T_1   S^T_2   S^T_3   S^T_4   S^T_5
5    71    351    -       4.1%    1.2%    56.5%   4.8%    34.0%
17   351   1471   -       4.1%    1.2%    56.5%   4.8%    34.0%
5    31    67     20%     4.1%    1.2%    56.5%   4.8%    34.0%
17   191   423    20%     4.0%    1.2%    56.6%   4.8%    34.0%
17   230   655    30%     4.0%    1.2%    56.6%   4.8%    34.0%
Discussion
from polynomial chaos coefficients we can compute the mean, variance, Sobol' (sensitivity) indices, etc.
spatially adaptive sparse grids are suitable to delay the curse of dimensionality
multilevel ideas can further reduce the computational cost
uncertain inputs that contribute little to the output uncertainty can be ignored
all of these should be used with care
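The first point above is worth making concrete: in an orthonormal gPC basis the mean is the zeroth coefficient and the variance is the sum of the squared remaining coefficients. A two-line sketch under that orthonormality assumption:

```python
def pc_moments(coeffs):
    """Mean and variance from gPC coefficients in an orthonormal
    basis: E = a_0, Var = sum_{n >= 1} a_n^2."""
    return coeffs[0], sum(a * a for a in coeffs[1:])
```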
Thank you for your attention!