Advanced Statistical Computing · 2020-06-18

Advanced Statistical Computing

Fall 2018

Steve Qin


Metropolis-Hastings Algorithm

r = π(y) T(y, x_t) / [π(x_t) T(x_t, y)]


Illustration of Metropolis-Hastings

• At each step, calculate

r = π(x_{t+1}, y_{t+1}) / π(x_t, y_t),

since

T((x_t, y_t), (x_{t+1}, y_{t+1})) = T((x_{t+1}, y_{t+1}), (x_t, y_t)).

For the bivariate Gaussian target with correlation ρ,

r = {1/[2π√(1-ρ²)]} exp{-[x_{t+1}² - 2ρ x_{t+1} y_{t+1} + y_{t+1}²] / [2(1-ρ²)]}
    ÷ {1/[2π√(1-ρ²)]} exp{-[x_t² - 2ρ x_t y_t + y_t²] / [2(1-ρ²)]}

  = exp{-[(x_{t+1}² - 2ρ x_{t+1} y_{t+1} + y_{t+1}²) - (x_t² - 2ρ x_t y_t + y_t²)] / [2(1-ρ²)]}.
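The step above can be sketched in code. A minimal sketch in Python (the deck's own later code example is in R), assuming the bivariate Gaussian target with correlation ρ and a symmetric random-walk proposal, so the T terms cancel and r reduces to a ratio of target densities; the step size, ρ = 0.8, and iteration count are illustrative choices, not from the slides:

```python
import numpy as np

def mh_bivariate_gaussian(rho=0.8, n_iter=20000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    # log target density of the bivariate Gaussian with correlation rho
    def log_pi(v):
        return -(v[0]**2 - 2*rho*v[0]*v[1] + v[1]**2) / (2*(1 - rho**2))
    x = np.zeros(2)
    chain = np.empty((n_iter, 2))
    for t in range(n_iter):
        y = x + step * rng.normal(size=2)   # symmetric proposal T
        r = np.exp(log_pi(y) - log_pi(x))   # acceptance ratio r
        if rng.uniform() < r:               # accept with prob min(1, r)
            x = y
        chain[t] = x
    return chain

chain = mh_bivariate_gaussian()
print(np.corrcoef(chain[5000:].T)[0, 1])    # close to rho after burn-in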


Summary on M-H algorithm

• Needs the density function of the target distribution (not necessarily complete, i.e., only up to a normalizing constant)

• Easy to implement; ideal for a homogeneous set of parameters (not too many)

• Requires burn-in

• Monitor convergence


Gibbs Sampler

• Purpose: draw random samples from a joint distribution (high dimensional)

• Method: iterative conditional sampling

x = (x_1, x_2, ..., x_n) ~ Target π(x)

For each i, draw x_i from π(x_i | x_[-i]).
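The iterative conditional sampling above can be sketched in Python (the deck's own later code example is in R) for the standard textbook case, a bivariate normal with correlation ρ, where both full conditionals are available in closed form; ρ = 0.8 and the chain length are illustrative choices:

```python
import numpy as np

def gibbs_bivariate_normal(rho=0.8, n_iter=20000, seed=0):
    rng = np.random.default_rng(seed)
    x1, x2 = 0.0, 0.0
    s = np.sqrt(1 - rho**2)              # conditional standard deviation
    chain = np.empty((n_iter, 2))
    for t in range(n_iter):
        x1 = rng.normal(rho * x2, s)     # draw x1 | x2 ~ N(rho*x2, 1-rho^2)
        x2 = rng.normal(rho * x1, s)     # draw x2 | x1 ~ N(rho*x1, 1-rho^2)
        chain[t] = (x1, x2)
    return chain

chain = gibbs_bivariate_normal()
print(np.corrcoef(chain[1000:].T)[0, 1])  # close to rho
```
Note that, unlike Metropolis-Hastings, every draw is accepted; the price is that each full conditional must be derived analytically.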


Illustration of Gibbs Sampler


References

• Geman, S. and Geman, D. (1984). Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6, 721-741.

• Gelfand, A. E. and Smith, A. F. M. (1990). Sampling-based approaches to calculating marginal densities. Journal of the American Statistical Association, 85, 398-409.

• Tutorial paper: Casella, G. and George, E. I. (1992). Explaining the Gibbs sampler. The American Statistician, 46, 167-174.


Summary of Gibbs sampler

• Special case of the MH algorithm

• The most popular MCMC method, generally more efficient than MH

• Requires some mathematical derivation

• Verify convergence of the sequence

• Use multiple chains

• Be careful of pseudo Gibbs samplers!


Collapsing, predictive updating


Collapsing and grouping

• Want to sample from π(x), x = (x_1, x_2, ..., x_d).

• Regular Gibbs sampler:

– Sample x_1^(t+1) from π(x_1 | x_2^(t), x_3^(t), ..., x_d^(t)),

– Sample x_2^(t+1) from π(x_2 | x_1^(t+1), x_3^(t), ..., x_d^(t)),

– …

– Sample x_d^(t+1) from π(x_d | x_1^(t+1), x_2^(t+1), ..., x_{d-1}^(t+1)).

• Alternatively:

– Grouping: treat x'_{d-1} = (x_{d-1}, x_d) as a single component,

– Collapsing, i.e., integrate out x_d: work with π(x_1, x_2, ..., x_{d-1}).

The three schemes

[Figure: the standard, grouping, and collapsing schemes]


Some theory

• Hilbert space L²(π) of functions h(·).

• Define ⟨h, g⟩ = E_π[h(x) g(x)]; thus ‖h‖ = √var_π(h).

• Define the forward operator F as

Fh(x) = ∫ K(x, y) h(y) dy = E[h(x^(t+1)) | x^(t) = x].

• ‖F‖ = sup_h ‖Fh(x)‖ over all functions h with E_π[h²(x)] = 1.

• The convergence of Markov chains is tied to the norms of the corresponding forward operators.


Three-scheme theorem

• Standard F_s: x_1 → x_2 → ··· → x_d;

• Grouping F_g: x_1 → x_2 → ··· → (x_{d-1}, x_d);

• Collapsing F_c: x_1 → x_2 → ··· → x_{d-1}.

Theorem The norms of the three forward operators are ordered as

‖F_c‖ ≤ ‖F_g‖ ≤ ‖F_s‖.


Examples

• Murray's data

• Bivariate Gaussian with mean 0 and unknown covariance matrix Σ

standard: draw [y_mis | y_obs, Σ], then [Σ | y_obs, y_mis];

collapsing: draw [y_mis,i | y_obs, y_mis,[-i]].


Remarks

• Avoid introducing unnecessary parameters into a Gibbs sampler.

• Do as much analytical work as possible.

• However, introducing some clever auxiliary variables can greatly improve computational efficiency.


Sequential Monte Carlo


Accept-reject method

• The accept-reject method

1. Generate X ~ g, U ~ Uniform(0,1),

2. Accept Y = X if U ≤ f(X)/(M g(X)),

3. Repeat.
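Steps 1-3 can be sketched in Python (the deck's own later code example is in R). As an illustrative choice not taken from the slides, the target f is N(0,1) and the envelope g is standard Cauchy, for which M = sup f/g ≈ 1.52 (attained at x = ±1):

```python
import numpy as np

def accept_reject_normal(n, seed=0):
    rng = np.random.default_rng(seed)
    M = 1.53                                               # any M >= sup f/g works
    f = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # target density f
    g = lambda x: 1.0 / (np.pi * (1 + x**2))               # Cauchy envelope g
    out = []
    while len(out) < n:                                    # step 3: repeat
        x = rng.standard_cauchy()                          # step 1: X ~ g
        u = rng.uniform()                                  #         U ~ Uniform(0,1)
        if u <= f(x) / (M * g(x)):                         # step 2: accept
            out.append(x)
    return np.array(out)

sample = accept_reject_normal(20000)
print(sample.mean(), sample.std())   # roughly 0 and 1
```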


Accept-reject method example

• Beta(α, β), α ≥ 1, β ≥ 1:

simulate Y ~ Uniform(0,1) and

U ~ Uniform(0, m),

where m is the maximum of the Beta density.

Select X = Y if the point (Y, U) falls under the density curve.

What is the acceptance rate?
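A minimal Python sketch of this example (the deck's own later code example is in R), taking α = β = 2 as an assumed concrete choice: the density is f(x) = 6x(1-x) with maximum m = 1.5 at x = 1/2, so the acceptance rate is the area under f divided by the area of the enclosing box, i.e. 1/m = 2/3:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 1.5                                 # max of the Beta(2,2) density
f = lambda x: 6 * x * (1 - x)           # Beta(2,2) density (assumed alpha=beta=2)
n = 100000
Y = rng.uniform(0, 1, n)                # Y ~ Uniform(0,1)
U = rng.uniform(0, m, n)                # U ~ Uniform(0,m)
accepted = Y[U <= f(Y)]                 # keep the points under the curve
print(len(accepted) / n)                # empirical acceptance rate, about 2/3
print(accepted.mean())                  # Beta(2,2) mean, about 0.5
```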


Importance sampling

• A variance reduction technique.

• Certain values of the input random variables have

more impact on the parameter being estimated than

others.

• If these "important" values are emphasized by

sampling more frequently, then the estimator variance

can be reduced.

• Marshall (1956) suggested that one should focus on the region(s) of “importance” so as to save computational resources.

• Essential in high-dimensional models.


Importance sampling

• Importance sampling: to evaluate

E_f[h(X)] = ∫ h(x) f(x) dx

based on generating a sample X_1, ..., X_m from

a given distribution g and approximating

E_f[h(X)] ≈ (1/m) Σ_{j=1}^m [f(X_j)/g(X_j)] h(X_j),

which is based on

E_f[h(X)] = ∫ h(x) [f(x)/g(x)] g(x) dx.


The algorithm

• To evaluate

μ = E_π[h(X)] = ∫ h(x) π(x) dx:

– Draw x^(1), ..., x^(m) from a trial distribution g(·).

– Calculate the importance weights w^(j) = π(x^(j)) / g(x^(j)), for j = 1, ..., m.

– Approximate μ by

μ̂ = [w^(1) h(x^(1)) + ··· + w^(m) h(x^(m))] / [w^(1) + ··· + w^(m)].

• Remark: μ̂ is better than the unbiased estimator

μ̃ = (1/m) {w^(1) h(x^(1)) + ··· + w^(m) h(x^(m))}.

Why?
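The two estimators can be compared in a Python sketch (the deck's own later code example is in R). The target π = N(0,1), trial g = N(0, 2²), and h(x) = 1{x > 1} are assumed illustrative choices; the true value is P(Z > 1) ≈ 0.1587:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 100000
x = rng.normal(0, 2, m)                           # draws from trial g = N(0, 4)
pi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)       # target density pi = N(0, 1)
g = np.exp(-x**2 / 8) / np.sqrt(2 * np.pi * 4)    # trial density
w = pi / g                                        # importance weights
h = (x > 1).astype(float)
mu_hat = np.sum(w * h) / np.sum(w)    # self-normalized estimator (slightly biased)
mu_tilde = np.mean(w * h)             # unbiased estimator
print(mu_hat, mu_tilde)               # both near P(Z > 1) ~ 0.1587
```
One practical answer to the "why?": μ̂ typically has smaller variance when the weights are noisy, and it only needs π up to a normalizing constant, since the constant cancels in the ratio.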


The basic methodology of

importance sampling

• To choose a distribution which "encourages"

the important values.

• This use of "biased" distributions will result

in a biased estimator if it is applied directly.

• Weight to correct for the use of the biased distribution, ensuring unbiasedness.

• The weight is given by the likelihood ratio f(x)/g(x).


Importance sampling example

• Small tail probabilities:

Z ~ N(0,1), P(Z > 4.5)

naïve: simulate Z_i ~ N(0,1), i = 1, ..., M,

and calculate

P(Z > 4.5) ≈ (1/M) Σ_{i=1}^M I(Z_i > 4.5).


Importance sampling example

Let Y ~ TExp(4.5, 1), the Exp(1) distribution truncated to (4.5, ∞), with density

f_Y(y) = e^{-y} / ∫_{4.5}^∞ e^{-x} dx = e^{-(y-4.5)}, y > 4.5.

Now simulate from f_Y and use importance sampling; we obtain

P(Z > 4.5) ≈ (1/M) Σ_{i=1}^M [φ(Y_i)/f_Y(Y_i)] I(Y_i > 4.5) = .000003377,

where φ is the standard normal density.


Importance sampling example

## theoretical value

p0=1-pnorm(4.5)

## sample directly from normal distribution

## this needs large number of samples

z=rnorm(10000000)

p1=mean(z>4.5)

## importance sampling

n0=10000

Y=rexp(n0, 1)+4.5

a=dnorm(Y)/dexp(Y-4.5)

p2=mean(a[Y>4.5])

c(p0, p1, p2) ##

[1] 3.397673e-06 2.600000e-06 3.418534e-06


Another example

• Both the grid-point method and vanilla Monte Carlo waste resources on the “boring” desert area.


Another example

• Use a proposal function with (x, y) ϵ [-1, 1] x [-1, 1], a truncated mixture of bivariate Gaussians.

Vanilla Monte Carlo: μ̂ = 0.1307, std(μ̂) = 0.009.

Importance sampling: μ̂ = 0.1259, std(μ̂) = 0.0005.


Sequential importance sampling

• For high-dimensional problems, designing the trial distribution is challenging.

• Suppose the target density of x = (x_1, x_2, ..., x_d) can be decomposed as

π(x) = π(x_1) π(x_2 | x_1) ··· π(x_d | x_1, ..., x_{d-1});

then construct the trial density as

g(x) = g_1(x_1) g_2(x_2 | x_1) ··· g_d(x_d | x_1, ..., x_{d-1}).


Sequential importance sampling

This suggests a recursive way of computing and monitoring the importance weight. Denote

w(x) = [π(x_1) π(x_2 | x_1) ··· π(x_d | x_1, ..., x_{d-1})] / [g_1(x_1) g_2(x_2 | x_1) ··· g_d(x_d | x_1, ..., x_{d-1})]

and x_t = (x_1, x_2, ..., x_t); then we have

w_t(x_t) = w_{t-1}(x_{t-1}) π(x_t | x_{t-1}) / g_t(x_t | x_{t-1}).


Sequential importance sampling

• Advantages of the recursion scheme:

– Can stop generating further components of x if the partial weight is too small.

– Can take advantage of π(x_t | x_{t-1}) in designing g_t(x_t | x_{t-1}).

• However, the scheme is impractical since it requires knowledge of the marginal distribution π(x_t).


Sequential importance sampling

• Add another layer of complexity:

• Introduce a sequence of “auxiliary distributions” π_1(x_1), π_2(x_2), ..., π_d(x) such that π_t(x_t) is a reasonable approximation of the marginal distribution π(x_t), for t = 1, ..., d-1, and π_d = π.

• Note the π_t are only required to be known up to a normalizing constant.


The SIS procedure

For t = 2, ..., d:

• Draw X_t = x_t from g_t(x_t | x_{t-1}), and let x_t = (x_{t-1}, x_t).

• Compute

u_t = π_t(x_t) / [π_{t-1}(x_{t-1}) g_t(x_t | x_{t-1})]

and let w_t = w_{t-1} u_t.

• u_t: the incremental weight.

• The key idea is to break a difficult task into manageable pieces.

• If w_t is getting too small, reject.
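The SIS procedure above can be sketched in Python (the deck's own code example is in R) on an assumed toy target: π(x) = Π_t φ(x_t) with d independent N(0,1) components, auxiliary distributions π_t = Π_{s≤t} φ(x_s), and trial g_t = N(0, 2²), so the incremental weight reduces to u_t = φ(x_t)/g_t(x_t). All of these choices are illustrative, not from the slides:

```python
import numpy as np

def sis(h, d=5, m=200000, seed=0):
    rng = np.random.default_rng(seed)
    phi = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)    # target factor
    g = lambda x: np.exp(-x**2 / 8) / np.sqrt(2 * np.pi * 4)  # trial g_t = N(0,4)
    x = np.empty((m, d))
    w = np.ones(m)                       # w_0 = 1 for every path
    for t in range(d):
        x[:, t] = rng.normal(0, 2, m)    # draw x_t from g_t
        w *= phi(x[:, t]) / g(x[:, t])   # w_t = w_{t-1} * u_t
        # (a fuller implementation could drop paths whose partial w_t is tiny)
    return np.sum(w * h(x)) / np.sum(w)  # self-normalized estimate

# estimate P(all components > 0) = 0.5^5 = 0.03125
est = sis(lambda x: np.all(x > 0, axis=1).astype(float))
print(est)
```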


An application example of SIS

• Assume:

– Constant population size N,

– Evolution in non-overlapping generations,

– The chromosomal region is sufficiently small,

– No recombination,

– “haplotype”: each chromosome has only one parent.


Population genetics example

• Notation:

– E: set of all possible genetic types,

– μ: mutation rate per chromosome per generation,

– P = (P_{αβ}): the mutation transition matrix,

– If a parental segment is of type α ϵ E, its progeny is of type α with probability 1 - μ, and of type β with probability μ P_{αβ}.


Example data

• From Stephens and Donnelly (2000)

• E={C,T}

• The history H = (H-k, H-(k-1),…, H-1, H0)

= ({C}, {C,C}, {C,T}, {C,C,T}, {C,T,T}, {T,T,T},

{T,T,T,T}, {C,T,T,T,T}, {C,T,T,T,T})


Coalescence example

• Use H = (H-k,…, H-1, H0) to denote the whole ancestral history (unobserved) of the 5 individuals.

• Compute the likelihood function

p(H) = p(H-k) p(H-(k-1) | H-k) ··· p(H0 | H-1) p(stop | H0),

where p(H-k) = π0(H-k) and π0 is the stationary distribution of P.


Coalescence calculation

• For i = -(k-1),…,0


Notations

• n is the sample size at generation Hi-1,

• nα is the number of chromosomes of type α in the sample,

• θ = 2Nμ/ν²,

• N: the population size,

• ν²: the variance of the number of progeny of a random chromosome.


Strategies to estimate θ

• To get the MLE, we need to compute the likelihood.

• Naïve Monte Carlo won’t work because of the compatibility issue.

• An alternative is to simulate H backward starting from H0 and use weights to correct the bias.


An SIS approach

• Simulate H-1, H-2, …, from a trial distribution built up sequentially by reversing the forward sampling probability at a fixed θ0. That is, for t = 1, ..., k, we have

g(H-t | H-t+1) = p_{θ0}(H-t+1 | H-t) / Σ_{all H'} p_{θ0}(H-t+1 | H'),

and the final trial distribution is

g(H) = g(H-k | H-(k-1)) ··· g(H-1 | H0).


An SIS approach

• By simulating from g(·) multiple copies of the history, H^(j), j = 1, ..., m, we can approximate the likelihood function as

p̂_θ(H0) = (1/m) Σ_{j=1}^m p_θ(H^(j)) / g_{θ0}(H^(j)).

• Note the choice of θ0 can influence the final result.


Other examples of SIS

• Growing a polymer

– Self-avoiding walk

• Sequential imputation for statistical missing-data problems.

• For more details on these examples, see Liu (2001).