Marginalized Samplers for Normalized Random Measure Mixture Models
Yee Whye Teh and Stefano Favaro
University of Oxford and University of Torino, 2013

Page 1:

Marginalized Samplers for Normalized Random Measure Mixture Models

Yee Whye Teh and Stefano Favaro

University of Oxford and University of Torino

2013

1 / 40

Page 2:

Outline

Dirichlet Process Mixture Models

Normalized Random Measures

MCMC Samplers for NRM Mixture Models

Numerical Illustrations

2 / 40

Page 3:

Outline

Dirichlet Process Mixture Models

Normalized Random Measures

MCMC Samplers for NRM Mixture Models

Numerical Illustrations

3 / 40

Page 4:

Dirichlet Process

- Random probability measure µ ∼ DP(α, H).
- For each partition (A1, . . . , Am),

  (µ(A1), . . . , µ(Am)) ∼ Dirichlet(αH(A1), . . . , αH(Am))

- Draws from Dirichlet processes are discrete probability measures,

  µ = ∑_{k=1}^∞ wk δ_{φ∗k}

  where wk, φ∗k are random.
- Large support over the space of probability measures.
- Analytically tractable posterior distribution.

[Ferguson 1973] 4 / 40
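The discreteness of µ can be made concrete with Sethuraman's stick-breaking construction of the weights wk. Below is a minimal illustrative sketch, not from the slides; the function name and the truncation level are our own choices:

```python
import random

def stick_breaking_weights(alpha, n_atoms, seed=0):
    """Truncated stick-breaking construction of DP weights:
    w_k = b_k * prod_{j<k} (1 - b_j), with b_k ~ Beta(1, alpha)."""
    rng = random.Random(seed)
    weights, stick = [], 1.0
    for _ in range(n_atoms):
        b = rng.betavariate(1.0, alpha)  # fraction of the remaining stick
        weights.append(stick * b)
        stick *= 1.0 - b
    return weights

w = stick_breaking_weights(alpha=2.0, n_atoms=1000)
# The truncated weights sum to just under 1; the missing mass is the
# (tiny) remainder of the stick after 1000 breaks.
```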

Page 5:

Dirichlet Process Mixture Models

- A hierarchical model, with sample x1:n = (x1, . . . , xn):

  µ ∼ DP(α, H)
  φi | µ ∼ µ
  xi | φi ∼ F(φi)

- The discrete nature of µ induces repeated values among φ1:n.
- This induces a partition π of the observations x1:n.
- Each cluster c ∈ π corresponds to a distinct value φ∗c.
- Leads to a clustering model with a varying/infinite number of clusters.
- The properties of the model for cluster analysis depend on the properties of the induced random partition π (a CRP).
- Generalisations of DPs allow for more flexible prior specifications.

[Antoniak 1974, Lo 1984] 5 / 40
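The induced random partition π (the CRP) is easy to simulate directly. A minimal sketch, assuming the standard CRP seating probabilities |c|/(i + α) for an occupied cluster and α/(i + α) for a new one; the function name is our own:

```python
import random

def crp_partition(n, alpha, seed=1):
    """Sample the cluster sizes of a CRP partition of n items:
    item i+1 joins cluster c w.p. |c|/(i + alpha), or starts a
    new cluster w.p. alpha/(i + alpha)."""
    rng = random.Random(seed)
    sizes = []
    for i in range(n):
        r = rng.random() * (i + alpha)
        for c in range(len(sizes)):
            r -= sizes[c]
            if r < 0:
                sizes[c] += 1
                break
        else:
            sizes.append(1)  # r fell in the alpha-sized slice: new cluster
    return sizes

sizes = crp_partition(100, alpha=1.0)
```

With α = 1 the expected number of clusters grows like log n, so a partition of 100 items typically has only a handful of clusters.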

Page 6:

MCMC Inference in DP Mixtures

- Conditional samplers [Ishwaran and James 2001, Walker 2007, Kalli and Walker 2006, Papaspiliopoulos and Roberts 2008]
  - Simulate from the joint posterior of µ and φ1:n.
  - The difficulty is in the infinite nature of µ, requiring truncation and retrospective sampling techniques.
  - Easier to understand, parallelizable.
- Marginal samplers [Escobar and West 1995, Bush and MacEachern 1996, Neal 2000]
  - Marginalize out µ, and simulate from the posterior for π and {φ∗c}:

    π ∼ CRP(α)
    φ∗c ∼ H for c ∈ π
    xi | π, {φ∗c} ∼ F(φ∗c) for c ∈ π with i ∈ c

  - Generally better mixing.
  - So far developed only for models with analytically known EPPFs.

6 / 40

Page 7:

Outline

Dirichlet Process Mixture Models

Normalized Random Measures

MCMC Samplers for NRM Mixture Models

Numerical Illustrations

7 / 40

Page 8:

Completely Random Measures

- A completely random measure (CRM) ν is a random measure such that

  ν(A) ⊥⊥ ν(B)

  whenever A and B are disjoint sets.
- The CRM can always be decomposed into 3 components:

  ν = ν0 (fixed measure) + ∑_{j=1}^∞ vj δ_{η∗j} (fixed atoms) + ∑_{k=1}^∞ wk δ_{φ∗k} (random atoms)

  - ν0, {η∗j} are not random,
  - {vj} are mutually independent and independent of {wk, φ∗k},
  - {(wk, φ∗k)} is drawn from a Poisson process over R+ × Φ with rate measure ρ(w, φ) dw dφ (the Lévy measure).
- In most modelling applications, only the third component is required.

[Kingman 1967] 8 / 40

Page 9:

Completely Random Measures

[Figure: atoms (wk, φ∗k) of a CRM, drawn from a Poisson process with rate measure ρ(w, φ) over weights w and locations φ.]

9 / 40

Page 10:

The Lévy Measure

  ν = ∑_{k=1}^∞ wk δ_{φ∗k}    {(wk, φ∗k)} ∼ Poisson(ρ)

- Homogeneous CRMs have ρ(w, φ) = ρ(w)h(φ), which implies:

  {wk} ∼ Poisson(ρ)    φ∗k ∼ H

- Want ν to have infinitely many atoms:

  ⇒ ∫_0^∞ ρ(w) dw = ∞

- Want ν to have finite total mass:

  ⇒ ∫_0^∞ (1 − e^{−w}) ρ(w) dw < ∞

[Figure: an example Lévy intensity ρ(w), diverging as w → 0 and decaying as w grows.]

10 / 40

Page 11:

Normalized Random Measures

- Normalizing a CRM gives a normalized random measure (NRM):

  µ = ν / ν(Φ)

- µ is a random discrete probability measure.
- To study the random partition structure, it suffices to assume:
  - no fixed measure (ν0 = 0),
  - no fixed atoms (vj = 0),
  - homogeneous NRMs, ρ(w, φ) = ρ(w)h(φ).
- Examples: DP (normalized gamma process), normalized stable process, normalized inverse Gaussian process.

[Regazzini et al. 2003, Nieto-Barajas et al. 2004, James et al. 2009] 11 / 40

Page 12:

Normalized Generalized Gamma Process

- A large family is obtained by normalizing the generalized gamma process:

  ρσ,α,τ(w) = (α / Γ(1 − σ)) w^{−σ−1} e^{−τw}

- It also has power-law properties (like the Pitman-Yor).
- Specializes to the DP when σ = 0.
- Specializes to the normalised stable process when τ = 0, α = σ.
- Specializes to the normalized inverse Gaussian process when σ = 1/2.

[Brix 1999, James 2002] 12 / 40

Page 13:

NRM Hierarchical Model

  ν ∼ CRM(ρ, H)
  µ = ν / ν(Φ)
  φi | µ ∼ µ for i = 1, . . . , n

- Let φ∗1, . . . , φ∗K be the K unique values among φ1:n, with φ∗k occurring nk times.
- Intuitively,

  ν | φ1:n = ν∗ + ∑_{k=1}^K vk δ_{φ∗k}

  p(φ1:n | ν) = [∏_{k=1}^K ν({φ∗k})^{nk} h(φ∗k)] / ν(Φ)^n

[Figure: the atoms of ν, and of ν | φ1:n with the K fixed atoms vk δ_{φ∗k} singled out.]

13 / 40

Page 14:

Data Augmentation

  p(φ1:n | ν) = [∏_{k=1}^K ν({φ∗k})^{nk} h(φ∗k)] / ν(Φ)^n
             = ∏_{k=1}^K ν({φ∗k})^{nk} h(φ∗k) ∫_0^∞ (u^{n−1} e^{−u ν(Φ)} / Γ(n)) du

- Introducing an auxiliary variable u:

  p(φ1:n, u | ν) = (u^{n−1} / Γ(n)) e^{−u(ν∗(Φ) + ∑_{k=1}^K vk)} ∏_{k=1}^K vk^{nk} h(φ∗k)

[James 2002, James et al. 2009] 14 / 40

Page 15:

NRM Marginal Characterisation

  p(φ1:n, u | ν) = (u^{n−1} / Γ(n)) e^{−u ν∗(Φ)} ∏_{k=1}^K vk^{nk} e^{−u vk} h(φ∗k)

  p(φ1:n, u) = E[p(φ1:n, u | ν)] = (u^{n−1} / Γ(n)) e^{−ψ(u)} ∏_{k=1}^K κ(u, nk) h(φ∗k)

The Laplace exponent of ρ is

  ψ(u) = − log E[e^{−u ν(Φ)}] = ∫_0^∞ (1 − e^{−uw}) ρ(w) dw

The mth moment of the u-exponentially tilted Lévy measure is

  κ(u, m) = ∫_0^∞ w^m e^{−uw} ρ(w) dw

[James 2002, James et al. 2009] 15 / 40
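For the generalized gamma intensity ρσ,α,τ of the earlier slide, both ψ(u) and κ(u, m) have well-known closed forms. The sketch below states them and checks two identities implied by the definitions above; it is an illustration with our own function names, not code from the slides:

```python
import math

def psi_gg(u, sigma, alpha, tau):
    """Laplace exponent of the generalized gamma CRM (0 < sigma < 1):
    psi(u) = (alpha/sigma) * ((u + tau)**sigma - tau**sigma)."""
    return (alpha / sigma) * ((u + tau) ** sigma - tau ** sigma)

def kappa_gg(u, m, sigma, alpha, tau):
    """m-th moment of the u-tilted intensity:
    kappa(u, m) = alpha * Gamma(m - sigma)/Gamma(1 - sigma) * (u + tau)**(sigma - m)."""
    return (alpha * math.gamma(m - sigma) / math.gamma(1 - sigma)
            * (u + tau) ** (sigma - m))

# Two sanity checks implied by the definitions:
# (i)  kappa(u, 1) equals the derivative of psi at u;
# (ii) as sigma -> 0, psi(u) -> alpha * log(1 + u/tau), the DP case.
u, alpha, tau, eps = 1.3, 2.0, 0.7, 1e-6
dpsi = (psi_gg(u + eps, 0.5, alpha, tau)
        - psi_gg(u - eps, 0.5, alpha, tau)) / (2 * eps)
```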

Page 16:

Outline

Dirichlet Process Mixture Models

Normalized Random Measures

MCMC Samplers for NRM Mixture Models

Numerical Illustrations

16 / 40

Page 17:

NRM Mixture Model

  ν ∼ CRM(ρ, H)
  µ = ν / ν(Φ)
  For i = 1, . . . , n:  φi | µ ∼ µ,  xi | φi ∼ F(φi)

- The distinct values among φ1:n induce a partition π.
- Let φ∗c be the distinct value associated with cluster c ∈ π.
- Conditional and Pólya-urn-based samplers have been developed in the literature [James et al. 2009, Nieto-Barajas and Prünster 2009, Griffin and Walker 2011, Favaro and Walker 2011, Barrios et al. 2013].
- Marginalized samplers operate on the joint distribution of π, {φ∗c} and x1:n.

17 / 40

Page 18:

NRM Marginal Sampler: Conjugate Case

  p(π, u, x1:n) = (u^{n−1} e^{−ψ(u)} / Γ(n)) ∏_{c∈π} ( κ(u, |c|) ∫_Φ h(φ∗c) ∏_{i∈c} f(xi | φ∗c) dφ∗c )

- Gibbs sampling:
  - Marginalize out the cluster parameters {φ∗c}.
  - Update u given π using slice sampling or MH.
  - Gibbs sample the cluster assignment of each item xi in turn:

    p(i ∈ c | π\i, x1:n) ∝
      (κ(u, |c|+1) / κ(u, |c|)) ∫ f(xi | φ∗c) h(φ∗c | (xj)j∈c) dφ∗c   for c ∈ π\i,
      κ(u, 1) ∫ f(xi | φ∗c) h(φ∗c) dφ∗c                               for c a new cluster.

18 / 40

Page 19:

NRM Marginal Sampler: Conjugate Case

- Normalised generalised gamma processes:

  p(i ∈ c | π\i, x1:n) ∝
    (|c| − σ) ∫ f(xi | φ∗c) h(φ∗c | (xj)j∈c) dφ∗c   for c ∈ π\i,
    α(u + τ)^σ ∫ f(xi | φ∗c) h(φ∗c) dφ∗c            for c a new cluster.

19 / 40

Page 20:

NRM Marginal Sampler: Non-Conjugate Case

- Cannot marginalize out the cluster parameters {φ∗c}.

  p(π, u, {φ∗c}, x1:n) ∝ (u^{n−1} e^{−ψ(u)} / Γ(n)) ∏_{c∈π} ( κ(u, |c|) h(φ∗c) ∏_{i∈c} f(xi | φ∗c) )

- Gibbs sample the cluster assignment of item xi:

  p(i ∈ c | π\i, x1:n) ∝
    (κ(u, |c|+1) / κ(u, |c|)) f(xi | φ∗c)        for c ∈ π\i,
    κ(u, 1) ∫ f(xi | φ∗c) h(φ∗c) dφ∗c            for c a new cluster.

- The integral for new clusters is expensive to evaluate.
- When a singleton cluster is emptied, its parameter is discarded.

20 / 40

Page 21:

NRM Marginal Sampler: Neal's Algorithm 8

- Framed as a data augmentation scheme:
  - Introduce M "new" clusters, with parameters ψk ∼ H for k = 1, . . . , M.
  - Drawn before each Gibbs update.
  - Exist only during the Gibbs update, and are discarded afterwards.
  - The parameter of an emptied cluster is used as the parameter of one of the new clusters.

  p(i ∈ c | π\i, x1:n) ∝
    (κ(u, |c|+1) / κ(u, |c|)) f(xi | φ∗c)   for c ∈ π\i,
    (κ(u, 1) / M) f(xi | ψk)                for c = k ∈ {1, . . . , M}.

[Neal 2000] 21 / 40
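The update above reduces to simple arithmetic once the likelihood values are in hand. A sketch of the normalized reassignment probabilities for the NGGP case, dropping the common factor 1/(u + τ) as on the previous slides; the function name and its inputs are our own:

```python
def neal8_probs(sizes, lik_existing, lik_new, u, sigma, alpha, tau):
    """Gibbs reassignment probabilities for one item under Neal's
    Algorithm 8 with NGGP weights (common factor 1/(u + tau) dropped):
      existing cluster c:  (|c| - sigma) * f(x_i | phi*_c)
      fresh parameter k:   alpha * (u + tau)**sigma / M * f(x_i | psi_k)."""
    M = len(lik_new)
    w = [(n - sigma) * l for n, l in zip(sizes, lik_existing)]
    w += [alpha * (u + tau) ** sigma / M * l for l in lik_new]
    z = sum(w)
    return [wi / z for wi in w]

# With sigma = 0 this recovers the familiar DP weights |c| and alpha/M.
p = neal8_probs([5, 3], [0.2, 0.1], [0.05, 0.05],
                u=1.0, sigma=0.0, alpha=1.0, tau=1.0)
```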

Page 22:

NRM Marginal Sampler: Neal's Algorithm 8

[Figure: item xi and clusters of sizes n1 = 5 (φ∗1), n2 = 3 (φ∗2), n3 = 1 (φ∗3).]

22 / 40

Page 23:

NRM Marginal Sampler: Neal's Algorithm 8

[Figure: xi unassigned; clusters n1 = 5 (φ∗1), n2 = 3 (φ∗2), n3 = 0 (φ∗3).]

22 / 40

Page 24:

NRM Marginal Sampler: Neal's Algorithm 8

[Figure: the emptied cluster's parameter φ∗3 joins the new-cluster parameters ψ2, ψ3, ψ4 while xi is reassigned.]

22 / 40

Page 27:

NRM Marginal Sampler: Neal's Algorithm 8

[Figure: xi has opened a new cluster, n3 = 1, taking the new-cluster parameter ψ2.]

22 / 40

Page 28:

NRM Marginal Sampler: Reuse Algorithm

- Computationally expensive to generate many parameters from the base distribution h.
- Would like to somehow reuse unused parameters.
- A transdimensional algorithm:
  - Augment the state space permanently with M new clusters.
  - Reversible jump Metropolis-Hastings updates [Green 1995].

23 / 40

Page 29:

NRM Marginal Sampler: Reuse Algorithm

[Figure: item xi, clusters n1 = 5 (φ∗1), n2 = 3 (φ∗2), n3 = 1 (φ∗3), and reusable parameters ψ1, ψ2, ψ3, ψ4.]

- Augment the state space with M new clusters.
- Unassign xi; if its current cluster is a singleton,
  - replace the parameter of a randomly chosen new cluster with its parameter.
- Reassign the cluster assignment of xi.
- If xi is assigned to a new cluster,
  - create a cluster with that parameter,
  - generate a new parameter from the base distribution.
- Acceptance probability is always one.

24 / 40
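The steps above can be sketched as one Gibbs update. This is an illustrative sketch, not the authors' code: the data structures, the likelihood `f(x, phi)`, and the base-distribution draw `sample_H()` are placeholders of our own, with NGGP weights as on the earlier slides:

```python
import random

def reuse_update(i, x, assign, clusters, pool, f, sample_H,
                 u, sigma, alpha, tau, rng):
    """One Reuse-Algorithm Gibbs update of item i's cluster assignment.
    assign: item -> cluster id; clusters: id -> parameter;
    pool: the M permanently maintained reusable parameters."""
    c_old = assign[i]
    assign[i] = None
    sizes = {}
    for a in assign:
        if a is not None:
            sizes[a] = sizes.get(a, 0) + 1
    if c_old not in sizes:                    # emptied a singleton cluster:
        slot = rng.randrange(len(pool))       # recycle its parameter into a
        pool[slot] = clusters.pop(c_old)      # randomly chosen pool slot
    ids = list(sizes)
    M = len(pool)
    w = [(sizes[c] - sigma) * f(x[i], clusters[c]) for c in ids]
    w += [alpha * (u + tau) ** sigma / M * f(x[i], phi) for phi in pool]
    r = rng.random() * sum(w)                 # roulette-wheel selection
    for k, wk in enumerate(w):
        r -= wk
        if r < 0:
            break
    if k < len(ids):
        assign[i] = ids[k]                    # joins an existing cluster
    else:
        cid = max(clusters, default=-1) + 1   # opens a new cluster with a
        clusters[cid] = pool[k - len(ids)]    # pool parameter, then refreshes
        pool[k - len(ids)] = sample_H()       # the used slot from H
        assign[i] = cid

# Tiny demo: item 2 sits alone in cluster 1, so its parameter is recycled.
rng = random.Random(3)
assign, clusters, pool = [0, 0, 1], {0: 'a', 1: 'b'}, ['p', 'q']
reuse_update(2, [0.0, 0.0, 0.0], assign, clusters, pool,
             f=lambda xi, phi: 1.0, sample_H=lambda: 'fresh',
             u=1.0, sigma=0.1, alpha=1.0, tau=1.0, rng=rng)
```

Note that the move is a deterministic-acceptance reversible jump step, matching the slide's "acceptance probability always one".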

Page 30:

NRM Marginal Sampler: Reuse Algorithm

[Figure: xi unassigned; n3 = 0, so cluster 3 is emptied.]

24 / 40

Page 31:

NRM Marginal Sampler: Reuse Algorithm

[Figure: the emptied cluster's parameter φ∗3 replaces the randomly chosen reusable parameter ψ2.]

24 / 40

Page 32:

NRM Marginal Sampler: Reuse Algorithm

[Figure: xi is reassigned among the occupied clusters and the reusable parameters ψ1, φ∗3, ψ3, ψ4.]

24 / 40

Page 34:

NRM Marginal Sampler: Reuse Algorithm

[Figure: xi opens a new cluster, n3 = 1, with parameter ψ1.]

24 / 40

Page 35:

NRM Marginal Sampler: Reuse Algorithm

[Figure: the used slot is refreshed with a fresh draw ψ′1 from the base distribution.]

24 / 40

Page 37:

Outline

Dirichlet Process Mixture Models

Normalized Random Measures

MCMC Samplers for NRM Mixture Models

Numerical Illustrations

25 / 40

Page 38:

NRM Mixture of Normals

- One-dimensional examples:
  - Galaxy (n = 82)
  - Acidity (n = 155)
- Multi-dimensional examples:
  - Old Faithful (p = 2, n = 272)
  - Neural spike sorting (p = 6, n = 1000, 2000)

  [Richardson and Green 1997]

- Non-conjugate prior over the mean and covariance of the normals:

  m ∼ N(m0, S0)    Σ ∼ IW(α0, Σ0)

- Hierarchical prior Σ0 ∼ IW(β0, γ0 S0).
- Weakly informative, using prior knowledge of the data range.
- In the 1D case this reduces to the prior used in [Richardson and Green 1997].

26 / 40

Page 39:

Efficiency Evaluation

- 10000 burn-in iterations; 10000 samples collected from 200000 iterations.
- Effective sample size (ESS) of the number of clusters K, computed using Coda.
- Reported: mean ESS and standard error over 10 repeats.
- Compared:
  - conjugate marginalized sampler,
  - Neal's Algorithm 8 marginalized sampler,
  - Reuse Algorithm marginalized sampler,
  - slice sampler based on the posterior representation
    - a variation on [Griffin and Walker 2011],
    - truncation required.

27 / 40
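For reference, the quantity being compared can be estimated from a chain's autocorrelations. A minimal ESS sketch (a simple truncated-autocorrelation variant for illustration; coda's `effectiveSize` uses a spectral-density estimate instead):

```python
import random

def ess(chain):
    """Effective sample size n / (1 + 2 * sum of autocorrelations),
    truncating the sum at the first non-positive estimate."""
    n = len(chain)
    mu = sum(chain) / n
    dev = [c - mu for c in chain]
    var = sum(d * d for d in dev) / n
    acsum = 0.0
    for lag in range(1, n):
        rho = sum(dev[t] * dev[t + lag] for t in range(n - lag)) / (n * var)
        if rho <= 0:
            break
        acsum += rho
    return n / (1.0 + 2.0 * acsum)

rng = random.Random(0)
iid = [rng.gauss(0, 1) for _ in range(2000)]
ar = [0.0]
for _ in range(1999):
    ar.append(0.9 * ar[-1] + rng.gauss(0, 1))  # strongly autocorrelated chain
```

An i.i.d. chain has ESS close to its length, while the AR(1) chain's ESS is an order of magnitude smaller; the tables below report exactly this kind of contrast between samplers.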

Page 40:

Galaxy Dataset

[Figure: predictive density and co-clustering plots, with posterior histograms of σ, log(a) and log(β0).]

28 / 40

Page 41:

Acidity Dataset

[Figure: predictive density and co-clustering plots, with posterior histograms of σ, log(a) and log(β0).]

29 / 40

Page 42:

Comparative Results: Galaxy and Acidity Datasets (Conjugate Model)

  Sampler             | Galaxy: Runtime (s) | Galaxy: ESS | Acidity: Runtime (s) | Acidity: ESS
  Cond Slice          | 239.1 ± 4.2 | 2004 ± 178 | 196.5 ± 1.0 | 910 ± 142
  Marg (C=1)          | 215.7 ± 1.4 | 7809 ± 87  | 395.5 ± 1.7 | 5236 ± 181
  Cond Slice          | 133.0 ± 3.2 | 1594 ± 117 | 77.4 ± 0.7  | 1099 ± 49
  Marg Neal 8 (C=1)   | 74.4 ± 0.6  | 5815 ± 145 | 133.3 ± 1.8 | 4175 ± 85
  Marg Neal 8 (C=2)   | 87.9 ± 0.6  | 6292 ± 94  | 163.8 ± 1.5 | 4052 ± 158
  Marg Neal 8 (C=3)   | 101.9 ± 0.7 | 6320 ± 137 | 188.2 ± 1.1 | 4241 ± 99
  Marg Neal 8 (C=4)   | 115.9 ± 0.6 | 6283 ± 86  | 216.6 ± 1.7 | 4266 ± 122
  Marg Neal 8 (C=5)   | 130.0 ± 0.6 | 6491 ± 203 | 243.8 ± 2.0 | 4453 ± 123
  Marg Reuse (C=1)    | 64.3 ± 0.3  | 4451 ± 79  | 114.6 ± 2.0 | 3751 ± 65
  Marg Reuse (C=2)    | 67.6 ± 0.5  | 5554 ± 112 | 123.1 ± 1.9 | 4475 ± 110
  Marg Reuse (C=3)    | 71.3 ± 0.5  | 5922 ± 157 | 128.2 ± 2.2 | 4439 ± 158
  Marg Reuse (C=4)    | 74.9 ± 0.5  | 6001 ± 101 | 140.1 ± 1.6 | 4543 ± 108
  Marg Reuse (C=5)    | 78.7 ± 0.6  | 6131 ± 124 | 147.7 ± 1.5 | 4585 ± 116

30 / 40

Page 43:

Comparative Results: Galaxy and Acidity Datasets (Non-Conjugate Model)

  Sampler             | Galaxy: Runtime (s) | Galaxy: ESS | Acidity: Runtime (s) | Acidity: ESS
  Cond Slice          | 75.5 ± 1.2  | 939 ± 92   | 50.9 ± 0.5  | 949 ± 70
  Marg Neal 8 (C=1)   | 65.0 ± 0.5  | 4313 ± 172 | 110.9 ± 0.8 | 4144 ± 64
  Marg Neal 8 (C=2)   | 78.6 ± 0.4  | 4831 ± 168 | 139.2 ± 1.8 | 4290 ± 125
  Marg Neal 8 (C=3)   | 92.5 ± 0.5  | 4785 ± 97  | 162.7 ± 0.9 | 4368 ± 72
  Marg Neal 8 (C=4)   | 106.3 ± 0.5 | 4849 ± 120 | 187.6 ± 1.1 | 4234 ± 142
  Marg Neal 8 (C=5)   | 119.7 ± 0.6 | 5029 ± 89  | 215.4 ± 1.3 | 4144 ± 213
  Marg Reuse (C=1)    | 55.2 ± 0.5  | 3830 ± 103 | 91.3 ± 0.9  | 4007 ± 122
  Marg Reuse (C=2)    | 58.7 ± 0.5  | 4286 ± 101 | 98.1 ± 0.9  | 4192 ± 138
  Marg Reuse (C=3)    | 62.4 ± 0.6  | 4478 ± 124 | 105.1 ± 0.9 | 4260 ± 136
  Marg Reuse (C=4)    | 66.1 ± 0.5  | 4825 ± 63  | 112.3 ± 1.0 | 4191 ± 139
  Marg Reuse (C=5)    | 69.8 ± 0.6  | 4755 ± 141 | 121.0 ± 1.8 | 4186 ± 121

31 / 40

Page 44:

Comparative Results: Old Faithful and Spike Sorting Datasets (Non-Conjugate Model)

  Sampler             | Old Faithful: Runtime (s) | Old Faithful: ESS | Spike Sorting: Runtime (s) | Spike Sorting: ESS
  Cond Slice          | 142.6 ± 1.1 | 574 ± 36   | 732.6 ± 8.1   | 17.1 ± 2.3
  Marg Reuse (C=1)    | 208.0 ± 1.3 | 2770 ± 209 | 1120.3 ± 8.8  | 35.7 ± 2.4
  Marg Reuse (C=2)    | 225.3 ± 1.4 | 3236 ± 73  | 1164.5 ± 5.4  | 46.9 ± 2.9
  Marg Reuse (C=3)    | 241.5 ± 1.3 | 3148 ± 71  | 1204.1 ± 7.3  | 57.0 ± 3.9
  Marg Reuse (C=4)    | 257.7 ± 1.7 | 3291 ± 145 | 1238.5 ± 7.8  | 61.4 ± 3.3
  Marg Reuse (C=5)    | 274.8 ± 1.7 | 3144 ± 70  | 1291.8 ± 7.9  | 69.8 ± 4.9
  Marg Reuse (C=10)   | 356.3 ± 2.5 | 3080 ± 135 | 1513.8 ± 11.9 | 90.8 ± 5.6
  Marg Reuse (C=15)   | 446.6 ± 4.9 | 3312 ± 154 | 1746.3 ± 10.7 | 95.9 ± 4.2
  Marg Reuse (C=20)   | 550.4 ± 3.5 | 3336 ± 109 | 1944.0 ± 14.7 | 114.5 ± 8.4

32 / 40

Page 45:

Spike Sorting Dataset

[Figure: scatter plots of the spike data with clusters labelled 1–8; cluster 7 is split into 7a and 7b.]

33 / 40

Page 46:

Spike Sorting Dataset

[Figure: the recovered clusters, with sizes 1: 319, 2: 62, 3: 40, 4: 270, 5: 44, 6: 306, 7: 826 (7a: 267, 7b: 525), 8: 123.]

34 / 40

Page 47:

Conclusion

- Marginalised samplers for NRMs are more efficient than conditional slice samplers.
- Simple algorithms, introducing an additional auxiliary variable u.
- Pitman-Yor processes are not normalised random measures.
  - Marginalised samplers exist for all σ-stable Poisson-Kingman mixture models (including Pitman-Yor) (Lomeli et al.).
- Motivation for normalised random measures?
  - Power-law properties.
  - Dependent normalised random measures.

35 / 40

Page 48:

Normalized Gamma Process

- A gamma process is a CRM with Lévy measure

  ρα,τ(w, φ) = α w^{−1} e^{−τw} h(φ)

- The gamma process has gamma marginals:

  ν(A) ∼ Gamma(α ∫_A h(φ) dφ, τ)

- Normalizing a gamma process gives a Dirichlet process, with mass parameter α and base distribution H with density h.

36 / 40

Page 49:

Normalized Stable Process

- A stable process is a CRM with Lévy measure

  ρσ(w) = (σ / Γ(1 − σ)) w^{−σ−1}

- It has positive stable marginals with index σ.
- Normalizing a stable process, and deriving the induced exchangeable partition process, leads to the following random partition:
  - Customer 1 sits at the first table.
  - Subsequent customer n + 1:
    - sits at table c with probability (|c| − σ)/n,
    - sits at a new table with probability |π|σ/n.
- Related to the two-parameter Poisson-Dirichlet process (aka Pitman-Yor process).

[Perman et al. 1992, Pitman and Yor 1997] 37 / 40
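The seating rule above is easy to simulate; note that the unnormalized weights sum to n, since ∑_c (|c| − σ) + |π|σ = n. A minimal sketch (the function name and seed are our own):

```python
import random

def stable_partition(n, sigma, seed=0):
    """Seating rule of the normalized stable process: customer n+1 joins
    table c w.p. (|c| - sigma)/n, or a new table w.p. |pi|*sigma/n."""
    rng = random.Random(seed)
    tables = [1]                      # customer 1 sits at the first table
    for i in range(1, n):
        r = rng.random() * i          # total unnormalized weight is i
        for t in range(len(tables)):
            r -= tables[t] - sigma
            if r < 0:
                tables[t] += 1
                break
        else:
            tables.append(1)          # remaining mass |pi|*sigma: new table
    return tables

tables = stable_partition(500, sigma=0.5)
```

With σ = 0.5 the number of tables grows roughly like √n, the power-law behaviour mentioned for this family.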

Page 50:

NRM Posterior Characterisation

  ν | φ1:n, u = ν∗ + ∑_{k=1}^K vk δ_{φ∗k}

  p(φ1:n, u | ν) = (u^{n−1} e^{−u ν∗(Φ)} / Γ(n)) ∏_{k=1}^K vk^{nk} e^{−u vk}

  ν∗ | φ1:n ∼ CRM(ρ∗, H),    ρ∗(w, φ) = e^{−uw} ρ(w)

  p(vk | φ1:n, u) ∝ vk^{nk} e^{−u vk} ρ(vk)

[James et al. 2009] 38 / 40

Page 51:

References I

- Antoniak, C. E. (1974). Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems. Annals of Statistics, 2(6):1152–1174.
- Barrios, E., Lijoi, A., Nieto-Barajas, L. E., and Prünster, I. (2013). Modeling with normalized random measure mixture models. Unpublished manuscript.
- Brix, A. (1999). Generalized gamma measures and shot-noise Cox processes. Advances in Applied Probability, 31:929–953.
- Bush, C. A. and MacEachern, S. N. (1996). A semiparametric Bayesian model for randomised block designs. Biometrika, 83:275–285.
- Escobar, M. D. and West, M. (1995). Bayesian density estimation and inference using mixtures. Journal of the American Statistical Association, 90:577–588.
- Favaro, S. and Walker, S. G. (2011). Slice sampling σ-stable Poisson-Kingman mixture models.
- Ferguson, T. S. (1973). A Bayesian analysis of some nonparametric problems. Annals of Statistics, 1(2):209–230.
- Green, P. J. (1995). Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711–732.
- Griffin, J. E. and Walker, S. G. (2011). Posterior simulation of normalized random measure mixtures. Journal of Computational and Graphical Statistics, 20(1):241–259.
- Ishwaran, H. and James, L. F. (2001). Gibbs sampling methods for stick-breaking priors. Journal of the American Statistical Association, 96(453):161–173.
- James, L. F. (2002). Poisson process partition calculus with applications to exchangeable models and Bayesian nonparametrics.
- James, L. F., Lijoi, A., and Prünster, I. (2009). Posterior analysis for normalized random measures with independent increments. Scandinavian Journal of Statistics, 36:76–97.
- Kalli, M. and Walker, S. G. (2006). Slice sampling for the Dirichlet process mixture model. Poster presented at the Eighth Valencia International Meeting on Bayesian Statistics.
- Kingman, J. F. C. (1967). Completely random measures. Pacific Journal of Mathematics, 21(1):59–78.

39 / 40

Page 52:

References II

- Lo, A. (1984). On a class of Bayesian nonparametric estimates: I. Density estimates. Annals of Statistics, 12(1):351–357.
- Neal, R. M. (2000). Markov chain sampling methods for Dirichlet process mixture models. Journal of Computational and Graphical Statistics, 9:249–265.
- Nieto-Barajas, L. E. and Prünster, I. (2009). A sensitivity analysis for Bayesian nonparametric density estimators. Statistica Sinica, 19:685–705.
- Nieto-Barajas, L. E., Prünster, I., and Walker, S. G. (2004). Normalized random measures driven by increasing additive processes. Annals of Statistics, 32(6):2343–2360.
- Papaspiliopoulos, O. and Roberts, G. O. (2008). Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models. Biometrika, 95(1):169–186.
- Perman, M., Pitman, J., and Yor, M. (1992). Size-biased sampling of Poisson point processes and excursions. Probability Theory and Related Fields, 92(1):21–39.
- Pitman, J. and Yor, M. (1997). The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator. Annals of Probability, 25:855–900.
- Regazzini, E., Lijoi, A., and Prünster, I. (2003). Distributional results for means of random measures with independent increments. Annals of Statistics, 31:560–585.
- Richardson, S. and Green, P. J. (1997). On Bayesian analysis of mixtures with an unknown number of components. Journal of the Royal Statistical Society B, 59(4):731–792.
- Walker, S. G. (2007). Sampling the Dirichlet mixture model with slices. Communications in Statistics – Simulation and Computation, 36:45.

40 / 40