Outline: Stochastic modelling of dynamical systems · Bayesian inference · Particle MCMC · Summary and conclusions
Bayesian inference for partially observed Markov processes, with application to systems biology

Darren Wilkinson — http://tinyurl.com/darrenjw
School of Mathematics & Statistics, Newcastle University, UK

Bayes–250, Informatics Forum, Edinburgh, UK, 5th–7th September 2011
Darren Wilkinson — Bayes–250, Edinburgh, 5/9/2011 Bayesian inference for POMP models using pMCMC
Section contents: Systems biology models · Population dynamics · Stochastic chemical kinetics · Genetic autoregulation
Systems biology modelling
Uses accurate high-resolution time-course data on a relatively small number of bio-molecules to parametrise carefully constructed mechanistic dynamic models of a process of interest, based on current biological understanding
Traditionally, models were typically deterministic, based on a system of ODEs known as the Reaction Rate Equations (RREs)
It is now increasingly accepted that biochemical network dynamics at the single-cell level are intrinsically stochastic
The theory of stochastic chemical kinetics provides a solid foundation for describing network dynamics using a Markov jump process
Stochastic Chemical Kinetics
Stochastic molecular approach:
Statistical mechanical arguments lead to a Markov jump process in continuous time whose instantaneous reaction rates are directly proportional to the number of molecules of each reacting species
Such dynamics can be simulated (exactly) on a computer using standard discrete-event simulation techniques
The standard implementation of this strategy is known as the “Gillespie algorithm” (just discrete-event simulation), but there are several exact and approximate variants of this basic approach
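The direct method amounts to two draws per event: an exponential waiting time with rate equal to the total hazard, then a choice of which reaction fires with probability proportional to its hazard. A minimal Python sketch (the function names and the immigration-death example are mine, purely for illustration — the talk itself uses R):

```python
import random

def gillespie(x0, S, hazards, c, t_max, seed=42):
    """Direct-method SSA. S[j][i] is the net change to species j when
    reaction i fires; hazards(x, c) returns the current reaction rates."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    path = [(t, tuple(x))]
    while True:
        h = hazards(x, c)
        h0 = sum(h)
        if h0 <= 0.0:                  # no reaction can fire: process stuck
            break
        t += rng.expovariate(h0)       # exponential time to next event
        if t > t_max:
            break
        u, i, acc = rng.random() * h0, 0, h[0]
        while acc < u:                 # choose reaction i w.p. h[i]/h0
            i += 1
            acc += h[i]
        for j in range(len(x)):        # apply the net effect of reaction i
            x[j] += S[j][i]
        path.append((t, tuple(x)))
    return path

# Illustrative immigration-death process: ∅ → X (rate c1), X → ∅ (rate c2·x)
S = [[1, -1]]
path = gillespie([0], S, lambda x, c: [c[0], c[1] * x[0]], (10.0, 0.1), 50.0)
```

The same driver works for any reaction network once `S` and the hazard function are supplied.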
Lotka-Volterra system
A trivial (familiar) example from population dynamics (in reality, the “reactions” will be elementary biochemical reactions taking place inside a cell)
Reactions
X −→ 2X (prey reproduction)
X + Y −→ 2Y (prey-predator interaction)
Y −→ ∅ (predator death)
X – Prey, Y – Predator
We can re-write this using matrix notation
Forming the matrix representation
The L-V system in tabular form:

         Rate law h(·, c)    LHS (X, Y)    RHS (X, Y)    Net effect (X, Y)
  R1     c1 x                (1, 0)        (2, 0)        (+1,  0)
  R2     c2 x y              (1, 1)        (0, 2)        (−1, +1)
  R3     c3 y                (0, 1)        (0, 0)        ( 0, −1)

Call the 3 × 2 net-effect (or reaction) matrix N. The matrix S = N′ is the stoichiometry matrix of the system. Typically both are sparse. The SVD of S (or N) is of interest for structural analysis of the system dynamics...
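The table translates directly into the net-effect matrix; a small illustrative Python sketch (variable names are mine):

```python
# Net-effect matrix N (one row per reaction, one column per species),
# read straight off the table: rows R1-R3, columns X, Y.
N = [[ 1,  0],   # R1: prey reproduction
     [-1,  1],   # R2: prey-predator interaction
     [ 0, -1]]   # R3: predator death

# The stoichiometry matrix is the transpose, S = N'.
S = [list(col) for col in zip(*N)]
```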
Stochastic chemical kinetics
u species: X1, …, Xu, and v reactions: R1, …, Rv
Ri : pi1 X1 + · · · + piu Xu −→ qi1 X1 + · · · + qiu Xu,  i = 1, …, v
In matrix form: P X −→ Q X (P and Q are sparse)
S = (Q − P)′ is the stoichiometry matrix of the system
Xjt: the number of molecules of Xj at time t; Xt = (X1t, …, Xut)′
Reaction Ri has hazard (or rate law, or propensity) hi(Xt, ci), where ci is a rate parameter, c = (c1, …, cv)′, h(Xt, c) = (h1(Xt, c1), …, hv(Xt, cv))′, and the system evolves as a Markov jump process
For mass-action stochastic kinetics,

hi(Xt, ci) = ci ∏j=1..u ( Xjt choose pij ),  i = 1, …, v
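The binomial-coefficient form of the mass-action hazard can be evaluated directly from the reactant matrix P; an illustrative Python sketch (function names are mine):

```python
from math import comb, prod

def mass_action_hazards(x, c, P):
    """h_i(x, c_i) = c_i * prod_j C(x_j, p_ij), where P[i][j] is the number
    of molecules of species j consumed by reaction i."""
    u = len(x)
    return [c[i] * prod(comb(x[j], P[i][j]) for j in range(u))
            for i in range(len(c))]

# Lotka-Volterra: P has rows (1,0), (1,1), (0,1) for R1, R2, R3
P = [[1, 0], [1, 1], [0, 1]]
h = mass_action_hazards((3, 2), (1.0, 0.005, 0.6), P)   # ≈ [3.0, 0.03, 1.2]
```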
The Lotka-Volterra model
[Figure: simulated Lotka-Volterra dynamics: time-series and phase-plane plots of Y1 (prey) and Y2 (predator).]
Example — genetic auto-regulation
[Figure: schematic of the genetic auto-regulation network, showing the gene g, RNAP, mRNA r, protein P and dimer P2.]
Biochemical reactions
Simplified view:
Reactions
g + P2 ←→ g·P2 (repression)
g −→ g + r (transcription)
r −→ r + P (translation)
2P ←→ P2 (dimerisation)
r −→ ∅ (mRNA degradation)
P −→ ∅ (protein degradation)
Simulated realisation of the auto-regulatory network
[Figure: simulated realisation of the auto-regulatory network: time courses of Rna, P and P2 over 5000 time units.]
Section contents: Partially observed Markov process (POMP) models · Bayesian inference · Likelihood-free algorithms for stochastic model calibration
Partially observed Markov process (POMP) models
Continuous-time Markov process: X = {Xs | s ≥ 0} (for now, we suppress dependence on parameters, θ)
Think about integer-time observations (extension to arbitrary times is trivial): for t ∈ N, let Xt = {Xs | t − 1 < s ≤ t} denote the sample path over (t − 1, t]
Sample-path likelihoods such as π(xt|xt−1) can often (but not always) be computed (though they are often computationally difficult), but the corresponding discrete-time transition densities of the end-points alone are typically intractable
Partial observations: D = {dt | t = 1, 2, …, T}, where

dt | Xt = xt ∼ π(dt|xt),  t = 1, …, T,

and we assume that π(dt|xt) can be evaluated directly (simple measurement error model)
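A Gaussian measurement-error model of this kind is trivial to both simulate from and evaluate; a small illustrative Python sketch (the function names, and σ = 10 to echo the example later in the talk, are my choices):

```python
import math, random

SIGMA = 10.0   # measurement error s.d. (illustrative)

def observe(x_t, rng):
    """Simulate d_t | x_t ~ N(x_t, SIGMA^2), componentwise."""
    return [xi + rng.gauss(0.0, SIGMA) for xi in x_t]

def log_obs_density(d_t, x_t):
    """Evaluate log pi(d_t | x_t) -- the one density assumed tractable."""
    return sum(-0.5 * math.log(2 * math.pi * SIGMA ** 2)
               - (d - x) ** 2 / (2 * SIGMA ** 2)
               for d, x in zip(d_t, x_t))

rng = random.Random(4)
d = observe([100.0, 200.0], rng)
```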
Bayesian inference for POMP models
Most “obvious” MCMC algorithms will attempt to impute (at least) the skeleton of the Markov process: X0, X1, …, XT
This will typically require evaluation of the intractable discrete-time transition likelihoods, and this is the problem... Two related strategies:
Data augmentation: “fill in” the entire process in some way, typically exploiting the fact that the sample-path likelihoods are tractable — works in principle, but difficult to “automate”, and exceptionally computationally intensive due to the need to store and evaluate likelihoods of continuous-time sample paths
Likelihood-free (AKA plug-and-play): exploits the fact that it is possible to forward simulate from π(xt|xt−1) (typically by forward simulating the entire sample path), even if the transition density itself can’t be evaluated
Likelihood-free is really just a special kind of augmentationstrategy
Bayesian inference
Let π(x|c) denote the (complex) likelihood of the simulation model
Let π(D|x, τ) denote the (simple) measurement error model
Put θ = (c, τ), and let π(θ) be the prior for the model parameters
The joint density can be written
π(θ, x,D) = π(θ)π(x|θ)π(D|x, θ).
Interest is in the posterior distribution π(θ, x|D)
Marginal MH MCMC scheme
Full model: π(θ, x,D) = π(θ)π(x|θ)π(D|x, θ)
Target: π(θ|D) (with x marginalised out)
Generic MCMC scheme:
Propose θ⋆ ∼ f(θ⋆|θ), and accept with probability min{1, A}, where

A = [π(θ⋆)/π(θ)] × [f(θ|θ⋆)/f(θ⋆|θ)] × [π(D|θ⋆)/π(D|θ)]
π(D|θ) is the “marginal likelihood” (or “observed datalikelihood”, or...)
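When π(D|θ) happens to be tractable, the scheme above is only a few lines of code; a generic Python sketch (all names are mine, the standard-normal target is purely illustrative, and a symmetric proposal is assumed so the f(θ|θ⋆)/f(θ⋆|θ) term cancels):

```python
import math, random

def marginal_mh(log_post, propose, theta0, n_iters, seed=0):
    """Generic MH targeting pi(theta|D) ∝ pi(theta) pi(D|theta);
    log_post evaluates the unnormalised log posterior."""
    rng = random.Random(seed)
    theta, lp = theta0, log_post(theta0)
    chain = [theta]
    for _ in range(n_iters):
        theta_star = propose(theta, rng)
        lp_star = log_post(theta_star)
        if math.log(rng.random()) < lp_star - lp:   # min{1, A} on log scale
            theta, lp = theta_star, lp_star
        chain.append(theta)
    return chain

# Illustrative target: standard normal posterior, random-walk proposal
chain = marginal_mh(lambda th: -0.5 * th * th,
                    lambda th, rng: th + rng.gauss(0.0, 1.0), 0.0, 20000)
```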
LF-MCMC
Posterior distribution π(θ, x|D)
Propose a joint update for θ and x as follows:
Current state of the chain is (θ, x)
First sample θ⋆ ∼ f(θ⋆|θ)
Then sample a new path, x⋆ ∼ π(x⋆|θ⋆)
Accept the pair (θ⋆, x⋆) with probability min{1, A}, where

A = [π(θ⋆)/π(θ)] × [f(θ|θ⋆)/f(θ⋆|θ)] × [π(D|x⋆, θ⋆)/π(D|x, θ)]

Note that choosing a prior independence proposal of the form f(θ⋆|θ) = π(θ⋆) leads to the simpler acceptance ratio

A = π(D|x⋆, θ⋆)/π(D|x, θ)
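With the prior independence proposal, one LF-MCMC sweep needs only a forward simulator and the measurement density; a hedged Python sketch (the function names are placeholders, not the talk's notation):

```python
import math

def lfmcmc_step(theta, x, log_lik_obs, sample_prior, simulate_path, rng):
    """One LF-MCMC update with f(theta*|theta) = pi(theta*):
    accept (theta*, x*) w.p. min{1, pi(D|x*,theta*) / pi(D|x,theta)}."""
    theta_star = sample_prior(rng)           # theta* ~ pi(theta)
    x_star = simulate_path(theta_star, rng)  # x* ~ pi(x|theta*): forward sim
    log_A = log_lik_obs(x_star, theta_star) - log_lik_obs(x, theta)
    if math.log(rng.random()) < log_A:
        return theta_star, x_star            # accept
    return theta, x                          # reject

# Degenerate illustration: a flat observation likelihood always accepts
import random
rng = random.Random(0)
state = lfmcmc_step(1.0, "old", lambda x, th: 0.0,
                    lambda r: 5.0, lambda th, r: "new", rng)
```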
“Ideal” joint MCMC scheme
LF-MCMC works by making the proposed sample path consistent with the proposed new parameters, but unfortunately not with the data
Ideally, we would do the joint update as follows
First sample θ⋆ ∼ f(θ⋆|θ)
Then sample a new path, x⋆ ∼ π(x⋆|θ⋆, D)
Accept the pair (θ⋆, x⋆) with probability min{1, A}, where

A = [π(θ⋆)/π(θ)] × [π(x⋆|θ⋆)/π(x|θ)] × [f(θ|θ⋆)/f(θ⋆|θ)] × [π(D|x⋆, θ⋆)/π(D|x, θ)] × [π(x|D, θ)/π(x⋆|D, θ⋆)]
  = [π(θ⋆)/π(θ)] × [π(D|θ⋆)/π(D|θ)] × [f(θ|θ⋆)/f(θ⋆|θ)]

This joint scheme reduces down to the marginal scheme (Chib, 1995), but will be intractable for complex models...
Section contents: Particle MCMC and the PMMH algorithm · PMMH · Example: Lotka-Volterra model · Improved filtering for SDEs
Particle MCMC (pMCMC)
Of the various alternatives, pMCMC is the only obvious practical option for constructing global likelihood-free MCMC algorithms which are exact (Andrieu et al., 2010)
Start by considering a basic marginal MH MCMC scheme with target π(θ|D) and proposal f(θ⋆|θ) — the acceptance probability is min{1, A} where

A = [π(θ⋆)/π(θ)] × [f(θ|θ⋆)/f(θ⋆|θ)] × [π(D|θ⋆)/π(D|θ)]

We can’t evaluate the final terms, but if we had a way to construct a Monte Carlo estimate π̂(D|θ) of the likelihood π(D|θ), we could just plug this in and hope for the best:

A = [π(θ⋆)/π(θ)] × [f(θ|θ⋆)/f(θ⋆|θ)] × [π̂(D|θ⋆)/π̂(D|θ)]
“Exact approximate” MCMC (the pseudo-marginal approach)
Remarkably, provided only that E[π̂(D|θ)] = π(D|θ), the stationary distribution of the Markov chain will be exactly correct (Beaumont, 2003; Andrieu & Roberts, 2009)
Putting W = π̂(D|θ)/π(D|θ) and augmenting the state space of the chain to include W, we find that the target of the chain must be

∝ π(θ)π̂(D|θ)π(w|θ) ∝ π(θ|D) w π(w|θ)

and so the above “unbiasedness” property implies that E(W|θ) = 1, which guarantees that the marginal for θ is exactly π(θ|D)
Blog post: http://tinyurl.com/6ex4xqw
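The claim is easy to check numerically: replace an exactly known likelihood by a noisy but unbiased estimate and the posterior is unchanged. A toy Python experiment (entirely my construction: a conjugate Gaussian model with prior N(0, 1) and likelihood N(d; θ, 1), whose exact posterior is N(d/2, 1/2)):

```python
import math, random

def pseudo_marginal_chain(d, n_iters, seed=1):
    """MH on pi(theta|d), but using the noisy unbiased likelihood estimate
    L_hat = L * V with V ~ U(0.5, 1.5), so E[L_hat] = L."""
    rng = random.Random(seed)

    def log_lik_hat(theta):
        log_lik = -0.5 * (d - theta) ** 2              # up to a constant
        return log_lik + math.log(rng.uniform(0.5, 1.5))

    theta, llh = 0.0, log_lik_hat(0.0)
    chain = []
    for _ in range(n_iters):
        theta_star = theta + rng.gauss(0.0, 1.0)       # random-walk proposal
        llh_star = log_lik_hat(theta_star)
        log_prior_diff = -0.5 * theta_star ** 2 + 0.5 * theta ** 2
        if math.log(rng.random()) < log_prior_diff + llh_star - llh:
            theta, llh = theta_star, llh_star  # keep L_hat paired with theta!
        chain.append(theta)
    return chain

chain = pseudo_marginal_chain(d=2.0, n_iters=50000)
# the exact posterior is N(1, 1/2), so the chain mean should be close to 1
```

The crucial implementation detail is that the estimate at the current state is stored and reused, never refreshed, which is what makes the augmented chain valid.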
Particle marginal Metropolis-Hastings (PMMH)
Likelihood estimates constructed via importance sampling typically have this “unbiasedness” property, as do estimates constructed using a particle filter
If a particle filter is used to construct the Monte Carlo estimate of likelihood to plug in to the acceptance probability, we get (a simple version of) the particle marginal Metropolis-Hastings (PMMH) pMCMC algorithm
The full PMMH algorithm also uses the particle filter to construct a proposal for x, and has target π(θ, x|D) — not just π(θ|D)
The (bootstrap) particle filter relies only on the ability to forward simulate from the process, and hence the entire procedure is “likelihood-free”
Blog post: http://bit.ly/kvznmq
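The key computational object is the particle-filter estimate of π(D|θ). A bootstrap filter for a toy linear-Gaussian state-space model (my illustrative choice, since for that model the exact answer is available from the Kalman filter, making the estimate easy to sanity-check):

```python
import bisect, itertools, math, random

def bootstrap_pf_loglik(data, n_part, a, q, r, seed=0):
    """Bootstrap PF estimate of log pi(D|theta) for the toy model
    x_t = a x_{t-1} + N(0, q),  d_t = x_t + N(0, r),  x_0 ~ N(0, 1)."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n_part)]
    ll = 0.0
    for d in data:
        xs = [a * x + rng.gauss(0.0, math.sqrt(q)) for x in xs]  # propagate
        ws = [math.exp(-0.5 * (d - x) ** 2 / r) for x in xs]     # weight
        ll += math.log(sum(ws) / n_part) - 0.5 * math.log(2 * math.pi * r)
        cum = list(itertools.accumulate(ws))                     # resample
        xs = [xs[bisect.bisect(cum, rng.random() * cum[-1])] for _ in xs]
    return ll

def kalman_loglik(data, a, q, r):
    """Exact log-likelihood for the same model, for sanity-checking."""
    m, P, ll = 0.0, 1.0, 0.0
    for d in data:
        m, P = a * m, a * a * P + q                  # predict
        Sv = P + r                                   # innovation variance
        ll += -0.5 * (math.log(2 * math.pi * Sv) + (d - m) ** 2 / Sv)
        K = P / Sv                                   # Kalman gain
        m, P = m + K * (d - m), (1.0 - K) * P        # update
    return ll

data = [0.3, -0.5, 1.1, 0.2, -0.8]
est = bootstrap_pf_loglik(data, 3000, a=0.9, q=0.5, r=1.0)
exact = kalman_loglik(data, a=0.9, q=0.5, r=1.0)
```

For a POMP model the propagation step would instead forward simulate the Markov process (e.g. by Gillespie simulation), which is what makes the estimate likelihood-free.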
Test problem: Lotka-Volterra model
[Figure: simulated Lotka-Volterra data: noisy time series of x1 and x2.]

Simulated time series data set consisting of 16 equally spaced observations subject to Gaussian measurement error with a standard deviation of 10.
Marginal posteriors for the Lotka-Volterra model
[Figure: trace plots, autocorrelation functions and marginal posterior densities for θ1, θ2 and θ3.]

Note that the true parameters, θ = (1, 0.005, 0.6), are well identified by the data
Marginal posteriors observing only prey
[Figure: trace plots, autocorrelation functions and marginal posterior densities for θ1, θ2 and θ3, observing only prey.]

Note that the mixing of the MCMC sampler is reasonable, and that the true parameters, θ = (1, 0.005, 0.6), are quite well identified by the data
Marginal posteriors for unknown measurement error
[Figure: trace plots, autocorrelation functions and marginal posterior densities for θ1, θ2, θ3 and the measurement error standard deviation.]

Note that the mixing of the MCMC sampler is poor, and that the true parameters, θ = (1, 0.005, 0.6, 10), are less well identified by the data than in the case of a fully specified measurement error model
R package: smfsb
Free, open source, well-documented software package for R, smfsb, associated with the forthcoming second edition of “Stochastic Modelling for Systems Biology”
Code for stochastic simulation of (biochemical) reaction networks (Markov jump processes and chemical Langevin), and pMCMC-based Bayesian inference for POMP models
Full installation and “getting started” instructions at http://tinyurl.com/smfsb2e
Once the package is installed and loaded, running demo("PMCMC") at the R prompt will run a PMMH algorithm for the Lotka-Volterra model discussed here
Hitting the data...
The above algorithm works well in many cases, and is extremely general (works for any Markov process)
In the case of no measurement error, the probability of hitting the data (and accepting the proposal) is very small (possibly zero), and so the mixing of the MCMC scheme is very poor
The ABC (approximate Bayesian computation) strategy is to accept if

‖x⋆t+1 − dt+1‖ < ε,

but this forces a trade-off between accuracy and efficiency which can be unpleasant (cf. noisy ABC)
The same problem arises in the case of low measurement error
Particularly problematic in the context of high-dimensional data
Would like a strategy which copes better in this case
The chemical Langevin equation (CLE)
The CLE is a diffusion approximation to the true Markov jump process
Start with the time-change representation

Xt − X0 = S N( ∫0^t h(Xτ, c) dτ ),

where N(·) is a vector of independent unit Poisson processes, and approximate Ni(t) ≈ t + Wi(t), where Wi(t) is an independent Wiener process for each i
Substituting in and using a little stochastic calculus gives:

The CLE as an Itô SDE:

dXt = S h(Xt, c) dt + √( S diag{h(Xt, c)} S′ ) dWt
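The CLE is typically simulated by Euler-Maruyama discretisation; rather than forming the matrix square root, one can use the distributionally equivalent driving form dXt = S h dt + S diag{√h} dBt, with one independent Brownian motion per reaction (the two forms share the diffusion matrix S diag{h} S′). An illustrative Python step for the Lotka-Volterra CLE (parameter values and the crude reflection at zero are my choices):

```python
import math, random

def cle_step(x, c, S, dt, rng):
    """One Euler-Maruyama step of the Lotka-Volterra CLE, driven by one
    Brownian increment per reaction: dX = S h dt + S diag{sqrt(h)} dB."""
    h = [c[0] * x[0], c[1] * x[0] * x[1], c[2] * x[1]]   # L-V hazards
    dB = [rng.gauss(0.0, math.sqrt(dt)) for _ in h]
    new = [x[j] + sum(S[j][i] * (h[i] * dt
                                 + math.sqrt(max(h[i], 0.0)) * dB[i])
                      for i in range(len(h)))
           for j in range(len(x))]
    return [max(v, 0.0) for v in new]   # crude reflection at zero

S = [[1, -1, 0], [0, 1, -1]]   # stoichiometry of the L-V system
rng = random.Random(3)
x = [100.0, 100.0]
for _ in range(1000):
    x = cle_step(x, (1.0, 0.005, 0.6), S, 0.01, rng)
```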
Improved particle filters for SDEs
The “bootstrap” particle filter uses blind forward simulation from the model
If we are able to evaluate the “likelihood” of sample paths, we can use other proposals
The particle filter weights then depend on the Radon-Nikodym derivative of the law of the proposed path with respect to the true conditioned process
For SDEs, the weight will degenerate unless the proposed process is absolutely continuous with respect to the true conditioned process
Ideally we would like to sample from π(x⋆t+1|c⋆, x⋆t, dt+1), but this is not tractable for nonlinear SDEs such as the CLE
Modified diffusion bridge (MDB)
Need a tractable process q(x⋆t+1|c⋆, x⋆t, dt+1) that is locally equivalent to π(x⋆t+1|c⋆, x⋆t, dt+1)
Diffusion: dXt = μ(Xt) dt + β(Xt)^{1/2} dWt
The nonlinear diffusion bridge

dXt = [(x1 − Xt)/(1 − t)] dt + β(Xt)^{1/2} dWt

hits x1 at t = 1, yet is locally equivalent to the true diffusion as it has the same diffusion coefficient
This forms the basis of an efficient proposal; see Durham & Gallant (2002), Chib, Pitt & Shephard (2004), Delyon & Hu (2006), and Stramer & Yan (2007) for technical details
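Simulating the bridge itself is straightforward with an Euler scheme; a one-dimensional illustrative Python sketch (the choice of β and the end-points are arbitrary, and all names are mine):

```python
import math, random

def mdb_path(x0, x1, beta, n_steps, seed=7):
    """Euler discretisation of dX = (x1 - X)/(1 - t) dt + beta(X)^{1/2} dW
    on [0, 1]; the drift pulls the path onto x1 as t -> 1."""
    rng = random.Random(seed)
    dt = 1.0 / n_steps
    x, path = x0, [x0]
    for k in range(n_steps - 1):
        t = k * dt
        drift = (x1 - x) / (1.0 - t)
        x += drift * dt + math.sqrt(max(beta(x), 0.0) * dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    path.append(x1)   # the final Euler step lands exactly on the end-point
    return path

path = mdb_path(0.0, 5.0, lambda x: 1.0, 100)
```

In a particle filter the importance weight for such a proposal is the ratio of the Euler transition densities of the target and bridge processes along the path.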
Section contents: Summary · References
Summary
POMP models form a large, important and interesting class of models, with many applications
It is possible, and often desirable, to develop inferential algorithms which are “likelihood-free” or “plug-and-play”, as this allows the separation of the modelling from the inferential algorithm, allowing more rapid model exploration
Many likelihood-free approaches are possible, including sequential LF-MCMC, PMMH (pMCMC), and (sequential) ABC for Bayesian inference, and iterative filtering for maximum likelihood estimation
Much work needs to be done to properly understand the strengths and weaknesses of these competing approaches
Golightly, A. and D. J. Wilkinson (2008) Bayesian inference for nonlinear multivariate diffusion models observed with error. Computational Statistics and Data Analysis, 52(3), 1674–1693.
Golightly, A. and D. J. Wilkinson (2011) Bayesian parameter inference for stochastic biochemical network models using particle MCMC. In submission.
Wilkinson, D. J. (2009) Stochastic modelling for quantitative description of heterogeneous biological systems. Nature Reviews Genetics, 10(2), 122–133.
Wilkinson, D. J. (2010) Parameter inference for stochastic kinetic models of bacterial gene regulation: a Bayesian approach to systems biology (with discussion), in J.-M. Bernardo et al. (eds) Bayesian Statistics 9, OUP, in press.
Wilkinson, D. J. (2011) Stochastic Modelling for Systems Biology, second edition. Chapman & Hall/CRC Press, in press.
Contact details...
email: [email protected]
www: tinyurl.com/darrenjw