Transcript of Bayesian Inference!!!

Page 1: Bayesian Inference!!!

Bayesian Inference!!!

Jillian Dunic and Alexa R. Wilson

Page 2: Bayesian Inference!!!

Step One: Select your model (fixed, random, mixed)

Step Two: What’s your distribution?

Step Three: What approach will you use to estimate your parameters?

• Moment based & least squares (gives an approximation)
• Likelihood
• Bayesian (likelihood and Bayes both give true estimates)

ASK: Are your true values known? Is your model relatively “complicated”?

Page 3: Bayesian Inference!!!

Frequentist vs. Bayes

Frequentist:

• Data are random
• Parameters are unknown constants
• P(data | parameters)
• No prior information
• In a vacuum

Bayesian:

• Data are fixed
• Parameters are random variables
• P(parameters | data)
• Uses prior information
• Not in a vacuum

Page 4: Bayesian Inference!!!

http://imgs.xkcd.com/comics/frequentists_vs_bayesians.png

So what is Bayes?

Pages 5–6: Bayesian Inference!!!

So what is Bayes? Bayes’ Theorem, with the likelihood and prior terms labelled:
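The theorem itself appears on the slides as an image; written out, it is the standard identity, with the slide’s labels attached to their terms:

$$\underbrace{P(\theta \mid \text{data})}_{\text{posterior}} \;=\; \frac{\overbrace{P(\text{data} \mid \theta)}^{\text{likelihood}} \; \overbrace{P(\theta)}^{\text{prior}}}{P(\text{data})}$$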

Page 7: Bayesian Inference!!!

Bayes

• Bayes is likelihood WITH prior information
• Prior + Likelihood = Posterior, i.e., (existing data) + (frequentist likelihood) = output
• Empirical Bayes: when, as in the frequentist approach, you assume s² = σ²; whether you do this depends on the sample size

Page 8: Bayesian Inference!!!

Choosing Priors

Choose well; your choice will influence your results…

• CONJUGATE: using a complementary distribution, so the posterior has the same form as the prior
• PRIOR INFORMATION: data from the literature, pilot studies, prior meta-analyses, etc.
• UNINFORMATIVE: weak, but can be used to impose constraints; good if you have no information
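As a concrete illustration of conjugacy (not from the slides): a Beta prior is complementary to a binomial likelihood, so the posterior is again a Beta and the update is just bookkeeping. The counts below are made up.

```python
from scipy import stats

# Hypothetical data: 7 successes in 10 trials.
successes, trials = 7, 10

# Beta(1, 1) is the flat (uninformative) prior on the success probability p.
a_prior, b_prior = 1, 1

# Conjugacy: Beta prior + binomial likelihood -> Beta posterior,
# updated simply by adding the observed counts.
a_post = a_prior + successes
b_post = b_prior + (trials - successes)

posterior = stats.beta(a_post, b_post)
print(f"posterior mean of p: {posterior.mean():.3f}")      # 8/12 ~ 0.667
print(f"95% credible interval: {posterior.interval(0.95)}")
```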

Page 9: Bayesian Inference!!!

Uninformative Priors

• Mean: normal distribution (-∞, ∞)
• Standard deviation: uniform distribution (0, ∞)

Page 10: Bayesian Inference!!!

Example of uninformative variance priors:

• Inverse gamma distribution
• Inverse chi-square distribution
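For reference (a standard density, not shown in the transcript), the inverse gamma prior on a variance σ² is

$$p(\sigma^2) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\,(\sigma^2)^{-(\alpha+1)}\exp\!\left(-\frac{\beta}{\sigma^2}\right), \qquad \sigma^2 > 0,$$

made weakly informative by choosing a small shape and scale, e.g. α = β = 0.001 (an illustrative choice, not necessarily the authors’); the inverse chi-square distribution is the special case α = ν/2, β = 1/2.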

Page 11: Bayesian Inference!!!

Priors and Precision

[Figure: prior, likelihood, and posterior distributions]

The influence of your prior and your likelihood on the posterior depends on their variances: the lower the variance, the greater the weight (and vice versa).
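This weighting is explicit in the conjugate normal case (a standard result, added here for illustration): with prior $\mu \sim \mathcal{N}(\mu_0, \tau_0^2)$ and $n$ observations of known variance $\sigma^2$, the posterior mean is the precision-weighted average

$$\mathbb{E}[\mu \mid y] = \frac{\mu_0/\tau_0^2 + n\bar{y}/\sigma^2}{1/\tau_0^2 + n/\sigma^2},$$

so whichever of the prior and the likelihood has the lower variance (higher precision) pulls the posterior harder toward itself.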

Page 12: Bayesian Inference!!!

So why use Bayes over likelihood?

• If using uninformative priors, results are approximately those of maximum likelihood

• Bayes can treat missing data as a parameter

• Better for tricky, less common distributions

• Complex data structures (e.g., hierarchical)

• If you want to include priors

Page 13: Bayesian Inference!!!

More Lepidoptera!


Page 20: Bayesian Inference!!!

Choosing priors

• No prior information: use uninformative priors
• Uninformative priors: means get a normal prior; standard deviations get a uniform prior

Page 21: Bayesian Inference!!!

Prior for mean µ

Page 22: Bayesian Inference!!!

Prior for variance: τ²
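The priors on pages 21–22 appear as images. Consistent with the earlier slide (normal for the mean, uniform for the standard deviation), a typical choice would look like

$$\mu \sim \mathcal{N}(0,\;10^{6}), \qquad \tau \sim \mathrm{Uniform}(0,\;100),$$

where the hyperparameters are illustrative assumptions rather than the values used in the talk (a prior on τ implies one on τ²).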

Page 23: Bayesian Inference!!!

MCMC general process

• Samples the posterior distribution you’ve generated (prior + likelihood)
• Select a starting value (e.g., 0, an educated guess at the parameter values, or moment based/least squares estimates)
• The algorithm structures the search through parameter space (trying combinations of parameters simultaneously if the model is multivariate)
• Output is a posterior probability distribution (see the sketch after this list)
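A minimal random-walk Metropolis sketch of this process (an illustration, not the sampler used in the talk); the effect sizes and within-study SD are made up:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical effect sizes from five studies (made up for illustration).
y = np.array([0.25, 0.40, 0.31, 0.28, 0.36])
sigma = 0.1  # treat the within-study SD as known, for simplicity

def log_posterior(mu):
    # Flat (uninformative) prior on mu, so the unnormalized
    # log-posterior is just the normal log-likelihood.
    return -0.5 * np.sum((y - mu) ** 2) / sigma**2

n_iter, step = 10_000, 0.05
mu = 0.0                     # starting value (an educated guess works too)
samples = np.empty(n_iter)

for i in range(n_iter):
    proposal = mu + rng.normal(0.0, step)      # propose a nearby value
    # Metropolis rule: accept with probability min(1, posterior ratio).
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal
    samples[i] = mu

burn = samples[2_000:]       # discard burn-in
print(f"posterior mean: {burn.mean():.3f}")
print(f"95% credible interval: {np.percentile(burn, [2.5, 97.5])}")
```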

Page 24: Bayesian Inference!!!

[Figure: sᵢ² = 0.02]

Page 25: Bayesian Inference!!!

[Figure: sᵢ² = 0.02 vs. sᵢ² = 0.26]

Page 26: Bayesian Inference!!!

Grand mean conclusions

• Overall mean effect size = 0.32
• The posterior probability of a positive effect size is 1, so we are almost certain the effect is positive.
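That posterior probability is simply the fraction of posterior samples above zero; continuing the hypothetical Metropolis sketch above:

```python
# Fraction of posterior samples above zero, using `burn`
# from the Metropolis sketch on page 23.
p_positive = (burn > 0).mean()
print(f"P(effect > 0 | data) = {p_positive:.3f}")
```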

Page 27: Bayesian Inference!!!

Example 2 – Missing Data!

• Want to include polyandry as a fixed effect
• BUT data are missing for 3 species

Bayes to the rescue!

Page 28: Bayesian Inference!!!

What we know

• Busseola fusca = monandrous
• Papilio machaon = polyandrous
• Eurema hecabe = polyandrous

• Monandry: < 40% multiple mates
• Polyandry: > 40% multiple mates

Page 29: Bayesian Inference!!!

So what do we do?

Let’s estimate the values for the missing percentages!

Set different, relatively uninformative priors for monandrous and polyandrous species.

Page 30: Bayesian Inference!!!

Prior for X_M (monandrous species)

Page 31: Bayesian Inference!!!

Prior for X_P (polyandrous species)
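The prior equations on pages 30–31 are images. Given the 40% cutoff defined on page 28, uniform priors on either side of the cutoff would be a natural, relatively uninformative choice; the bounds below are illustrative assumptions, not necessarily the values used in the talk:

$$X_M \sim \mathrm{Uniform}(0,\;40), \qquad X_P \sim \mathrm{Uniform}(40,\;100)$$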

Page 33: Bayesian Inference!!!

Final Notes & Re-Cap

At the end of the day, Bayes is really just another method for achieving a similar goal; the major difference is that you are using likelihood AND priors.

• REMEMBER: Bayes is a great tool in the toolbox when you are dealing with:
– Missing data
– Abnormal distributions
– Complex data structures
– Prior information that you have or want to include