

What do noisy datapoints tell us about the true signal?

C. R. Hogg

November 16, 2011


Outline

Overview

Bayesian Analysis

Gaussian Processes

Scattering Curves

Conclusions



Overview

[Figure: scattered intensity vs. scattering vector Q (1/Å)]

Ernest Rutherford (1871-1937): "If your experiment needs statistics, you ought to have done a better experiment."

(Or: use better statistics!)


Goals for the talk

Uncertainty in a single quantity: [Figure: probability distribution over a quantity of interest]

. . . and in continuous functions: [Figure: uncertainty band around a curve; sample curves share the same shape and stay inside the band]

1. Explain Bayesian analysis at a conceptual level
2. Discuss quantifying uncertainty in continuous functions


Why Bayes at NIST?

NIST's mission: To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

NIST's vision: NIST will be the world's leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

NIST's core competencies:
• Measurement science
• Rigorous traceability
• Development and use of standards

• Measurement is extremely important at NIST
• Must quantify uncertainty: "A measurement result is complete only when accompanied by a quantitative statement of its uncertainty." (NIST TN 1297)
• Which language to discuss uncertainty?
• If "probabilities": Bayesian analysis


What is Bayesian Analysis?

(Rev. Thomas Bayes, c. 1701 - 1761)

Essence of Bayes:
• 2 questions for every guess (i.e. every θ):
  1. How likely does it make the actual data? "LIKELIHOOD"
  2. How plausible is it? "PRIOR"
• Combine them to answer the main question: what is your new probability, now that you've seen the data? "POSTERIOR"
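For reference, the rule that combines these two ingredients is Bayes' theorem (standard result; the slide states it only in words):

```latex
p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{p(y)},
\qquad
p(y) \;=\; \int p(y \mid \theta)\, p(\theta)\, d\theta .
```

The numerator is likelihood times prior; the denominator only normalizes.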


Likelihood of function p(y|f)

[Figure: artificial dataset - intensity (number of counts) vs. scattering vector Q (1/Å)]

• Example: artificial dataset
• Noise model: Poisson, p(y|f) = f^y e^(-f) / y!
• Assume independent pixels
• Problem: not plausible
• (What makes a function "plausible"?)
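As a concrete illustration of this likelihood (a sketch of my own, not code from the talk; the toy curve and counts below are made up):

```python
import numpy as np
from scipy.special import gammaln

def poisson_loglik(f, y):
    """log p(y|f) for independent Poisson pixels:
    sum_i [ y_i*log(f_i) - f_i - log(y_i!) ]."""
    f = np.asarray(f, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.sum(y * np.log(f) - f - gammaln(y + 1.0))

# Toy usage: counts scattered around a smooth "true" curve
q = np.linspace(0.0, 1.0, 50)
true_curve = 20.0 + 15.0 * np.exp(-((q - 0.5) / 0.1) ** 2)
counts = np.random.default_rng(0).poisson(true_curve)

print(poisson_loglik(true_curve, counts))          # smooth, plausible curve
print(poisson_loglik(counts.clip(min=1), counts))  # "connect the dots": higher likelihood
```

The second candidate reproduces the noise exactly, so it scores a higher likelihood - but it is precisely the kind of jagged, implausible function the prior is meant to rule out.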


"Plausibility" of function p(f)

[Figure: sample curves drawn from the prior - intensity (number of counts) vs. scattering vector Q (1/Å)]

• Assume smooth and continuous
• No functional form assumed
• Naturally: unrelated to data


Posterior probability p(f|y)

[Figure: posterior sample curves overlaid on the noisy data - intensity (number of counts) vs. scattering vector Q (1/Å)]

• "Best of both worlds": (a) plausible curves, which (b) fit the data
• To represent uncertainty: show many guesses
• (Or, summarize them. . . )
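In symbols, the function-space version of Bayes' rule (standard; not written out on the slide) is

```latex
p(f \mid y) \;\propto\; p(y \mid f)\, p(f),
```

so "showing many guesses" amounts to drawing sample curves from this reweighted distribution.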


Posterior probability p(f|y): quantitative uncertainty visuals

[Figure: noisy data together with the true curve - intensity (no. of counts) vs. Q (1/Å)]


Recap: Bayesian denoising

[Figure: intensity (number of counts) vs. scattering vector Q (1/Å)]

• Prior: plausible curves
• Likelihood: curves which fit the data
• Posterior: plausible curves which fit the data


Random variables; Random functions

[Figure: a die as a random variable - each face has p = 1/6, and F is the face shown on a draw. Below: sample draws of a random function F(x)]

• Random variable F: an uncertain quantity
• calculate probabilities for its values
• take "random draws" (roll the die, flip the coin. . . )
• Random function F(x)?


How to think about "random functions"?

[Figure: sample draws of a random function F(x)]

• Function: a collection of individual values
• Every value is a random variable, with. . .
  1. variance
  2. correlation
• correlation × variance: covariance
• Gaussian Process:
  • Every point is a Random Variable
  • Any (finite) subset has a Gaussian joint distribution
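A minimal sketch of this definition in code (my own illustration; the talk does not fix a covariance function, so a squared-exponential kernel with hypothetical lengthscale ell and signal scale sigma_f is assumed): choose any finite set of inputs, build their covariance matrix, and draw jointly Gaussian values.

```python
import numpy as np

def sq_exp_cov(x, sigma_f=1.0, ell=0.3):
    """Squared-exponential covariance: strong correlation for nearby x,
    decaying over a lengthscale ell; sigma_f**2 sets the variance."""
    d = x[:, None] - x[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / ell) ** 2)

x = np.linspace(0.0, 1.0, 200)      # any finite subset of inputs
K = sq_exp_cov(x)                   # covariance of F at those inputs
rng = np.random.default_rng(1)
# Three "random draws" of the whole function: jointly Gaussian values
samples = rng.multivariate_normal(np.zeros_like(x),
                                  K + 1e-10 * np.eye(len(x)), size=3)
```

Each row of `samples` is one smooth, continuous candidate curve - the kind of draw pictured on the "plausibility" slide.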


How to read a Covariance Matrix

[Figure: covariance matrices (colour scale Cov, 0 to 2500) with sample curves; panel titles "Stripe Intensity = Feature Height" and "Stripe Width = Feature Width"]

• How to read the matrix?
  1. By individual entries
  2. As a whole (central stripe)
• Intensity: height of features
• Width: width of features
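To make the "central stripe" reading concrete (again a sketch under the squared-exponential assumption above, not the talk's own figures): the diagonal value sigma_f**2 sets the stripe's intensity, and the lengthscale ell sets how far off the diagonal the entries stay large, i.e. the stripe's width.

```python
import numpy as np

def sq_exp_cov(x, sigma_f, ell):
    d = x[:, None] - x[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / ell) ** 2)

x = np.linspace(0.0, 3.0, 7)
K_tall = sq_exp_cov(x, sigma_f=50.0, ell=0.5)   # intense stripe -> tall features
K_wide = sq_exp_cov(x, sigma_f=10.0, ell=1.5)   # wide stripe    -> broad features
print(np.round(K_tall[0]))   # large entries that die off quickly away from the diagonal
print(np.round(K_wide[0]))   # smaller entries that stay sizable far from the diagonal
```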


Example 1: Hydrocarbon combustion
(Dave Sheen and Wing Tsang, NIST, Div. 632)

Hydrocarbon burning simulations
• Need (many!) reaction rate constants
• Measured individually
• Predictions are precise, quantitative, wrong


Hydrocarbon combustion: flame speed experiments

[Figure: flame speed (cm/s) vs. fuel-to-oxygen ratio]

• Datapoints (from several experiments)
• Model: lengthscales ℓ and σ_f
• ±1σ range
• See also: individual curves
• But where did this model come from. . . ?
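For reference, a minimal sketch of how a mean curve with a ±1σ band can be obtained from such a model (my own version, assuming Gaussian observation noise and the squared-exponential kernel used above; the flame-speed numbers below are hypothetical, not the talk's data):

```python
import numpy as np

def sq_exp_cov(a, b, sigma_f, ell):
    d = a[:, None] - b[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_fit(x_obs, y_obs, x_new, sigma_f, ell, noise):
    """Standard GP regression: posterior mean and std. dev. at x_new."""
    K = sq_exp_cov(x_obs, x_obs, sigma_f, ell) + noise**2 * np.eye(len(x_obs))
    Ks = sq_exp_cov(x_new, x_obs, sigma_f, ell)
    Kss = sq_exp_cov(x_new, x_new, sigma_f, ell)
    mean = Ks @ np.linalg.solve(K, y_obs)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Hypothetical flame-speed-like data: speed (cm/s) vs. fuel-to-oxygen ratio
phi = np.array([0.6, 0.8, 1.0, 1.1, 1.2, 1.4, 1.6])
speed = np.array([12.0, 24.0, 35.0, 37.0, 36.0, 28.0, 15.0])
grid = np.linspace(0.6, 1.6, 100)
mu, sd = gp_fit(phi, speed, grid, sigma_f=15.0, ell=0.3, noise=2.0)
# mu and mu ± sd give the fitted curve and the ±1σ range
```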


Occam's razor, intro

[Figure: three candidate fits - MODEL 1: flat; MODEL 2: flat + line; MODEL 3: flat + line + wiggle]

• William of Occam, c. 1288 - c. 1348
• Gave us Occam's Razor
• (slightly paraphrased in the name of science)
• Claim: use probability, get this automatically
• (And, quantitative, too!)
• Example: 3 models. . .


Occam's razor in action

• Some models can explain more datasets
• Each model is a probability distribution:
  • Same total probability to distribute
  • Which data actually observed?
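The quantity doing the work here is the model evidence (marginal likelihood): each model spreads a fixed total probability of 1 over all datasets it could produce, so a very flexible model assigns less probability to any one dataset. In symbols (standard definitions, not shown on the slide):

```latex
p(y \mid M) \;=\; \int p(y \mid \theta, M)\, p(\theta \mid M)\, d\theta ,
\qquad
p(M \mid y) \;\propto\; p(y \mid M)\, p(M).
```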


Occam's razor and Gaussian Processes

[Figure: grid of 9 models (three lengthscales ℓ × three signal scales σ), shaded by probability of model on a scale from 0 through 1/9 to 1]

• 9 models, varying complexity
• Few datapoints (2): simple models preferred
• New data, some models can't explain
• Three tiers:
  1. Fit too poor
  2. Fit too good
  3. Just right
• Clear winner
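A sketch of how such model probabilities can be computed (my own illustration using the Gaussian-noise GP marginal likelihood; the particular ℓ and σ values and the toy data are made up, since the slide does not list them):

```python
import numpy as np

def sq_exp_cov(x, sigma_f, ell):
    d = x[:, None] - x[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / ell) ** 2)

def log_marginal_likelihood(x, y, sigma_f, ell, noise):
    """log p(y | model) for a zero-mean GP with Gaussian observation noise."""
    K = sq_exp_cov(x, sigma_f, ell) + noise**2 * np.eye(len(x))
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (y @ np.linalg.solve(K, y) + logdet + len(x) * np.log(2 * np.pi))

x = np.array([0.2, 0.5, 0.9, 1.4, 2.1, 2.8])
y = np.array([0.1, 0.8, 1.1, 0.4, -0.7, -0.2])
logml = np.array([[log_marginal_likelihood(x, y, sf, ell, noise=0.1)
                   for sf in (0.3, 1.0, 3.0)] for ell in (0.1, 0.5, 2.0)])
probs = np.exp(logml - logml.max())
probs /= probs.sum()   # equal prior over the 9 models -> posterior model probabilities
```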


Example 2: Metal Strain
(Adam Creuziger and Mark Iadicola, NIST, Div. 655)

• Testing stress/strain of steels (auto parts, etc.)
• Clamp flat plate; push upwards on middle
• Measure:
  1. Stress: X-ray diffraction
  2. Strain: Digital imaging of spray-paint pattern
• Can't paint everywhere!

(Figures courtesy of Mark Iadicola)


Metal Strain: Preliminary Results

• Spheres represent datapoints
• Continuous surface
• Uncertainty bounds ±1σ
• See also animations
• Competing model: anisotropic
• Occam's razor lets us choose!
• Δ log(ML) = +183.4
• Suggestions for experimental design
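For scale (this is the standard reading of a log-marginal-likelihood difference; the slide does not spell out the sign convention): with equal prior probability on the two models, the posterior odds are

```latex
\frac{p(M_1 \mid y)}{p(M_2 \mid y)}
\;=\; \frac{p(y \mid M_1)}{p(y \mid M_2)}
\;=\; e^{\Delta \log(\mathrm{ML})}
\;=\; e^{183.4},
```

i.e. overwhelming support for whichever model carries the larger evidence.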


Need to extend the model

[Figure: covariance matrices with sample curves; panel titles "Stripe Intensity = Feature Height" and "Stripe Width = Feature Width"]

• Recall: how to read covariance matrices "as a whole"
• Intensity: height of features
• Width: width of features
• Not flexible enough for real data!


Two extensions

[Figure: covariance matrices with sample curves; panel titles "Varying Widths" and "Multiple Contributions"]

• Two main extensions (sketched below). . .
  1. Varying feature widths: ℓ → ℓ(X)
  2. Multiple contributions:
    • Background everywhere
    • Localized "peak" regions
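Both extensions amount to modifying the covariance function. The sketch below shows one way to write them (my own illustration: Gibbs's nonstationary form for ℓ → ℓ(X), and a plain sum of a broad "background" kernel and a short-lengthscale "peak" kernel; the talk does not commit to these exact forms):

```python
import numpy as np

def sq_exp_cov(x, sigma_f, ell):
    d = x[:, None] - x[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / ell) ** 2)

def gibbs_cov(x, ell_fn, sigma_f=1.0):
    """Nonstationary squared-exponential (Gibbs) kernel: lengthscale ell(x) varies."""
    lx, ly = ell_fn(x)[:, None], ell_fn(x)[None, :]
    d2 = (x[:, None] - x[None, :]) ** 2
    return sigma_f**2 * np.sqrt(2 * lx * ly / (lx**2 + ly**2)) * np.exp(-d2 / (lx**2 + ly**2))

Q = np.linspace(0.0, 35.0, 300)
# Extension 1: feature width grows with Q (made-up ell(Q))
K_varying = gibbs_cov(Q, ell_fn=lambda q: 0.2 + 0.05 * q)
# Extension 2: smooth background everywhere plus sharp, localized "peak" structure
K_multi = sq_exp_cov(Q, sigma_f=500.0, ell=8.0) + sq_exp_cov(Q, sigma_f=100.0, ell=0.3)
```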


Simulated XRD from 2 nm Au nanoparticles

[Figure: simulated intensity I(Q) vs. Q (1/Å), full range 0-35 with a zoom on Q = 23-30]

• Core/shell structure
• Shell atoms vibrate more
• Correlated thermal motion (signature: hi-Q oscillations)
• Problem: Poisson noise swamps these oscillations!
• Changing feature widths: use ℓ(Q)


Denoising results

[Figure: zoom on Q = 23-30 1/A, intensity 110-170; curves: noisy data, true function, AWS (raw data), wavelets, and the Bayesian fit]

• AWS: jagged; loses the signal at Q = 26 1/A
• Wavelets: smooth, but still lose the signal
• Bayes: also smooth, but keeps the signal (a sketch of the GP denoiser follows below)
• Uncertainty bounds capture the true function

24/31
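For readers who want to see what sits behind the Bayesian curve, here is a compact sketch using the textbook Gaussian-process regression equations. The kernel, its hyperparameters sigma and ell, and the use of the observed counts as a plug-in noise variance are all assumptions made for illustration; the talk's model is richer (varying ℓ(Q), multiple contributions).

import numpy as np

def sq_exp_cov(a, b, sigma, ell):
    return sigma**2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell)**2)

def gp_denoise(Q, y, noise_var, sigma=20.0, ell=0.4):
    # Standard GP regression: posterior mean and a 95% band at the observed Q values.
    K = sq_exp_cov(Q, Q, sigma, ell) + np.diag(noise_var)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y - y.mean()))
    Ks = sq_exp_cov(Q, Q, sigma, ell)
    mean = y.mean() + Ks @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(sigma**2 - np.sum(v * v, axis=0), 0.0, None)
    half_width = 1.96 * np.sqrt(var)
    return mean, mean - half_width, mean + half_width

# For Poisson counts a simple plug-in noise variance is the observed count itself, e.g.:
# mean, lo, hi = gp_denoise(Q, counts.astype(float), noise_var=np.maximum(counts, 1.0))

In this sketch the band is the posterior uncertainty of the latent, noise-free curve, which is the kind of bound the last bullet above refers to.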


Residuals

[Figure: probability density of mean square residuals (MSR) for AWS, AWS (raw data), Bayesian mean, wavelets, and Bayesian draws. Left panel: residuals vs. noisy data (MSR ≈ 0.240-0.250); right panel: residuals vs. true curve (MSR ≈ 0.000-0.010, with "True curve" marked for reference)]

• Need: a global fidelity measure
• Mean square residuals (toy demonstration below). . .
  1. vs. noisy data: AWS looks best
  2. vs. true curve: Bayes is best
     • AWS's "good" score came from overfitting the noise!

25/31
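The overfitting effect in the two panels is easy to reproduce on a toy example (a sine curve with Gaussian noise, unrelated to the actual XRD data): an "estimate" that chases the noisy points wins when scored against the noisy data but loses badly when scored against the true curve.

import numpy as np

def msr(estimate, reference):
    # Mean square residual between an estimate and a reference curve.
    return np.mean((estimate - reference)**2)

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 500)
truth = np.sin(2.0 * np.pi * x)
noisy = truth + rng.normal(0.0, 0.3, x.size)

chaser = noisy.copy()                                         # reproduces every noisy point
smooth = np.convolve(noisy, np.ones(25) / 25.0, mode="same")  # crude running-mean smoother

print("vs. noisy data:", msr(chaser, noisy), msr(smooth, noisy))  # the chaser "wins" (MSR = 0)
print("vs. true curve:", msr(chaser, truth), msr(smooth, truth))  # the smoother wins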


TiO2 nanoparticles

NIST SRM 1898
(Ohno et al., J. Catalysis, 2011)

• X-ray powder diffraction from 20 nm TiO2 nanoparticles
• Motivations:
  1. Real-world example (violates our assumptions)
  2. More difficult data (contains feature-free background regions)

26/31


Comparing the fits

[Figure: fits to the TiO2 pattern over Q = 1-7 1/A (intensity 0-3000), with zoomed panels on selected regions (Q ≈ 3.2-3.4, 5.2-5.3, 5.8-5.9, 6.2-6.4, and 7.0-7.2 1/A in successive overlays)]

• All handle sharp peaks
• Every technique misses a few features: AWS, wavelets, even Bayes

27/31


Residuals

[Figure: left panel: probability density of MSR vs. a quasi-true curve for AWS, Bayesian mean, and wavelets (MSR ≈ 0.025-0.035); right panels: mean MSR (≈ 0.170-0.180) and overfitting (≈ 0.000-0.020) for Bayes, AWS, and wavelets, training on even points and on odd points]

• Bayes single-curve results are comparable to the benchmarks
• Cross-validation (checking for overfitting; sketched below): Bayes is best. . .
  1. In both categories
  2. For both training sets

28/31
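Because the experimental TiO2 pattern has no exactly known true curve, the check above splits the data instead. Here is a hedged sketch of that even/odd scheme; the fit callback and its interface are invented for illustration and stand for any of the denoisers (GP posterior mean, wavelets, AWS).

import numpy as np

def even_odd_cv(Q, y, fit):
    # Half-sample cross-validation: train on even-indexed points and score on the odd
    # ones, then swap. fit(Q_train, y_train, Q_test) is any denoiser that returns
    # predictions at Q_test.
    n = len(Q)
    even, odd = np.arange(0, n, 2), np.arange(1, n, 2)
    scores = {}
    for name, train, test in (("even", even, odd), ("odd", odd, even)):
        pred = fit(Q[train], y[train], Q[test])
        scores[name] = np.mean((pred - y[test])**2)   # held-out mean square residual
    return scores

A low held-out MSR on both splits is the pattern the slide reports for the Bayesian fit.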


Recap: Bayesian Concepts

[Figure: intensity (number of counts, 0-40) vs. scattering vector Q (1/A), 0.0-1.0]

• Bayesian analysis: using probabilities to describe uncertainty
• Choose answers with both plausibility and data fit
• A natural framework for model-selection concepts (Occam's razor)

29/31


Recap: Uncertainty in continuous functions

[Figure: zoom on Q = 23-30 1/A, intensity 110-170; curves: noisy data, Bayesian, true function, AWS (raw data), wavelets]

• Gaussian processes: can stipulate smoothness without worrying about functional form
• Open-source software package
• Very flexible: can help a variety of projects

30/31


Acknowledgements

• Team members: Igor Levin, Kate Mullen
• Collaborators:
  • Flame speed: Dave Sheen, Wing Tsang
  • Metal strain: Adam Creuziger, Mark Iadicola
• WERB readers: Victor Krayzmann, Adam Pintar
• Statistical guidance: Antonio Possolo, Blaza Toman

31/31