CS348B Lecture 9 Pat Hanrahan, Spring 2005

Overview

Earlier lecture: Statistical sampling and Monte Carlo integration

Last lecture: Signal processing view of sampling

Today: Variance reduction; Importance sampling; Stratified sampling; Multidimensional sampling patterns; Discrepancy and Quasi-Monte Carlo

Later: Path tracing for interreflection; Density estimation


Cameras

Pixel response, integrated over shutter time, aperture/pixel area, and directions:

$$R = \int_T \int_A \int_\Omega L(x, \omega, t)\, P(x)\, S(t)\, \cos\theta \; d\omega \, dA \, dt$$

[Figures: depth of field (source: Cook, Porter, Carpenter, 1984) and motion blur (source: Mitchell, 1991).]


Variance

[Figures: 1 shadow ray per eye ray vs. 16 shadow rays per eye ray.]


Variance

Definition:

$$V[Y] = E\bigl[(Y - E[Y])^2\bigr] = E[Y^2] - E[Y]^2$$

Variance decreases with sample size (for independent, identically distributed $Y_i$):

$$V\!\left[\frac{1}{N}\sum_{i=1}^{N} Y_i\right] = \frac{1}{N^2}\sum_{i=1}^{N} V[Y_i] = \frac{1}{N^2}\,N\,V[Y] = \frac{1}{N}\,V[Y]$$
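As a quick numerical check of the $1/N$ falloff (an illustration, not from the slides), a minimal NumPy sketch measuring the variance of the $N$-sample mean of uniform random numbers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Variance of the mean of N uniform samples; theory: V[Y]/N = (1/12)/N.
for N in [10, 100, 1000]:
    means = rng.random((10_000, N)).mean(axis=1)   # 10,000 independent N-sample means
    print(f"N={N:4d}  empirical {means.var():.6f}  theory {1/12/N:.6f}")
```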


Variance Reduction

Efficiency measure:

$$\text{Efficiency} = \frac{1}{\text{Variance} \times \text{Cost}}$$

If one technique has twice the variance of another, it takes twice as many samples to achieve the same variance. If one technique has twice the cost of another with the same variance, it takes twice as much time to achieve the same variance.

Techniques to increase efficiency:

Importance sampling

Stratified sampling


Biasing

Previously used a uniform probability distribution

Can use another probability distribution

But must change the estimator

$$X_i \sim p(x), \qquad Y_i = \frac{f(X_i)}{p(X_i)}$$


Unbiased Estimate

Probability: $X_i \sim p(x)$

Estimator: $Y_i = \dfrac{f(X_i)}{p(X_i)}$

The estimate is unbiased:

$$E[Y_i] = E\!\left[\frac{f(X_i)}{p(X_i)}\right] = \int \frac{f(x)}{p(x)}\, p(x)\, dx = \int f(x)\, dx = I$$
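To make the estimator concrete, here is a minimal Python sketch (the test integrand $3x^2$ and the uniform pdf are my own choices, not from the slides):

```python
import random

def mc_estimate(f, sample_p, pdf_p, n):
    """Unbiased estimate of the integral of f: average f(X_i)/p(X_i) with X_i ~ p."""
    total = 0.0
    for _ in range(n):
        x = sample_p()                # X_i ~ p(x)
        total += f(x) / pdf_p(x)      # Y_i = f(X_i) / p(X_i)
    return total / n

# I = \int_0^1 3x^2 dx = 1, sampled with the uniform pdf p(x) = 1 on [0,1].
random.seed(1)
print(mc_estimate(lambda x: 3.0 * x * x, random.random, lambda x: 1.0, 100_000))
```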


Importance Sampling

Choose the density proportional to the integrand, writing $E[f]$ for $\int f(x)\,dx$:

$$p(x) = \frac{f(x)}{E[f]}$$

This is a valid density, since

$$\int p(x)\, dx = \frac{1}{E[f]} \int f(x)\, dx = 1$$

Sample according to f


Importance Sampling

Variance: with $p(x) = f(x)/E[f]$, each sample is the constant $f(X_i)/p(X_i) = E[f]$, so

$$E[Y^2] = \int \left(\frac{f(x)}{p(x)}\right)^{2} p(x)\, dx = \int \frac{f(x)^2}{p(x)}\, dx = \int f(x)^2\, \frac{E[f]}{f(x)}\, dx = E[f]\int f(x)\, dx = E[f]^2$$

$$V[Y] = E[Y^2] - E[Y]^2 = E[f]^2 - E[f]^2 = 0$$

Sample according to f: zero variance! (The catch: normalizing $p$ requires knowing $E[f] = \int f\,dx$, the very integral being computed.)


Example

$$I = \int_0^4 x \, dx = 8$$

method       sampling function   variance    samples needed for standard error of 0.008
importance   (6-x)/16            56.8 N^-1   887,500
importance   1/4                 21.3 N^-1   332,812
importance   (x+2)/16            6.4 N^-1    98,432
importance   x/8                 0           1
stratified   1/4                 21.3 N^-3   70

(Peter Shirley, Realistic Ray Tracing)
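The following sketch reproduces the flavor of two rows of this table. The inverse CDF for $p(x) = (x+2)/16$ follows from its CDF $P(x) = (x^2 + 4x)/32$; the measured single-sample variances should land near the 21.3 and 6.4 entries above:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Target: I = \int_0^4 x dx = 8.
f = lambda x: x

# Uniform pdf p(x) = 1/4 on [0,4]: inverse CDF is x = 4u.
x = 4.0 * rng.random(N)
y_uniform = f(x) / 0.25

# Importance pdf p(x) = (x+2)/16 on [0,4]:
# CDF P(x) = (x^2 + 4x)/32, inverse x = -2 + 2*sqrt(1 + 8u).
u = rng.random(N)
x = -2.0 + 2.0 * np.sqrt(1.0 + 8.0 * u)
y_importance = f(x) / ((x + 2.0) / 16.0)

print(f"uniform:    mean {y_uniform.mean():.3f}  variance {y_uniform.var():.2f}")
print(f"importance: mean {y_importance.mean():.3f}  variance {y_importance.var():.2f}")
```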


Examples

[Figures: sampling the light by projected solid angle vs. by area; each rendered with 4 eye rays per pixel and 100 shadow rays.]


Irradiance

Generate a cosine-weighted distribution:

$$E = \int_{H^2} L(\omega_i)\, \cos\theta_i \, d\omega_i, \qquad p(\omega)\, d\omega \propto \cos\theta \, d\omega$$


Cosine Weighted Distribution

Integrate the cosine over the hemisphere:

$$\int_{H^2} \cos\theta \, d\omega = \int_0^{2\pi}\!\!\int_0^{\pi/2} \cos\theta \, \sin\theta \, d\theta \, d\phi = \pi$$

So the normalized density separates into independent factors:

$$p(\theta,\phi) = \frac{\cos\theta\,\sin\theta}{\pi} = p(\theta)\,p(\phi), \qquad p(\phi) = \frac{1}{2\pi}, \quad p(\theta) = 2\cos\theta\,\sin\theta$$

Invert the marginal CDFs:

$$P(\phi_0) = \int_0^{\phi_0} \frac{d\phi}{2\pi} = \frac{\phi_0}{2\pi} \;\Rightarrow\; \phi = 2\pi U_1$$

$$P(\theta_0) = \int_0^{\theta_0} 2\cos\theta\,\sin\theta \, d\theta = \sin^2\theta_0 \;\Rightarrow\; \sin^2\theta = U_2 \;\Rightarrow\; \theta = \arcsin\sqrt{U_2}$$
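In code, a minimal sketch of this sampler (my own illustration, directions about the +z axis):

```python
import math
import random

def cosine_weighted_direction(u1, u2):
    """Map two uniform [0,1) numbers to a cosine-weighted hemisphere direction:
    phi = 2*pi*u1, theta = arcsin(sqrt(u2))."""
    phi = 2.0 * math.pi * u1
    theta = math.asin(math.sqrt(u2))
    sin_t = math.sin(theta)
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), math.cos(theta))

random.seed(0)
samples = [cosine_weighted_direction(random.random(), random.random())
           for _ in range(100_000)]
# Sanity check: E[cos(theta)] under p = cos(theta)/pi is 2/3.
print(sum(z for _, _, z in samples) / len(samples))
```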


Sampling a Circle

$$\theta = 2\pi U_1, \qquad r = \sqrt{U_2}$$

Equi-areal: taking $r = \sqrt{U_2}$ compensates for the area element $r\,dr\,d\theta$ growing linearly with radius.
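In code (a direct transcription of the two formulas):

```python
import math
import random

def sample_disk(u1, u2):
    """Equi-areal unit-disk sample: theta = 2*pi*u1, r = sqrt(u2)."""
    theta = 2.0 * math.pi * u1
    r = math.sqrt(u2)
    return (r * math.cos(theta), r * math.sin(theta))

random.seed(0)
print(sample_disk(random.random(), random.random()))
```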


Shirley’s Mapping

A low-distortion, equi-areal map from the unit square to the disk. In the wedge where the first coordinate dominates:

$$r = U_1, \qquad \theta = \frac{\pi}{4} \frac{U_2}{U_1}$$
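A sketch of the full mapping in the style of Shirley and Chiu's concentric map; the case analysis over the four wedges is my reconstruction, not taken from the slide:

```python
import math

def concentric_disk(u1, u2):
    """Map [0,1]^2 to the unit disk with low distortion (Shirley-Chiu style)."""
    a = 2.0 * u1 - 1.0            # recenter to [-1,1]^2
    b = 2.0 * u2 - 1.0
    if a == 0.0 and b == 0.0:
        return (0.0, 0.0)
    if abs(a) > abs(b):           # right/left wedges
        r = a
        theta = (math.pi / 4.0) * (b / a)
    else:                         # top/bottom wedges
        r = b
        theta = (math.pi / 2.0) - (math.pi / 4.0) * (a / b)
    return (r * math.cos(theta), r * math.sin(theta))

print(concentric_disk(0.75, 0.5))   # -> (0.5, 0.0)
```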


Stratified Sampling

Stratified sampling is like jittered sampling: allocate samples per region

Estimator:

$$F_N = \frac{1}{N}\sum_{i=1}^{N} F_i$$

New variance:

$$V[F_N] = \frac{1}{N^2}\sum_{i=1}^{N} V[F_i]$$

Thus, if the variance within the regions is less than the overall variance, there will be a reduction in the resulting variance.

For example, for an edge through a pixel, jittered sampling reduces the variance as $N^{-1.5}$, compared with $N^{-1}$ for uniform random sampling.
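A toy comparison (my own construction, with the edge modeled as a 1D step at x = 0.3); in this 1D setting stratification wins even more decisively than the 2D $N^{-1.5}$ rate, since only one stratum straddles the edge:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pixel coverage" integral of a step edge at x = 0.3: exact value 0.7.
f = lambda x: (x > 0.3).astype(float)

def random_estimate(n):
    return f(rng.random(n)).mean()

def stratified_estimate(n):
    # One jittered sample per stratum [i/n, (i+1)/n).
    x = (np.arange(n) + rng.random(n)) / n
    return f(x).mean()

trials = 2000
for n in [16, 64, 256]:
    vr = np.var([random_estimate(n) for _ in range(trials)])
    vs = np.var([stratified_estimate(n) for _ in range(trials)])
    print(f"N={n:4d}  random var {vr:.2e}  stratified var {vs:.2e}")
```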


Mitchell 91

[Figures: uniform random vs. spectrally optimized sampling patterns, from Mitchell (1991).]


Discrepancy

For an axis-aligned box $[0,x] \times [0,y]$ in the unit square, with area $A = xy$ and $n(x,y)$ the number of samples inside, define the local discrepancy

$$d(x,y) = \left|\frac{n(x,y)}{N} - xy\right|$$

and take the worst case over all such boxes:

$$D_N = \max_{x,y}\; d(x,y)$$
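A sketch that estimates $D_N$ by checking boxes whose corners lie on sample coordinates (a common heuristic; the true maximum can be slightly larger):

```python
import numpy as np

def star_discrepancy_estimate(pts):
    """Approximate D_N over boxes [0,x] x [0,y] with corners at sample coordinates."""
    n = len(pts)
    xs = np.sort(np.unique(pts[:, 0]))
    ys = np.sort(np.unique(pts[:, 1]))
    worst = 0.0
    for x in xs:
        for y in ys:
            inside = np.sum((pts[:, 0] <= x) & (pts[:, 1] <= y))
            worst = max(worst, abs(inside / n - x * y))
    return worst

rng = np.random.default_rng(0)
print(star_discrepancy_estimate(rng.random((256, 2))))
```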


Theorem on Total Variation

Theorem (Koksma's inequality):

$$\left|\frac{1}{N}\sum_{i=1}^{N} f(x_i) - \int_0^1 f(x)\, dx\right| \le V(f)\, D_N$$

where $V(f) = \int_0^1 |f'(x)|\,dx$ is the total variation of $f$.

Proof sketch: integrate by parts. Let $g_N(x) = n(x)/N$ be the fraction of samples below $x$, so the sample average is $\int_0^1 f\, dg_N$. Then

$$\frac{1}{N}\sum_i f(x_i) - \int_0^1 f(x)\,dx = \int_0^1 f(x)\, d\bigl(g_N(x) - x\bigr) = -\int_0^1 f'(x)\,\bigl(g_N(x) - x\bigr)\, dx$$

(the boundary term vanishes since $g_N(0) = 0$ and $g_N(1) = 1$), and bounding $|g_N(x) - x| \le D_N$ gives

$$\left|\frac{1}{N}\sum_i f(x_i) - \int_0^1 f\,dx\right| \le D_N \int_0^1 |f'(x)|\, dx = V(f)\, D_N$$


Quasi-Monte Carlo Patterns

Radical inverse (digit reverse) of integer $i$ in integer base $b$: write $i = d_k \cdots d_2 d_1 d_0$ (base $b$); then

$$\Phi_b(i) = 0.\,d_0 d_1 d_2 \cdots \ \text{(base } b\text{)}$$

i   binary   Φ₂(i)   value
1   1        .1      1/2
2   10       .01     1/4
3   11       .11     3/4
4   100      .001    1/8
5   101      .101    5/8

Hammersley points:

$$\left(\frac{i}{N},\; \Phi_2(i),\; \Phi_3(i),\; \Phi_5(i),\; \ldots\right), \qquad D_N = O\!\left(\frac{\log^{d-1} N}{N}\right)$$

Halton points (sequential):

$$\left(\Phi_2(i),\; \Phi_3(i),\; \Phi_5(i),\; \ldots\right), \qquad D_N = O\!\left(\frac{\log^{d} N}{N}\right)$$
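Both point sets follow directly from the radical inverse; a minimal sketch:

```python
def radical_inverse(i, b):
    """Reverse the base-b digits of i about the radix point: Phi_b(i)."""
    inv, scale = 0.0, 1.0 / b
    while i > 0:
        i, d = divmod(i, b)
        inv += d * scale
        scale /= b
    return inv

def halton(n, bases=(2, 3)):
    """First n Halton points in len(bases) dimensions."""
    return [tuple(radical_inverse(i, b) for b in bases) for i in range(1, n + 1)]

def hammersley(n, bases=(2,)):
    """First n Hammersley points: i/N followed by radical inverses."""
    return [(i / n,) + tuple(radical_inverse(i, b) for b in bases) for i in range(n)]

print([radical_inverse(i, 2) for i in range(1, 6)])  # [0.5, 0.25, 0.75, 0.125, 0.625]
```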


Hammersley Points

$$\left(\frac{i}{N},\; \Phi_2(i),\; \Phi_3(i),\; \Phi_5(i),\; \ldots\right)$$


Edge Discrepancy

Measure discrepancy against edges $ax + by = c$ rather than axis-aligned boxes.

Note: SGI IR Multisampling extension: 8x8 subpixel grid; 1,2,4,8 samples


Low-Discrepancy Patterns

Discrepancy of random edges, from Mitchell (1992):

Process        16 points   256 points   1600 points
Zaremba        0.0504      0.00478      0.00111
Jittered       0.0538      0.00595      0.00146
Poisson-Disk   0.0613      0.00767      0.00241
N-Rooks        0.0637      0.0123       0.00488
Random         0.0924      0.0224       0.00866

Random sampling converges as N^{-1/2}. Zaremba converges faster and has lower discrepancy, but a relatively poor blue-noise spectrum; jittered and Poisson-disk patterns are recommended.


High-dimensional Sampling

Numerical quadrature: with $n$ points per axis in $d$ dimensions, $N = n^d$ samples, so for a given error

$$E \sim \frac{1}{n} = \left(\frac{1}{N}\right)^{1/d}$$

Random sampling: for a given variance

$$E \sim V^{1/2} \sim \left(\frac{1}{N}\right)^{1/2}$$

Monte Carlo requires fewer samples for the same error in high-dimensional spaces.
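A toy illustration of this crossover (my own construction): with a budget of $2^{10}$ samples in $d = 10$ dimensions, a tensor-product midpoint rule gets only 2 nodes per axis and loses badly to plain Monte Carlo:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# f(x) = prod_j (pi/2) sin(pi x_j); exact integral over [0,1]^d is 1.
def f(x):
    return np.prod(0.5 * np.pi * np.sin(np.pi * x), axis=-1)

d = 10
n = 2                               # points per axis -> N = 2^10 = 1024 samples
axis = (np.arange(n) + 0.5) / n     # midpoint-rule nodes
grid = np.array(list(itertools.product(axis, repeat=d)))

quad = f(grid).mean()                        # tensor-product quadrature
mc = f(rng.random((len(grid), d))).mean()    # Monte Carlo, same budget

print(f"exact 1.0   grid {quad:.3f}   monte carlo {mc:.3f}")
```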


Block Design

Latin Square

[Figure: a 4×4 Latin square on the alphabet {a, b, c, d}.]

Alphabet of size n

Each symbol appears exactly once in each row and column

Rows and columns are stratified


Block Design

N-Rook Pattern

[Figure: a 4×4 grid with one sample in each row and each column.]

Incomplete block design: replaces n² samples with n samples

Permutations: $(\pi_1(i), \pi_2(i), \ldots, \pi_d(i))$

Example: $(x = \{1,2,3,4\},\; y = \{4,2,3,1\})$

Generalizations: N-queens, 2D projections
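A minimal sketch of a jittered N-rooks generator (my own illustration): stratify rows directly and columns through a random permutation:

```python
import random

def n_rooks(n, rng=random.Random(0)):
    """Jittered N-rooks pattern: one sample in each row and each column."""
    perm = list(range(n))
    rng.shuffle(perm)                          # random column permutation
    return [((i + rng.random()) / n,           # stratified in x by row
             (perm[i] + rng.random()) / n)     # stratified in y by permuted column
            for i in range(n)]

for x, y in n_rooks(4):
    print(f"({x:.2f}, {y:.2f})")
```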


Space-time Patterns

Distribute samples in time

Complete in space

Samples in space should have blue-noise spectrum

Incomplete in time

Decorrelate space and time

Nearby samples in space should differ greatly in time

[Figures: two 4×4 grids assigning time indices 0-15 to sub-pixel positions: a pan-diagonal magic square and the Cook pattern.]


Path Tracing

[Figures: 4 eye rays per pixel with 16 shadow rays per eye ray (complete) vs. 64 eye rays per pixel with 1 shadow ray per eye ray (incomplete).]


Views of Integration

1. Signal processing

Sampling and reconstruction, aliasing and antialiasing

Blue noise good

2. Statistical sampling (Monte Carlo)

Sampling like polling

Variance

High-dimensional sampling: converges as $N^{-1/2}$

3. Quasi Monte Carlo

Discrepancy

Asymptotic efficiency in high dimensions

4. Numerical

Quadrature/Integration rules

Smooth functions