Page 1

Basic Principles of Imaging and Photometry

Lecture #2

Thanks to Shree Nayar, Ravi Ramamoorthi, Pat Hanrahan

Page 2

Computer Vision: Building Machines that See

Figure: Lighting → Scene → Camera → Computer; using Physical Models, the computer produces a Scene Interpretation.

We need to understand the Geometric and Radiometric relations between the scene and its image.

Page 3

Camera Obscura, Gemma Frisius, 1558

A Brief History of Images

Page 4

Lens Based Camera Obscura, 1568


A Brief History of Images

Page 5

Still Life, Louis Jacques Mandé Daguerre, 1837

A Brief History of Images

Page 6

Silicon Image Detector, 1970

A Brief History of Images

Page 7

Digital Cameras, 1995

A Brief History of Images

Page 8

Geometric Optics and Image Formation

TOPICS TO BE COVERED:

1) Pinhole and Perspective Projection

2) Image Formation using Lenses

3) Lens related issues

Page 9

Pinhole and the Perspective Projection

Figure: a scene, a pinhole, and a screen; a scene point projects to (x, y) on the screen.

Is an image being formed on the screen?

YES! But not a "clear" one.

Pinhole camera geometry: the image plane lies at the effective focal length $f'$ behind the pinhole, along the optical axis. A scene point $\mathbf{r} = (x, y, z)$ projects to the image point $\mathbf{r}' = (x', y', f')$.

Perspective projection:
$$\frac{x'}{f'} = \frac{x}{z}, \qquad \frac{y'}{f'} = \frac{y}{z}, \qquad \frac{\mathbf{r}'}{f'} = \frac{\mathbf{r}}{z}$$
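A minimal sketch of the perspective projection equations above, assuming scene points are given in camera coordinates (pinhole at the origin, z along the optical axis); the function name and example values are illustrative, not from the lecture.

```python
import numpy as np

def perspective_project(points, f_prime):
    """Project 3D points (N x 3, camera coordinates) through a pinhole.

    Implements x' = f' * x / z, y' = f' * y / z from the slide;
    returns N x 2 coordinates on the image plane at depth f'.
    """
    points = np.asarray(points, dtype=float)
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    return np.stack([f_prime * x / z, f_prime * y / z], axis=1)

# Example: a point 2 m away imaged with f' = 50 mm
print(perspective_project([[0.1, 0.05, 2.0]], f_prime=0.05))  # ~[0.0025, 0.00125] m
```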

Page 10

Magnification

Consider a planar scene parallel to the image plane, at depth $z$ from the pinhole, with the image plane at the effective focal length $f'$.

Scene points $A = (x, y, z)$ and $B = (x + \delta x,\ y + \delta y,\ z)$ are a distance $d$ apart; their images $A' = (x', y', f')$ and $B' = (x' + \delta x',\ y' + \delta y',\ f')$ are a distance $d'$ apart.

From perspective projection:
$$\frac{x'}{f'} = \frac{x}{z}, \quad \frac{y'}{f'} = \frac{y}{z}, \qquad \frac{x' + \delta x'}{f'} = \frac{x + \delta x}{z}, \quad \frac{y' + \delta y'}{f'} = \frac{y + \delta y}{z}$$

Magnification:
$$m = \frac{d'}{d} = \frac{\sqrt{(\delta x')^2 + (\delta y')^2}}{\sqrt{(\delta x)^2 + (\delta y)^2}} = \left|\frac{f'}{z}\right|$$

$$\text{Area}_{image} = m^2 \cdot \text{Area}_{scene}$$
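A quick numeric check of the magnification relation, under the same assumptions as the projection sketch above; the values are made up for illustration.

```python
import numpy as np

f_prime, z = 0.05, 2.0            # 50 mm effective focal length, planar scene at 2 m
m = f_prime / z                    # magnification m = f'/z
A = np.array([0.10, 0.05])         # scene point (x, y) at depth z
B = A + np.array([0.02, 0.01])     # nearby scene point at the same depth

d = np.linalg.norm(B - A)          # scene distance
d_image = m * d                    # image distance d' = m * d
area_scaling = m ** 2              # image area = m^2 * scene area
print(m, d_image, area_scaling)    # 0.025, ~0.000559 m, 0.000625
```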

Page 11

Orthographic Projection

A scene point $\mathbf{r} = (x, y, z)$ is imaged at $\mathbf{r}' = (x', y', f')$ on the image plane along the optical axis.

If all scene points lie near a constant depth $z \approx z_0$, the magnification $m = f'/z_0$ is constant.

Magnification: $x' = m\,x, \qquad y' = m\,y$

When $m = 1$, we have orthographic projection.

This is possible only when $\Delta z \ll z_0$; in other words, the range of scene depths is assumed to be much smaller than the average scene depth.
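A minimal sketch of the constant-magnification (weak-perspective) approximation described above; the function name and parameters are illustrative.

```python
import numpy as np

def weak_perspective_project(points, f_prime, z0):
    """Approximate projection x' = m*x, y' = m*y with constant m = f'/z0.

    Valid when the depth variation within the scene is much smaller than
    the average depth z0; with m = 1 this reduces to orthographic projection.
    """
    m = f_prime / z0
    points = np.asarray(points, dtype=float)
    return m * points[:, :2]
```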

But, how do we produce non-inverted images?

Page 12

Better Approximations to Perspective Projection

Page 13

Problems with Pinholes

• Pinhole size (aperture) must be “very small” to obtain a clear image.

• However, as pinhole size is made smaller, less light is received by image plane.

• If pinhole is comparable to wavelength of incoming light, DIFFRACTION effects blur the image!

• Sharpest image is obtained when the pinhole diameter is
$$d \approx 2\sqrt{f'\,\lambda}$$

Example: if $f' = 50\,\text{mm}$ and $\lambda = 600\,\text{nm}$ (red), then $d \approx 0.36\,\text{mm}$.
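A small sketch evaluating the pinhole-diameter rule above; it assumes the $d \approx 2\sqrt{f'\lambda}$ form reconstructed from the slide, and the numbers are only for illustration.

```python
import math

def optimal_pinhole_diameter(f_prime_m, wavelength_m):
    """Pinhole diameter balancing geometric blur against diffraction,
    using the rule of thumb d ~ 2 * sqrt(f' * lambda) from the slide."""
    return 2.0 * math.sqrt(f_prime_m * wavelength_m)

# f' = 50 mm, red light around 600 nm -> roughly 0.35 mm,
# the same order as the 0.36 mm quoted in the slide's example.
print(optimal_pinhole_diameter(50e-3, 600e-9) * 1e3, "mm")
```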

Page 14

Image Formation using Lenses

• Lenses are used to avoid problems with pinholes.

• Ideal Lens: Same projection as pinhole but gathers more light!

Gaussian Lens Formula:
$$\frac{1}{i} + \frac{1}{o} = \frac{1}{f}$$
where $o$ is the distance of a scene point P from the lens and $i$ is the distance behind the lens at which its focused image P' forms.

• f is the focal length of the lens – determines the lens’s ability to bend (refract) light

• f is different from the effective focal length f’ discussed before!

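A minimal sketch of the Gaussian lens formula, solving for the image distance given the object distance; the function name and example numbers are illustrative.

```python
def image_distance(o, f):
    """Solve 1/i + 1/o = 1/f for the image distance i (thin-lens formula)."""
    return 1.0 / (1.0 / f - 1.0 / o)

# Object 2 m away, 50 mm focal length -> image forms ~51.3 mm behind the lens.
print(image_distance(2.0, 0.05))
```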

Page 15

Focus and Defocus

Thin-lens (Gaussian) law for the plane the camera is focused on, at object distance $o$ and image distance $i$:
$$\frac{1}{i} + \frac{1}{o} = \frac{1}{f}$$

Depth of Field: the range of object distances over which the image is sufficiently well focused, i.e., the range for which the blur circle is smaller than the resolution of the imaging sensor.

With an aperture of diameter $d$, a point at depth $o'$ is focused at distance $i'$ behind the lens (Gaussian law):
$$\frac{1}{i'} + \frac{1}{o'} = \frac{1}{f}
\qquad\Rightarrow\qquad
i' - i = \frac{f^{2}\,(o - o')}{(o - f)(o' - f)}$$

Blur circle diameter on the image plane (at distance $i$):
$$b = \frac{d}{i'}\,\lvert\, i' - i \,\rvert$$
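A small sketch combining the lens law and the blur-circle expression above, to see how defocus grows as an object moves away from the focused distance; variable names and example numbers are mine, not from the lecture.

```python
def thin_lens_image_distance(o, f):
    """1/i + 1/o = 1/f solved for i."""
    return 1.0 / (1.0 / f - 1.0 / o)

def blur_circle(o_focused, o_actual, f, aperture_d):
    """Blur circle diameter b = (d / i') * |i' - i| for a point at depth
    o_actual when the sensor is placed to focus on o_focused."""
    i = thin_lens_image_distance(o_focused, f)        # sensor plane distance
    i_prime = thin_lens_image_distance(o_actual, f)   # where o_actual focuses
    return aperture_d / i_prime * abs(i_prime - i)

# Focused at 2 m with a 50 mm lens and a 10 mm aperture:
for o in (1.5, 2.0, 3.0):
    print(o, blur_circle(2.0, o, 0.05, 0.01))
```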

Page 16

Two Lens System

• Rule: the image formed by the first lens is the object for the second lens.

• Main Rays: a ray passing through the focus emerges parallel to the optical axis; a ray through the optical center passes undeviated.

Figure: object → lens 1 → intermediate virtual image → lens 2 → final image on the image plane; object and image distances $o_1, i_1$ and $o_2, i_2$, focal lengths $f_1, f_2$, lens separation $d$.

• Magnification:
$$m = \frac{i_1}{o_1}\cdot\frac{i_2}{o_2}$$

Exercises: What is the combined focal length of the system? What is the combined focal length if d = 0?
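A minimal numeric sketch of the rule above (the image formed by the first lens becomes the object for the second), which can be used to experiment with the exercises; it assumes the same sign convention as the single-lens formula, and all names and values are illustrative.

```python
def thin_lens_image(o, f):
    """Image distance from 1/i + 1/o = 1/f; a negative o is a virtual object."""
    return 1.0 / (1.0 / f - 1.0 / o)

def two_lens_image(o1, f1, f2, d):
    """Image through two thin lenses separated by distance d.

    The image of lens 1 (at distance i1 behind it) is the object for
    lens 2, at distance o2 = d - i1 in front of lens 2.
    """
    i1 = thin_lens_image(o1, f1)
    o2 = d - i1
    i2 = thin_lens_image(o2, f2)
    m = (i1 / o1) * (i2 / o2)      # overall magnification
    return i2, m

print(two_lens_image(o1=1.0, f1=0.10, f2=0.05, d=0.30))
```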

Page 17

Lens related issues

Compound (Thick) Lens: characterized by its thickness, principal planes, and nodal points.

Vignetting: with several lens elements $L_1, L_2, L_3$, more light reaches the image from point A than from point B, because off-axis ray bundles are partially blocked.

Chromatic Aberration: the lens has different refractive indices for different wavelengths, so the red, green, and blue focal points $F_R, F_G, F_B$ do not coincide on the image plane.

Radial and Tangential Distortion: the actual image position deviates from the ideal one.
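The slide only names radial and tangential distortion; as an illustration, here is a sketch of the commonly used polynomial (Brown–Conrady style) model for them. This is not from the lecture, and the coefficients k1, k2, p1, p2 are hypothetical.

```python
import numpy as np

def distort(xy, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to
    ideal normalized image coordinates xy (N x 2)."""
    xy = np.asarray(xy, dtype=float)
    x, y = xy[:, 0], xy[:, 1]
    r2 = x**2 + y**2
    radial = 1 + k1 * r2 + k2 * r2**2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x**2)
    y_d = y * radial + p1 * (r2 + 2 * y**2) + 2 * p2 * x * y
    return np.stack([x_d, y_d], axis=1)
```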

Page 18

Radiometry and Image Formation

• To interpret image intensities, we need to understand Radiometric Concepts and Reflectance Properties.

• TOPICS TO BE COVERED:

1) Image Intensities: Overview

2) Radiometric Concepts:

Radiant Intensity, Irradiance, Radiance, BRDF

3) Image Formation using a Lens

4) Radiometric Camera Calibration

Page 19

Image Intensities

Image intensities = f ( normal, surface reflectance, illumination )

Note: Image intensity understanding is an under-constrained problem!

Figure: light from the source reaches a surface element (with its normal) and is reflected toward the sensor; we need to consider light propagation in a cone.

Page 20

Radiometric concepts – boring…but, important!

(1) Solid Angle:
$$d\omega = \frac{dA\,\cos\theta_i}{R^2} = \frac{dA'}{R^2} \qquad (\text{steradian})$$
the solid angle $d\omega$ subtended by a patch of surface area $dA$ at distance $R$, whose foreshortened area is $dA' = dA\cos\theta_i$.

What is the solid angle subtended by a hemisphere?

(2) Radiant Intensity of Source: light flux (power) emitted per unit solid angle,
$$J = \frac{d\Phi}{d\omega} \qquad (\text{watts / steradian})$$

(3) Surface Irradiance: light flux (power) incident per unit surface area,
$$E = \frac{d\Phi}{dA} \qquad (\text{watts / m}^2)$$
Does not depend on where the light is coming from!

(4) Surface Radiance (tricky): flux emitted per unit foreshortened area per unit solid angle,
$$L = \frac{d^2\Phi}{(dA\,\cos\theta_r)\,d\omega} \qquad (\text{watts / m}^2\,\text{steradian})$$

• L depends on direction $\theta_r$.

• The surface can radiate into the whole hemisphere.

• L depends on the reflectance properties of the surface.
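A small sketch combining definitions (1)–(3): a small patch at distance R from a point source of radiant intensity J intercepts flux J·dω with dω = dA·cosθᵢ/R², giving irradiance E = J·cosθᵢ/R². The function name and numbers are illustrative.

```python
import math

def patch_irradiance(J, R, theta_i):
    """Irradiance (W/m^2) on a small patch from a point source:
    E = dPhi/dA = J * dOmega / dA = J * cos(theta_i) / R**2."""
    return J * math.cos(theta_i) / R**2

# 10 W/sr source, patch 2 m away, tilted 30 degrees from the source direction.
print(patch_irradiance(10.0, 2.0, math.radians(30)))  # ~2.17 W/m^2
```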

Page 21

The Fundamental Assumption in Vision

Figure: under the given Lighting, the radiance leaving the Surface reaches the Camera with No Change in Radiance.

Page 22

Radiance properties

• Radiance is constant as it propagates along ray

– Derived from conservation of flux

– Fundamental in Light Transport.

$$d^2\Phi_1 = L_1\,d\omega_1\,dA_1, \qquad d^2\Phi_2 = L_2\,d\omega_2\,dA_2, \qquad d^2\Phi_1 = d^2\Phi_2$$

$$d\omega_1 = \frac{dA_2}{r^2}, \qquad d\omega_2 = \frac{dA_1}{r^2}$$

$$\Rightarrow\quad d\omega_1\,dA_1 = \frac{dA_1\,dA_2}{r^2} = d\omega_2\,dA_2 \quad\Rightarrow\quad L_1 = L_2$$

Page 23

Relationship between Scene and Image Brightness

Scene Radiance $L$ → Lens → Image Irradiance $E$ → Camera Electronics → Measured Pixel Values $I$

• Before light hits the image plane (Scene Radiance $L$ → Image Irradiance $E$): Linear Mapping!

• After light hits the image plane (Image Irradiance $E$ → Measured Pixel Values $I$): Non-linear Mapping!

Can we go from measured pixel value, I, to scene radiance, L?

Page 24

Relation between Image Irradiance E and Scene Radiance L

Setup: a surface patch of area $dA_s$ at depth $z$ is viewed through a lens of diameter $d$ whose image plane is at distance $f$; the corresponding image patch has area $dA_i$. The ray from the patch through the lens center makes angle $\alpha$ with the optical axis, and the surface normal makes angle $\theta$ with that ray.

• Solid angles of the double cone (orange and green) are equal:
$$\frac{dA_i\,\cos\alpha}{(f/\cos\alpha)^2} = \frac{dA_s\,\cos\theta}{(z/\cos\alpha)^2}
\quad\Rightarrow\quad
\frac{dA_s}{dA_i} = \frac{\cos\alpha}{\cos\theta}\,\frac{z^2}{f^2} \qquad (1)$$

• Solid angle subtended by the lens, as seen from the surface patch:
$$d\omega_L = \frac{\pi d^2}{4}\,\frac{\cos\alpha}{(z/\cos\alpha)^2} = \frac{\pi d^2}{4}\,\frac{\cos^3\alpha}{z^2} \qquad (2)$$

Page 25

Relation between Image Irradiance E and Scene Radiance L

(Same setup as the previous page: surface patch $dA_s$ at depth $z$, image patch $dA_i$, lens of diameter $d$ at distance $f$ from the image plane, off-axis angle $\alpha$, surface-normal angle $\theta$.)

• Flux received by the lens from $dA_s$ = flux projected onto the image patch $dA_i$:
$$L\,(dA_s\,\cos\theta)\,d\omega_L = E\,dA_i \qquad (3)$$

• From (1), (2), and (3):
$$E = L\,\frac{\pi}{4}\left(\frac{d}{f}\right)^{2}\cos^4\alpha$$

• Image irradiance is proportional to scene radiance!

• Small field of view ⇒ the effects of the 4th power of the cosine are small.
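A minimal sketch of the irradiance–radiance relation just derived; d, f, and alpha are the aperture diameter, image-plane distance, and off-axis angle, and the example numbers are illustrative.

```python
import math

def image_irradiance(L, d, f, alpha):
    """E = L * (pi/4) * (d/f)**2 * cos(alpha)**4  (the relation above)."""
    return L * (math.pi / 4.0) * (d / f) ** 2 * math.cos(alpha) ** 4

# Radiance 100 W/(m^2 sr), 25 mm aperture, 50 mm image distance, 10 deg off-axis.
print(image_irradiance(100.0, 0.025, 0.05, math.radians(10)))
```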

Page 26

Relation between Pixel Values I and Image Irradiance E

• The camera response function relates image irradiance at the image plane to the measured pixel intensity values.

CameraElectronics

ImageIrradiance E

Measured Pixel Values, I

$$g: E \rightarrow I$$

(Grossberg and Nayar)

Page 27

Radiometric Calibration

• Important preprocessing step for many vision and graphics algorithms such as photometric stereo, invariants, de-weathering, inverse rendering, image-based rendering, etc.

$$g^{-1}: I \rightarrow E$$

• Use a color chart with patches of precisely known reflectances: 3.1%, 9.0%, 19.8%, 36.2%, 59.1%, 90%.

• Under constant lighting, Irradiance = const × Reflectance, so the measured pixel values of the chart patches give samples of the response curve.

• Use more camera exposures to fill up the curve.

• The method assumes constant lighting on all patches and works best when the source is far away (e.g., sunlight).

• Unique inverse exists because g is monotonic and smooth for all cameras.

Plot: the response curve $g$ maps normalized irradiance (0 to 1) to pixel values (0 to 255); calibration estimates its inverse $g^{-1}$.
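A rough sketch of the chart-based idea above: pair each patch's known (relative) irradiance with its measured pixel value and fit a smooth monotonic curve to get an approximate g⁻¹. This only illustrates the idea, not the method of Grossberg and Nayar, and the pixel values below are made up.

```python
import numpy as np

# Known chart reflectances (proportional to irradiance under constant lighting)
reflectance = np.array([0.031, 0.090, 0.198, 0.362, 0.591, 0.90])
# Hypothetical measured pixel values (0-255) for the same patches
pixel = np.array([31, 67, 108, 152, 198, 241])

# Fit pixel -> relative irradiance with a low-order polynomial as a stand-in
# for the smooth, monotonic inverse response g^{-1}.
coeffs = np.polyfit(pixel / 255.0, reflectance, deg=3)
g_inv = np.poly1d(coeffs)

print(g_inv(128 / 255.0))   # estimated relative irradiance for pixel value 128
```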

Page 28

The Problem of Dynamic Range

• Dynamic Range: Range of brightness values measurable with a camera

(Hood 1986)

High Exposure Image Low Exposure Image

• We need 5-10 million values to store all the brightnesses around us.

• But typical 8-bit cameras provide only 256 values!!

• Today’s Cameras: Limited Dynamic Range

Page 29

Images of the sky taken with a fish-eye lens show the wide range of brightnesses.

High Dynamic Range Imaging

• Capture a lot of images with different exposure settings.

• Apply radiometric calibration to each camera.

• Combine the calibrated images (for example, using averaging weighted by exposures).

(Debevec)

(Mitsunaga)
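A rough sketch of the combination step described above: linearize each exposure with the inverse response, divide by exposure time to get a relative radiance estimate, and average with weights that favor well-exposed pixels. The weighting scheme and names are illustrative, in the spirit of Debevec and Mitsunaga rather than their exact algorithms.

```python
import numpy as np

def combine_exposures(images, exposure_times, g_inv):
    """Merge differently exposed 8-bit images into one HDR radiance map.

    images: list of arrays with pixel values in [0, 255]
    exposure_times: matching list of exposure times (seconds)
    g_inv: inverse response mapping normalized pixel value -> relative irradiance
    """
    num = np.zeros(images[0].shape, dtype=float)
    den = np.zeros(images[0].shape, dtype=float)
    for img, t in zip(images, exposure_times):
        p = img.astype(float) / 255.0
        w = 1.0 - np.abs(2.0 * p - 1.0)     # favor mid-range (well-exposed) pixels
        radiance = g_inv(p) / t              # linearize, normalize by exposure time
        num += w * radiance
        den += w
    return num / np.maximum(den, 1e-8)
```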