Geometric and Radiometric Camera Calibration


Transcript of Geometric and Radiometric Camera Calibration

Page 1: Geometric and Radiometric Camera Calibration

Geometric and Radiometric Camera Calibration

• Shape From Stereo requires geometric knowledge of:
  – Cameras’ extrinsic parameters, i.e. the geometric relationship between the two cameras.
  – Camera intrinsic parameters, i.e. each camera’s internal geometry (e.g. focal length) and lens distortion effects.
• Shape From Shading requires radiometric knowledge of:
  – Camera detector uniformity (e.g. flat-field images)
  – Camera detector temperature noise (e.g. dark frame images)
  – Camera detector bad pixels
  – Camera Digital Number (DN) to radiance transfer function


Page 2: Geometric and Radiometric Camera Calibration

Camera Geometric Calibration

• This has been a computer vision research topic for many years (hence many papers)

• There are a number of available software tools to assist with the calibration process

• Examples include:
  – Matlab Camera Calibration Toolbox
  – OpenCV Camera Calibration and 3D Reconstruction

• These often require a geometric calibration target – often a 2D checkerboard
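A minimal sketch of the checkerboard route using the OpenCV tools mentioned above; the file pattern, inner-corner count and square size below are assumptions for illustration, not the values used in the module:

```python
# Hedged sketch of single-camera geometric calibration with OpenCV.
# Assumes checkerboard images in ./calib/*.png with a 9 x 6 inner-corner grid
# of 25 mm squares; a different board (e.g. the AU 16 x 16 target) would need
# its own inner-corner count and square size.
import glob
import cv2
import numpy as np

pattern = (9, 6)   # inner corners across and down (assumed)
square = 0.025     # square size in metres (assumed)

# 3D corner positions on the planar board, z = 0
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for fname in glob.glob("calib/*.png"):
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)

# K is the 3x3 intrinsic matrix, dist the lens-distortion coefficients
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("Intrinsic matrix K:\n", K)
```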


Page 3: Geometric and Radiometric Camera Calibration

Stereo Vision – calibration


A sequence of left- and right-camera images of a 16 × 16 square checkerboard, used as part of the intrinsic and extrinsic calibration procedure for the AU stereo WAC cameras.

[Figure: left camera images | right camera images]


Page 4: Geometric and Radiometric Camera Calibration

Stereo Vision – extrinsic calibration


Camera baseline separation and relative orientation.
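Building on the single-camera step, a hedged OpenCV sketch of recovering the extrinsics, i.e. the rotation R and translation T whose magnitude gives the baseline. It assumes per-camera intrinsics (K_l, dist_l, K_r, dist_r), matched corner lists (obj_pts, img_pts_left, img_pts_right) and image_size are already available from running the previous sketch on each camera:

```python
# Hedged sketch: stereo extrinsic calibration with OpenCV.
# K_l/dist_l, K_r/dist_r, obj_pts, img_pts_left, img_pts_right and image_size
# are assumed outputs of the per-camera calibration step above.
import cv2
import numpy as np

flags = cv2.CALIB_FIX_INTRINSIC   # keep the single-camera intrinsics fixed
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 1e-5)

rms, K_l, dist_l, K_r, dist_r, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, img_pts_left, img_pts_right,
    K_l, dist_l, K_r, dist_r, image_size,
    criteria=criteria, flags=flags)

baseline = np.linalg.norm(T)   # same units as the board squares (metres here)
print("Baseline separation:", baseline, "m")
print("Relative rotation R:\n", R)
```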

Page 5: Geometric and Radiometric Camera Calibration

Stereo Vision – intrinsic calibration
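The slide itself shows a figure here; as a hedged reminder of what intrinsic calibration recovers, it is the pinhole intrinsic matrix below together with the lens-distortion coefficients (e.g. the (k1, k2, p1, p2, k3) vector returned by cv2.calibrateCamera in the sketch above):

```latex
K =
\begin{bmatrix}
  f_x & s   & c_x \\
  0   & f_y & c_y \\
  0   & 0   & 1
\end{bmatrix}
```

where f_x and f_y are the focal lengths in pixels, (c_x, c_y) is the principal point, and s is the skew term (usually taken as zero).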


Page 6: Geometric and Radiometric Camera Calibration

Stereo Vision – image rectification


Using the geometric calibration results, an image rectification algorithm is used to project two or more images onto a common image plane. It corrects image distortion by transforming each image into a standard coordinate system.

The illustration shows how image rectification simplifies the search space in stereo correlation matching. (Image courtesy Bart van Andel.)

See Fusiello et al., “A compact algorithm for rectification of stereo pairs”, Machine Vision and Applications, 12, 16–22, 2000.
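A hedged OpenCV sketch of the rectification step, assuming the intrinsics (K_l, dist_l, K_r, dist_r) and extrinsics (R, T) recovered in the earlier sketches, plus the original left_img/right_img frames and image_size:

```python
# Hedged sketch: rectify a stereo pair with OpenCV given intrinsics and extrinsics.
import cv2

# Rectification rotations, new projection matrices and the Q reprojection matrix
R_l, R_r, P_l, P_r, Q, roi_l, roi_r = cv2.stereoRectify(
    K_l, dist_l, K_r, dist_r, image_size, R, T)

# Per-camera undistort + rectify lookup maps
map_lx, map_ly = cv2.initUndistortRectifyMap(K_l, dist_l, R_l, P_l, image_size, cv2.CV_32FC1)
map_rx, map_ry = cv2.initUndistortRectifyMap(K_r, dist_r, R_r, P_r, image_size, cv2.CV_32FC1)

# Warp the original frames onto the common rectified image plane
left_rect = cv2.remap(left_img, map_lx, map_ly, cv2.INTER_LINEAR)
right_rect = cv2.remap(right_img, map_rx, map_ry, cv2.INTER_LINEAR)
```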

Page 7: Geometric and Radiometric Camera Calibration

Stereo Vision – disparity maps

[Figure: left rectified image | right rectified image, 1024 × 768 pixels]

Once rectified, a disparity algorithm searches along the image rows to identify each pixel’s location in the right image relative to its location in the left image. This pixel distance is often grey-scale coded (0 to 255) and shown as an image: the disparity map. Using epipolar geometry, the 3D position of each pixel can then be calculated by triangulation.
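A minimal sketch of this row-wise disparity search, using OpenCV’s semi-global block matcher as an assumed (not module-specified) choice, with the rectified pair left_rect/right_rect from the sketch above:

```python
# Hedged sketch: compute and grey-scale code a disparity map with OpenCV SGBM.
import cv2

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disp = matcher.compute(left_rect, right_rect).astype("float32") / 16.0   # SGBM returns fixed-point x16
disp_vis = cv2.normalize(disp, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")  # 0-255 grey-scale coding
```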

Page 8: Geometric and Radiometric Camera Calibration

Stereo Vision – epipolar geometry and disparity


(Images courtesy Colorado School of Mines)

b = camera baseline separation
f = camera focal length
V1 and V2 = horizontal placement of the pixel points relative to each camera centre (C)
d = V1 – V2 = disparity
D = distance of the point in the real world


Equation derived from the epipolar geometry above:

D = (b × f) / d
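As a quick worked check of this relationship, using hypothetical numbers rather than values from the module:

```python
def depth_from_disparity(d_px, baseline_m, focal_px):
    """Depth D = b * f / d from the epipolar geometry above."""
    return baseline_m * focal_px / d_px

# Hypothetical values: 0.3 m baseline, 1000-pixel focal length, 50-pixel disparity
print(depth_from_disparity(50, 0.3, 1000.0))   # -> 6.0 metres
```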

Page 9: Geometric and Radiometric Camera Calibration

Stereo Vision – disparity map example


Zitnick-Kanade Stereo Algorithm example (see the Experiments in Stereo Vision web page):

[Figure: left image | right image | grey-scale disparity map]

Given the camera geometry relative to the scene, lighter pixels have greater disparity (nearer to the cameras), whereas darker pixels have less disparity (further from the cameras): D is inversely proportional to d.

Page 10: Geometric and Radiometric Camera Calibration

Stereo Vision – disparity to depth-maps


[Figure: grey-scale disparity map]

Using the equation from slide 8, the real-world x, y, z value for each pixel in the disparity map can be calculated relative to the camera origin. This can be regarded as an absolute depth-map; using the disparity map alone provides only a relative depth-map.
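A hedged OpenCV sketch of this disparity-to-depth-map step, assuming the Q reprojection matrix from the cv2.stereoRectify sketch above and the floating-point disparity map in pixels:

```python
# Hedged sketch: turn a disparity map into per-pixel 3D points (an absolute depth-map).
import cv2
import numpy as np

points_3d = cv2.reprojectImageTo3D(disp.astype(np.float32), Q)  # H x W x 3, camera frame
depth_map = points_3d[:, :, 2]                                  # z component only
valid = np.isfinite(depth_map) & (disp > 0)                     # mask invalid / zero disparities
```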

A mesh can be fitted to the 3D data points (compare a laser scanner ‘point cloud’). Note the errors due to the disparity algorithm and the 8-bit grey-scale data. (GLView image above.)

3D terrain models are referred to as height-maps, Digital Elevation Models (DEM), or Digital Terrain Models (DTM).