ECE661 Computer Vision : HW5

Kihyun Hong

1 Problem Description

In this homework assignment, our task is to calibrate a camera. To do this, we first detect corner points in the captured calibration pattern images. Using these corner points, we calculate homographies between the images and the model. From the homographies, we initially obtain the intrinsic and extrinsic parameters. Then, we measure the radial distortion of the camera. After obtaining all of these camera parameters from a direct method, we finally refine the parameters using a non-linear optimization method, the Levenberg-Marquardt (LM) algorithm.

1.1 Corner Detection With Index

In this homework, we used a camera calibration pattern. The calibration image contains a repeated square pattern, so to calculate the homography between the captured pattern and the model it is preferable to use labeled (indexed) correspondences. To label the corners of the pattern, we first found the lines of the pattern and labeled the crossing points of the lines as corners. We utilized Canny edge detection and the Hough transform to obtain the lines of the pattern. We implemented the labeled corner detection as follows (a minimal sketch of the first two steps appears after the list):

1. Edge detection.

2. Line fitting (Hough transform).

3. Line ordering.

4. Corner detection with label (using line crossing points).

5. (Option) Labeling true corners (find true corners using Harris corner detection, search for the nearest indexed corner, then label the true corner with that index).
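The first two steps map directly onto OpenCV calls. Below is a minimal sketch of the edge detection and line fitting stage, assuming an 8-bit grayscale IplImage* input and the OpenCV 1.x C API used in section 3; the Canny thresholds are the #defines from corner.h and the Hough accumulator threshold of 40 matches the call in IndexedCornerDetect().

    // sketch of steps 1-2: Canny edge detection followed by the standard
    // Hough transform; mirrors IndexedCornerDetect() in corner.cpp
    IplImage *edgeImage = cvCreateImage(cvGetSize(inImage), IPL_DEPTH_8U, 1);
    CvMemStorage *storage = cvCreateMemStorage(0);

    cvCanny(inImage, edgeImage, LOW_THRESHOLD, HIGH_THRESHOLD, APERTURE_SIZE);
    CvSeq *lines = cvHoughLines2(edgeImage, storage, CV_HOUGH_STANDARD,
                                 1, CV_PI/180, 40, 0, 0);
    // each element of lines is a (rho, theta) pair; the lines are then split
    // into horizontal/vertical sets, ordered, grouped, and intersected to
    // obtain the indexed corners (steps 3-4)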

1.2 Camera Calibration

In this homework, camera calibration mainly follows Zhang's paper [1].

1. Compute the homographies (DLT) between the images and the model using the labeled corners (feature points).


2. Calculate the intrinsic matrix (five intrinsic parameters : αx, αy, x0, y0, skew)

3. Estimate the extrinsic parameters (rotation and translation)

4. Compute the radial distortion

5. Refine all parameters using the Levenberg-Marquardt (LM) algorithm

We estimated the homographies between the images and the model with the DLT method described in homework 4. From the homographies, we computed the intrinsic matrix K. The relation between a homography and the intrinsic parameters can be expressed as

h_i^T ω h_j = v_ij^T b    (1)

where h_i is the i-th column of the homography H, ω is (KK^T)^−1, and v_ij and b are described in [1].
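For completeness, writing h_i = (h_i1, h_i2, h_i3)^T for the i-th column of H, the 6-vector v_ij used to build the linear system (the same expression is implemented in vijSetting() in section 3.7) is

v_ij = [ h_i1 h_j1 , h_i1 h_j2 + h_i2 h_j1 , h_i2 h_j2 , h_i3 h_j1 + h_i1 h_j3 , h_i3 h_j2 + h_i2 h_j3 , h_i3 h_j3 ]^T

and b = [ω11, ω12, ω22, ω13, ω23, ω33]^T collects the six distinct entries of the symmetric matrix ω.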

With the two constraints h_1^T ω h_2 = 0 and h_1^T ω h_1 = h_2^T ω h_2, and eq. (1), we get the following linear equation for one image homography:

V_i b = 0    (2)

where V_i = [v_12, (v_11 − v_22)]^T is a 2 × 6 matrix.

Using n images, we can construct an overdetermined linear system V b = 0, where V is the 2n × 6 matrix obtained by stacking V_1, V_2, ..., V_n. From this system, we can calculate b by solving

min ‖V b‖, s.t. ‖b‖ = 1. (3)

The minimizer b is the right singular vector of V corresponding to its smallest singular value. Once we have the optimal b, we can compute the intrinsic parameters in closed form as described in [1].
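For reference, with b = [ω11, ω12, ω22, ω13, ω23, ω33]^T, the closed-form expressions from [1] (as implemented in CalculateIntrinsicMatrix() in section 3.7) are

y0 = (ω12 ω13 − ω11 ω23) / (ω11 ω22 − ω12^2)
λ = ω33 − (ω13^2 + y0 (ω12 ω13 − ω11 ω23)) / ω11
αx = sqrt(λ / ω11)
αy = sqrt(λ ω11 / (ω11 ω22 − ω12^2))
skew = −ω12 αx^2 αy / λ
x0 = skew · y0 / αy − ω13 αx^2 / λ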

With the estimated intrinsic parameters, we can compute the extrinsic parameters.

r_1 = ξ K^−1 h_1    (4)

r_2 = ξ K^−1 h_2    (5)

r_3 = r_1 × r_2    (6)

t = ξ K^−1 h_3    (7)

where ξ = 1/‖K^−1 h_1‖ is the normalization factor. Here, since the rotation matrix R = [r_1 r_2 r_3] should be orthonormal, we refine the rotation matrix by solving

min ‖R − Q‖_F , s.t. R^T R = I    (8)

where Q is the initially estimated rotation matrix [r_1 r_2 r_3]. The solution is R = U V^T, where Q = U Σ V^T is the SVD of Q.
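As a minimal sketch, assuming Q and R are pre-allocated 3 × 3 CV_64FC1 matrices and Q already holds [r_1 r_2 r_3], the projection onto the nearest rotation matrix can be computed with the OpenCV 1.x SVD routine, mirroring CalculateExtrinsicParameters() in section 3.7:

    CvMat *D  = cvCreateMat(3, 3, CV_64FC1);
    CvMat *U  = cvCreateMat(3, 3, CV_64FC1);
    CvMat *V  = cvCreateMat(3, 3, CV_64FC1);
    CvMat *Ut = cvCreateMat(3, 3, CV_64FC1);
    // with CV_SVD_U_T|CV_SVD_V_T OpenCV returns U and V transposed,
    // i.e. Q = U^T D V, so the nearest rotation is R = U^T V
    cvSVD(Q, D, U, V, CV_SVD_U_T | CV_SVD_V_T);
    cvTranspose(U, Ut);
    cvMatMul(Ut, V, R);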

After obtaining all intrinsic and extrinsic parameters, we also estimate the radial distortion coefficients (k_1 and k_2) by least-squares estimation of the radial distortion model.
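The distortion model used here (see CalculateRadialDistotion() in section 3.7) is the following: let (u, v) be the ideal, distortion-free projection of a model point, and let

r = ((u − x0)/αx)^2 + ((v − y0)/αy)^2

be the squared normalized radius. The observed point (û, v̂) is then modeled as

û = u + (u − x0)(k_1 r + k_2 r^2),    v̂ = v + (v − y0)(k_1 r + k_2 r^2).

Each observed corner contributes two equations that are linear in (k_1, k_2); stacking them over all points and images gives a system D [k_1 k_2]^T = d that is solved in the least-squares sense.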

These initial parameters can then be refined by non-linear optimization:

arg min_{K, R_i, t_i, k_1, k_2}  Σ_{i=1}^{n} Σ_{j=1}^{m} ‖x_{i,j} − x̂_{i,j}(K, R_i, t_i, k_1, k_2, X_j)‖    (9)


where i is the image index, j is the point index within an image, x_{i,j} is the j-th feature point of the i-th image, and x̂_{i,j} is the projection of the model point X_j. To solve this minimization problem, we used the LM algorithm described in Pradit's handout [2].
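In each iteration, the implementation (RefineCameraParameters() in section 3.7) stacks the reprojection residuals of all images into a vector d, evaluates the Jacobian J with respect to the parameter vector p = (αx, αy, x0, y0, skew, {w_i, t_i}_{i=1..n}, k_1, k_2), where each rotation R_i is represented by a Rodrigues 3-vector w_i, and applies the damped update

δp = (J^T J + λ I)^{-1} J^T d,    p ← p − δp.

The damping factor λ is decreased when the total geometric error decreases; otherwise, following the standard LM scheme, the step is rejected and λ is increased.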

2 Results

Using a Canon PowerShot S500 camera, we captured 4 images (600 by 800) of the calibration pattern to calibrate the camera (fig. 1). The results of Canny edge detection, line fitting of the edges, and labeled corner detection are shown in figs. 2, 3, and 4.

Fig. 5 shows the model corner points projected into the image using the directly estimated camera parameters, and fig. 6 shows the projection with radial distortion applied afterwards. The directly estimated values of the camera parameters are shown in table 1. After calculating the initial camera parameters, refinement was performed with the LM algorithm. In the implementation, we used the MATLAB symbolic function jacobian to derive the Jacobian with respect to the parameters. In the refinement, all 4 images were fed simultaneously into the LM optimization. The final result is shown in fig. 7 and table 2. Finally, fig. 8 shows the radial-distortion-corrected images.

squared error / pixel : 3.26956

intrinsic parameters
  αx     839.838867
  αy     841.434021
  x0     290.961853
  y0     400.418579
  skew     1.203523
  k1       0.095307
  k2      -0.694225

extrinsic parameters
        image 1                                  image 2
  r1    (0.971027, 0.087594, 0.222338)           (0.993763, 0.039597, 0.104247)
  r2    (-0.079088, 0.995764, -0.046895)         (-0.040510, 0.999157, 0.006662)
  r3    (-0.225504, 0.027952, 0.973841)          (-0.103895, -0.010843, 0.994529)
  t     (-2.613769, -5.361748, 12.389180)        (-3.055137, -4.467215, 12.202751)

        image 3                                  image 4
  r1    (0.954073, 0.143081, -0.263196)          (0.906400, 0.054807, 0.418849)
  r2    (-0.094177, 0.977278, 0.189893)          (-0.053925, 0.998447, -0.013951)
  r3    (0.284385, -0.156385, 0.945869)          (-0.418963, -0.009941, 0.907949)
  t     (-3.006199, -4.996442, 13.017076)        (-2.388240, -4.559933, 10.244861)

Table 1: Initial results (before refinement).


Figure 1: 4 captured pattern images.


Figure 2: Binary edge image of an image.

Figure 3: Line fitting result of the edge.


Figure 4: Detected indexed corners using ordered lines.

squared error / pixel : 3.219432

intrinsic parameters
  αx     839.839111
  αy     841.434448
  x0     290.960785
  y0     400.417572
  skew     1.203914
  k1       0.098761
  k2      -0.693423

extrinsic parameters
        image 1                                  image 2
  r1    (0.971128, 0.087141, 0.222072)           (0.993784, 0.039414, 0.104116)
  r2    (-0.078566, 0.995792, -0.047179)         (-0.040309, 0.999166, 0.006503)
  r3    (-0.225249, 0.028369, 0.973888)          (-0.103773, -0.010659, 0.994544)
  t     (-2.620482, -5.363396, 12.388030)        (-3.055003, -4.467087, 12.202749)

        image 3                                  image 4
  r1    (0.954101, 0.142828, -0.263232)          (0.906532, 0.054670, 0.418581)
  r2    (-0.093927, 0.977312, 0.189837)          (-0.053626, 0.998459, -0.014267)
  r3    (0.284374, -0.156399, 0.945871)          (-0.418716, -0.009514, 0.908067)
  t     (-3.006068, -4.996318, 13.017078)        (-2.388093, -4.559793, 10.244858)

Table 2: Final results (after refinement).


Figure 5: Projected model corner points into the image using directly estimated camera parameters.


Figure 6: Projected model corner points with radial distortion parameters.


Figure 7: Projected model corner points into the image using refined camera parameters.


Figure 8: Radial distortion corrected images.


References

[1] Zhengyou Zhang, "A Flexible New Technique for Camera Calibration", Technical Report MSR-TR-98-71, Microsoft Research, 1998.

[2] http://web.ics.purdue.edu/~mitrapiy/ECE661/HW5_LM_handout.pdf


3 Codes

The code is composed of 6 parts: HW5.cpp, corner.cpp, homography.cpp, calibration.cpp, jacobianCalculation.cpp, and utility.cpp. HW5.cpp contains the main function. In calibration.cpp, the intrinsic and extrinsic camera parameter estimation methods and the parameter refinement are implemented. The functions for the indexed corner detection algorithm are described in corner.cpp. homography.cpp mainly contains the homography estimation. Some image handling tools are contained in utility.cpp. The OpenCV library is used to implement most of the functions.

3.1 Main Function File : HW5.cpp

//
// file : HW5.cpp
// ---------------------------
//

#include <stdlib.h>
#include <stdio.h>
#include <math.h>
#include "cv.h"
#include "cxcore.h"
#include "highgui.h"
#include "utility.h"
#include "corner.h"
#include "homography.h"
#include "calibration.h"

void MarkCornerPoints(IplImage *image, Points *cornerMap);

int main(int argc, char **argv) {// declarationIplImage *inImage = 0;IplImage *colorImage = 0;PairPoints corspMap;Points cornerMap;CameraParam cp;HomographySet h;PairPointSet imagePt;char inImageName[80];char imageNamePrefix[80];CvMat *H = cvCreateMat(3, 3, CV_32FC1); // homographyCvMat *K = cvCreateMat(3, 3, CV_32FC1); // intrinsic matrixint i, j;int numOfImages, imageNumber;int numPointsRow = NUM_H_LINES;int numPointsColumn = NUM_V_LINES;


int pointOrder;

if(argc == 3){strcpy(imageNamePrefix, argv[1]);numOfImages = atoi(argv[2]);

}else{printf("\n");printf(" Usage: hw5 imageNamePrefix [number of images]\n");printf("\n");

}

// set struct parametersh.len = numOfImages;h.numPointsRow = numPointsRow;h.numPointsColumn = numPointsColumn;corspMap.len = numPointsRow * numPointsColumn;cornerMap.len = numPointsRow * numPointsColumn;imagePt.imageLen = numOfImages;imagePt.pointLen = numPointsRow * numPointsColumn;cp.len = numOfImages;

for(imageNumber = 0; imageNumber < numOfImages; imageNumber++){sprintf(inImageName, "../image/%s%d.pgm", imageNamePrefix, imageNumber);inImage = cvLoadImage(inImageName, -1);if(!inImage){

printf("Could not load image file: %s\n", inImageName);exit(0);

}

////////////////////////////////////////////////////////////// 1. Corner Detection //// find indexed corner using Hough xform //////////////////////////////////////////////////////////////IndexedCornerDetect(inImage, &cornerMap);

// display and store the result imagecolorImage = cvCreateImage(cvGetSize(inImage), IPL_DEPTH_8U, 3);cvCvtColor(inImage, colorImage, CV_GRAY2BGR);MarkCornerPoints(colorImage, &cornerMap);cvNamedWindow("output image", CV_WINDOW_AUTOSIZE);cvShowImage("output image", colorImage);cvWaitKey(0);cvDestroyWindow("output image");WriteImage(colorImage, "indexedcorner.jpg");////////////////////////////////////////////////////////////// 2. Homography Estimation //// calculate the homoraphy using DLT(SVD) //


////////////////////////////////////////////////////////////// create the correspondence map for calculating Hfor(i = 0; i < numPointsRow; i++){

for(j = 0; j < numPointsColumn; j++){pointOrder = i*numPointsColumn + j;// 1 square length = unit 1// so, if actual 1 square length = a// we need scale unit with a, after calibration.corspMap.pt1I[pointOrder] = i;corspMap.pt1J[pointOrder] = j;corspMap.pt2I[pointOrder] = cornerMap.ptI[pointOrder];corspMap.pt2J[pointOrder] = cornerMap.ptJ[pointOrder];// store input corner points for radial distortion calculationimagePt.ptmI[pointOrder][imageNumber] = i;imagePt.ptmJ[pointOrder][imageNumber] = j;imagePt.ptiI[pointOrder][imageNumber] = cornerMap.ptI[pointOrder];imagePt.ptiJ[pointOrder][imageNumber] = cornerMap.ptJ[pointOrder];

}}

// estimate correspondencesHomograhyEstimation(&corspMap, H);

// store the estimated homographyfloat H33 = cvmGet(H, 2, 2);// normalize homography : however, this step is not requiredfor(int n = 0; n < 3; n++){

for(int m = 0; m < 3; m++){h.H[n][m][imageNumber] = cvmGet(H, n, m) / H33;

}}

}

////////////////////////////////////////////////////////////// 3. Camera Calibration //// calculate intrinsic and extrinsic parameters //////////////////////////////////////////////////////////////CameraCalibration(&h, &cp, &imagePt);

////////////////////////////////////////////////////////////// 5. Results //////////////////////////////////////////////////////////////IplImage *outImage = cvCreateImage(cvGetSize(inImage), IPL_DEPTH_8U, 1);

for(imageNumber = 0; imageNumber < numOfImages; imageNumber++){sprintf(inImageName, "../image/%s%d.pgm", imageNamePrefix, imageNumber);inImage = cvLoadImage(inImageName, -1);


if(!inImage){printf("Could not load image file: %s\n", inImageName);exit(0);

}//H = KA, where A = [r1 r2 t];// get intrinsic matrix Kfloat kArr[9] = {cp.alphaX, cp.skew, cp.x0,

0.f , cp.alphaY, cp.y0,0.f , 0.f , 1};

Array2CvMat(kArr, K, 3, 3);

// get undistorted imagesfloat distCoeffs[4] = {cp.k1, cp.k2, 0.f, 0.f};cvUnDistortOnce(inImage, outImage, kArr, distCoeffs, 1);

CvMat *A = cvCreateMat(3, 3, CV_32FC1);for(i = 0; i < 3; i++){

cvmSet(A, i, 0, cp.r1[i][imageNumber]);cvmSet(A, i, 1, cp.r2[i][imageNumber]);cvmSet(A, i, 2, cp.t[i][imageNumber]);

}cvMatMul(K, A, H);

float h[9];CvMat2Array(H, h, 3, 3);

int height = inImage->height;int width = inImage->width;for(i = 0; i < numPointsRow; i++){

for(j = 0; j < numPointsColumn; j++){pointOrder = i*numPointsColumn + j;float x1 = h[0] * j + h[1] * i + h[2];float x2 = h[3] * j + h[4] * i + h[5];float x3 = h[6] * j + h[7] * i + h[8];float vi = min(height - 1, max(0, x2 / x3));float ui = min(width - 1, max(0, x1 / x3));

// radial distortionfloat tpU = (ui - cp.x0);float tpV = (vi - cp.y0);float r = pow(tpU / cp.alphaX , 2) + pow(tpV / cp.alphaY, 2);

cornerMap.ptJ[pointOrder] = cvRound(ui + tpU *(cp.k1 * r + cp.k2 * pow(r, 2)));

cornerMap.ptI[pointOrder] = cvRound(vi + tpV *(cp.k1 * r + cp.k2 * pow(r, 2)));


cvCircle(inImage, cvPoint(cornerMap.ptJ[pointOrder],cornerMap.ptI[pointOrder]), 2, cvScalar(0, 255, 0), 3);

}}

WriteImage(inImage, "result.jpg");WriteImage(outImage, "distortionCorrResult.jpg");cvNamedWindow("output image", CV_WINDOW_AUTOSIZE);cvShowImage("output image", inImage);cvWaitKey(0);cvDestroyWindow("output image");

cvNamedWindow("output image", CV_WINDOW_AUTOSIZE);cvShowImage("output image", outImage);cvWaitKey(0);cvDestroyWindow("output image");cvReleaseMat(&A);

}

// release the images a1nd matrixcvReleaseImage(&inImage);cvReleaseImage(&colorImage);cvReleaseMat(&H);cvReleaseMat(&K);

return 0;}

//// function : MarkCornerPoints// usage : MarkCornerPoints(image, cornerMap);// -------------------------------------------------// This function draws marks in input image corresponding// to the corner map.//void MarkCornerPoints(IplImage *image, Points *cornerMap) {

int i, j, k;

char label[10];CvFont font;double hScale = .5;


double vScale = .5;int lineWidth = 2;

cvInitFont(&font, CV_FONT_HERSHEY_SIMPLEX|CV_FONT_ITALIC,hScale, vScale, 0, lineWidth);

for(k = 0; k < cornerMap->len; k++){i = cornerMap->ptI[k];j = cornerMap->ptJ[k];cvCircle(image, cvPoint(j, i), 3, cvScalar(0, 255, 0), 3);sprintf(label, "%d", k);cvPutText (image, label, cvPoint(j, i), &font, cvScalar(0, 0, 255));

}}

3.2 Program Header File : corner.h

//
// file : corner.h
//-----------------------------
// this file contains functions for corner
// detection algorithms
//

/////////////////////////////////////////////
// indexed corner detection parameters
/////////////////////////////////////////////
// Canny edge detection parameters
#define APERTURE_SIZE 3
#define LOW_THRESHOLD 50
#define HIGH_THRESHOLD 150

#define LINE_DEVIATION_THRESHOLD 50
#define SEARCH_RANGE 32  // search range for true corners

#define NUM_H_LINES 10
#define NUM_V_LINES 8

void IndexedCornerDetect(IplImage *inImage, Points *cornerMap);

void Get2LinePoints(CvSeq *lines, PairPoints *hLinePoints, PairPoints*vLinePoints,

int imageWidth, int imageHeight);


void SortingLines(PairPoints *linePoints);

void GroupingLines(PairPoints *linePoints, PairPoints *newLinePoints, intthreshold);

void EstimateIndexedCorner(PairPoints *hLinePoints, PairPoints *vLinePoints,Points *cornerMap);

void RefineIndexedCorner(IplImage *inImage, Points *tempCornerMap,Points *cornerMap);

/////////////////////////////////////////////
// Harris corner detection parameters
/////////////////////////////////////////////
// parameters for window sum (sum fxfx, ...)
#define WINDOW_OPTION CV_BLUR
//#define WINDOW_OPTION CV_GAUSSIAN
#define PARAM1 3
#define PARAM2 3
#define RANGE_NEIGHBOR 6
#define HARRIS_THRESHOLD 500

void HarrisCornerDectect(IplImage *inImage, IplImage *cornerMap,float threshold);

void GetffImage(IplImage *inImage, IplImage *outImage,CvMat *kernel1st, CvMat *kernel2nd, CvPoint anchor,int windowOption, int param1, int param2);

void FindLocalMaxPoint(IplImage *inImage, IplImage *map, int range);

3.3 Program Function File : corner.cpp

//
// file : corner.cpp
//-----------------------------
// this file contains functions for corner
// detection
//

#include <stdlib.h>
#include <stdio.h>
#include <math.h>
#include "cv.h"
#include "cxcore.h"
#include "highgui.h"
#include "utility.h"
#include "corner.h"

void HarrisCornerDectect(IplImage *inImage,IplImage *cornerImage, float threshold)

{IplImage *tempImage = 0;IplImage *fxfxImage = 0, *fxfyImage = 0, *fyfyImage = 0;IplImage *eigenvalueImage = 0;float *eigenData = 0;float *fxfxData = 0, *fxfyData = 0, *fyfyData = 0;float fxfx, fxfy, fyfy;float slambda, lambda1, lambda2, lambda3;int i, j, k;int height, width, channels, step;int eigStep;CvPoint anchor;

float dx[9] = {-1, 0, 1, -1, 0, 1, -1, 0, 1}; // derivative masksfloat dy[9] = {1, 1, 1, 0, 0, 0, -1, -1, -1}; // derivative masksfloat sMask[9] = {1, 1, 1, 1, 1, 1, 1, 1, 1}; // window sum mask

CvMat *Dx = cvCreateMat(3, 3, CV_32FC1); // derivative maskCvMat *Dy = cvCreateMat(3, 3, CV_32FC1); // derivative maskCvMat *window = cvCreateMat(3, 3, CV_32FC1); // window sum maskCvMat *G = cvCreateMat(2, 2, CV_32FC1); // Harris MatrixCvMat *q = cvCreateMat(2, 2, CV_32FC1); // eigen vector of GCvMat *lambda = cvCreateMat(2, 1, CV_32FC1); // eigenvalue of G

// assign predefined values to matricesArray2CvMat(dx, Dx, 3, 3);Array2CvMat(dy, Dy, 3, 3);Array2CvMat(sMask, window, 3, 3);

height = inImage->height;width = inImage->width;channels = inImage->nChannels;step = inImage->widthStep;eigStep = cornerImage->widthStep;

// create the processing imagetempImage = cvCreateImage(cvSize(width, height), IPL_DEPTH_8U, channels);fxfxImage = cvCreateImage(cvSize(width, height), IPL_DEPTH_32F, channels);fxfyImage = cvCreateImage(cvSize(width, height), IPL_DEPTH_32F, channels);fyfyImage = cvCreateImage(cvSize(width, height), IPL_DEPTH_32F, channels);eigenvalueImage = cvCreateImage(cvSize(width, height), IPL_DEPTH_32F, 1);


fxfxData = (float *)fxfxImage->imageData;fxfyData = (float *)fxfyImage->imageData;fyfyData = (float *)fyfyImage->imageData;eigenData = (float *)eigenvalueImage->imageData;

// LPF filtering to reduce noisecvSmooth(inImage, tempImage, CV_GAUSSIAN, 3, 0, 0);

// get fxfx, fxfy, fyfy imagesanchor = cvPoint(0, 0);GetffImage(tempImage, fxfxImage, Dx, Dx, anchor, WINDOW_OPTION, PARAM1, PARAM2);GetffImage(tempImage, fyfyImage, Dy, Dy, anchor, WINDOW_OPTION, PARAM1, PARAM2);GetffImage(tempImage, fxfyImage, Dx, Dy, anchor, WINDOW_OPTION, PARAM1, PARAM2);cvReleaseImage(&tempImage);// every fxfx, fxfy, and fyfy image pixel is summed by window// to construct the matrix// [window sum (fxfx) window sum (fxfy)]// [window sum (fxfy) window sum (fyfy)]

// find small eigenvalues for each pixelfor(i = 0; i < height; i++){

for(j = 0; j < width; j++){for(k = 0; k < channels; k++){

fxfx = fxfxData[i*step + j*channels + k];fxfy = fxfyData[i*step + j*channels + k];fyfy = fyfyData[i*step + j*channels + k];

// set matrix G = [window sum (fxfx) window sum (fxfy)]// [window sum (fxfy) window sum (fyfy)]cvmSet(G, 0, 0, fxfx);cvmSet(G, 0, 1, fxfy);cvmSet(G, 1, 0, fxfy);cvmSet(G, 1, 1, fyfy);

// eigen value decomapStepmpositioncvEigenVV(G, q, lambda);// lambda = eigenvalues of G (descending order)// q = corresponding orthogonal eigenvectors (rows)if(channels == 3){

if(k == 0)lambda1 = cvmGet(lambda, 1, 0); // lambda for B

else if(k == 1)lambda2 = cvmGet(lambda, 1, 0); // lambda for G

elselambda3 = cvmGet(lambda, 1, 0); // lambda for R

}else{ // channels == 1lambda1 = cvmGet(lambda, 1, 0);


}}

if(channels == 3){slambda = pow(pow(lambda1, 2) + pow(lambda1, 2) + pow(lambda1, 2), .5);

}else{slambda = lambda1;

}// store the small eigen values that are normalized by thresholdeigenData[i*eigStep + j] = slambda / threshold;

}}

// fine local maximum corner pointsFindLocalMaxPoint(eigenvalueImage, cornerImage, RANGE_NEIGHBOR);

// release imagescvReleaseImage(&tempImage);cvReleaseImage(&fxfxImage);cvReleaseImage(&fxfyImage);cvReleaseImage(&fyfyImage);cvReleaseImage(&eigenvalueImage);// release matricescvReleaseMat(&Dx); cvReleaseMat(&Dy);cvReleaseMat(&window);cvReleaseMat(&G); cvReleaseMat(&q); cvReleaseMat(&lambda);

}

void GetffImage(IplImage *inImage, IplImage *outImage,CvMat *kernel1st, CvMat *kernel2nd, CvPoint anchor,int windowOption, int param1, int param2)

{IplImage *f1Image = 0, *f2Image = 0, *tempImage = 0;float *f1Data = 0, *f2Data = 0;float *outData;int height, width, step, channels;int i, j, k;

// create the output imageheight = inImage->height;width = inImage->width;channels = inImage->nChannels;step = inImage->widthStep;

f1Image = cvCreateImage(cvSize(width, height), IPL_DEPTH_32F, channels);


f2Image = cvCreateImage(cvSize(width, height), IPL_DEPTH_32F, channels);tempImage = cvCreateImage(cvSize(width, height), IPL_DEPTH_32F, channels);

f1Data = (float *)f1Image->imageData;f2Data = (float *)f2Image->imageData;outData = (float *)outImage->imageData;

// copy input image to float precision imageCvImageCopyUchar2Float(inImage, tempImage);

cvFilter2D(tempImage, f1Image, kernel1st, anchor);cvFilter2D(tempImage, f2Image, kernel2nd, anchor);

for(i = 0; i < height; i++){for(j = 0; j < width; j++){

for(k = 0; k < channels; k++){outData[i*step + j*channels + k]= f1Data[i*step + j*channels + k]* f2Data[i*step + j*channels + k];

}}

}

// window sum of fxfx, fxfy, or fyfycvCopyImage (outImage, tempImage);cvSmooth(tempImage, outImage, windowOption, param1, param2);

cvReleaseImage(&tempImage);cvReleaseImage(&f1Image);cvReleaseImage(&f2Image);

}

void FindLocalMaxPoint(IplImage *inImage, IplImage *map, int range) {int r, sum, numOfNeighbor;int i, j, ii, jj, posI, posJ;float current;float *inData = 0;uchar *mapData = 0;

int height = inImage->height;int width = inImage->width;int step = map->widthStep;

r = range / 2;numOfNeighbor = (2*r + 1) * (2*r + 1);inData = (float *)inImage->imageData;


mapData = (uchar *)map->imageData;

for(i = 0; i < height; i++){for(j = 0; j < width; j++){// mark the corner on image// write the corner positioncurrent = inData[i*step + j];

if(current < 1){ // lambda < thresholdmapData[i*step + j] = false;

}else{

// check neighborssum = 0;for(ii = -r; ii <= r; ii++){

for(jj = -r; jj <= r; jj++){posI = min(max((i+ii), 0), height - 1);posJ = min(max((j+jj), 0), width - 1);sum += (current >= inData[posI*step + posJ]);

}}

// if current pixel is maximum in its neighbors// sum == numOfNeighborif(sum == numOfNeighbor)

mapData[i*step + j] = true;else

mapData[i*step + j] = false;}

}}

}

void IndexedCornerDetect(IplImage *inImage, Points *cornerMap) {int largeNum = 50;int height, width;int i, j, k;int thresholdV = LINE_DEVIATION_THRESHOLD;int thresholdH = LINE_DEVIATION_THRESHOLD;IplImage *edgeImage = 0, *colorImage = 0;CvMemStorage *storage = cvCreateMemStorage(0);CvSeq *lines = 0;PairPoints hLinePoints, vLinePoints;PairPoints newHLinePoints, newVLinePoints;Points tempCornerMap;


height = inImage->height;width = inImage->width;

edgeImage = cvCreateImage(cvSize(width, height), IPL_DEPTH_8U, 1);colorImage = cvCreateImage(cvSize(width, height), IPL_DEPTH_8U, 3);

// Canny edge detectioncvCanny(inImage, edgeImage, LOW_THRESHOLD, HIGH_THRESHOLD, APERTURE_SIZE);k = 0;while(true){

// Hough Transformlines = cvHoughLines2(edgeImage, storage, CV_HOUGH_STANDARD,

1, CV_PI/180, 40, 0, 0 );

// get 2 points for every lineGet2LinePoints(lines, &hLinePoints, &vLinePoints, width, height);

// sort lines in ascendant orderSortingLines(&hLinePoints);SortingLines(&vLinePoints);

// group lines and represent a line group with mean value of linesGroupingLines(&hLinePoints, &newHLinePoints, thresholdH);GroupingLines(&vLinePoints, &newVLinePoints, thresholdV);

if(newHLinePoints.len == NUM_H_LINES&& newVLinePoints.len == NUM_V_LINES) break;

if(newHLinePoints.len != NUM_H_LINES)thresholdH += newHLinePoints.len - NUM_H_LINES;

if(newVLinePoints.len != NUM_V_LINES)thresholdV += newVLinePoints.len - NUM_V_LINES;

if(k == largeNum){printf("error in line fitting : image is not suitable!!\n");exit(0);

}k++;

}

// estimate indexed corners with line crossingsEstimateIndexedCorner(&newHLinePoints, &newVLinePoints, &tempCornerMap);// refine indexed corners(corner map) with true corners// true corners are obtained using Harris corner detectionRefineIndexedCorner(inImage, &tempCornerMap, cornerMap);


/////////////////////////////////////////////////////////////////////////// draw and store imagescvCvtColor(inImage, colorImage, CV_GRAY2BGR);for(i = 0; i < newHLinePoints.len; i++){

CvPoint pt1, pt2;pt1.x = newHLinePoints.pt1J[i];pt1.y = newHLinePoints.pt1I[i];pt2.x = newHLinePoints.pt2J[i];pt2.y = newHLinePoints.pt2I[i];cvLine(colorImage, pt1, pt2, CV_RGB(0,0,255), 2);

}for(i = 0; i < newVLinePoints.len; i++){

CvPoint pt1, pt2;pt1.x = newVLinePoints.pt1J[i];pt1.y = newVLinePoints.pt1I[i];pt2.x = newVLinePoints.pt2J[i];pt2.y = newVLinePoints.pt2I[i];cvLine(colorImage, pt1, pt2, CV_RGB(0,0,255), 2);

}

WriteImage(colorImage, "linefitting.jpg");WriteImage(edgeImage, "edge.jpg");

for(k = 0; k < tempCornerMap.len; k++){i = tempCornerMap.ptI[k];j = tempCornerMap.ptJ[k];cvCircle(colorImage, cvPoint(j, i), 3, cvScalar(0, 255, 0), 3);

}WriteImage(colorImage, "estimatedcorner.jpg");

cvNamedWindow("output image", CV_WINDOW_AUTOSIZE);cvShowImage("output image", colorImage);cvWaitKey(0);cvDestroyWindow("output image");/////////////////////////////////////////////////////////////////////////

cvReleaseMemStorage(&storage);cvReleaseImage(&edgeImage);cvReleaseImage(&colorImage);

}

//// function : Get2LinePoints// usage : Get2LinePoints(lines, hLinePoints, vLinePoints, width, height);// ---------------------------------------------------------------------------


// this function calculates 2 points corresponding to every line.// also stores the points seperately in vertical line points or horizontal lines.// it is assumed that vertical lines pass through 2 horizontal image borders and// horizontal lines pass through 2 vertical image borders.//void Get2LinePoints(CvSeq *lines, PairPoints *hLinePoints,

PairPoints *vLinePoints, int imageWidth, int imageHeight) {int i;hLinePoints->len = 0;vLinePoints->len = 0;

// assign 2 points for every line// this block is from the course website linkfor(i = 0; i < lines->total; i++ ){

float* line = (float*)cvGetSeqElem(lines,i);float rho = line[0];float theta = line[1];CvPoint pt1, pt2;double a = cos(theta), b = sin(theta), c = tan(theta);if( fabs(b) < 0.001 ){

pt1.x = pt2.x = cvRound(rho);pt1.y = 0;pt2.y = imageHeight;

}else if( fabs(a) < 0.001 ){pt1.y = pt2.y = cvRound(rho);pt1.x = 0;pt2.x = imageWidth;

}else{pt1.x = 0;pt1.y = cvRound(rho/b);

if(pt1.y < 0){pt1.x = cvRound(rho/a);pt1.y = 0;

}if(pt1.y > imageHeight){

pt1.x = cvRound((pt1.y - imageHeight)*c);pt1.y = imageHeight;

}

pt2.x = imageWidth;pt2.y = cvRound(rho/b - imageWidth/c);if(pt2.y < 0){

pt2.x = cvRound(rho/a);pt2.y = 0;

}if(pt2.y > imageHeight){


pt2.x = cvRound(-1.0 * ((imageHeight - rho/b) * c));pt2.y = imageHeight;

}

}

// classify vertical lines and horizontal lines// assume that vertical lines pass through 2 horizontal image borders// horizontal lines pass through 2 vertical image bordersif(pt1.x == 0 && pt2.x == imageWidth && pt1.y != pt2.y){

hLinePoints->pt1J[hLinePoints->len] = pt1.x;hLinePoints->pt1I[hLinePoints->len] = pt1.y;hLinePoints->pt2J[hLinePoints->len] = pt2.x;hLinePoints->pt2I[hLinePoints->len] = pt2.y;hLinePoints->len++;

}else if(pt2.x == 0 && pt1.x == imageWidth && pt1.y != pt2.y){hLinePoints->pt1J[hLinePoints->len] = pt2.x;hLinePoints->pt1I[hLinePoints->len] = pt2.y;hLinePoints->pt2J[hLinePoints->len] = pt1.x;hLinePoints->pt2I[hLinePoints->len] = pt1.y;hLinePoints->len++;

}else if(pt1.y == 0 && pt2.y == imageHeight && pt1.x != pt2.x){vLinePoints->pt1J[vLinePoints->len] = pt1.x;vLinePoints->pt1I[vLinePoints->len] = pt1.y;vLinePoints->pt2J[vLinePoints->len] = pt2.x;vLinePoints->pt2I[vLinePoints->len] = pt2.y;vLinePoints->len++;

}else if(pt2.y == 0 && pt1.y == imageHeight && pt1.x != pt2.x){vLinePoints->pt1J[vLinePoints->len] = pt2.x;vLinePoints->pt1I[vLinePoints->len] = pt2.y;vLinePoints->pt2J[vLinePoints->len] = pt1.x;vLinePoints->pt2I[vLinePoints->len] = pt1.y;vLinePoints->len++;

}}

}

//// function : SortingLines;// usage : SortingLines(linePoints);// -------------------------------------------------// this function sorts lines in ascendant order//void SortingLines(PairPoints *linePoints) {

int mean1, mean2;int temp1, temp2, temp3, temp4;


int i, j;int len = linePoints->len;

for(i = len - 1; i > 0; i--){for(j = 0; j < i; j++){

mean1 = (linePoints->pt1I[j+1] + linePoints->pt2I[j+1]+ linePoints->pt1J[j+1] + linePoints->pt2J[j+1]) / 4;

mean2 = (linePoints->pt1I[j] + linePoints->pt2I[j]+ linePoints->pt1J[j] + linePoints->pt2J[j]) / 4;

if(mean2 > mean1){// swaptemp1 = linePoints->pt1I[j+1];temp2 = linePoints->pt1J[j+1];temp3 = linePoints->pt2I[j+1];temp4 = linePoints->pt2J[j+1];linePoints->pt1I[j+1] = linePoints->pt1I[j];linePoints->pt1J[j+1] = linePoints->pt1J[j];linePoints->pt2I[j+1] = linePoints->pt2I[j];linePoints->pt2J[j+1] = linePoints->pt2J[j];linePoints->pt1I[j] = temp1;linePoints->pt1J[j] = temp2;linePoints->pt2I[j] = temp3;linePoints->pt2J[j] = temp4;

}}

}}

//// function : GroupingLines// usage : GroupingLines(linePoints, newLinePoints);// ----------------------------------------------------// this function calculates new line point pairs.// input line point pairs should be in ascendant order.//void GroupingLines(PairPoints *linePoints, PairPoints *newLinePoints, intthreshold) {

int i = 0, j;

int len = newLinePoints->len = 0;

while(true){int tempPt1I = linePoints->pt1I[i];int tempPt1J = linePoints->pt1J[i];int tempPt2I = linePoints->pt2I[i];int tempPt2J = linePoints->pt2J[i];int numOfGroup = 1;


j = i+1;if(j >= linePoints->len) break;

while(true){if(j >= linePoints->len) break;float diff1 = fabs(linePoints->pt1I[i] - linePoints->pt1I[j]);float diff2 = fabs(linePoints->pt1J[i] - linePoints->pt1J[j]);float diff3 = fabs(linePoints->pt2I[i] - linePoints->pt2I[j]);float diff4 = fabs(linePoints->pt2J[i] - linePoints->pt2J[j]);

if(diff1 + diff2 + diff3 + diff4 < threshold){tempPt1I += linePoints->pt1I[j];tempPt1J += linePoints->pt1J[j];tempPt2I += linePoints->pt2I[j];tempPt2J += linePoints->pt2J[j];numOfGroup++;j++;

}else{break;

}}newLinePoints->pt1I[len] = cvRound(tempPt1I / numOfGroup);newLinePoints->pt1J[len] = cvRound(tempPt1J / numOfGroup);newLinePoints->pt2I[len] = cvRound(tempPt2I / numOfGroup);newLinePoints->pt2J[len] = cvRound(tempPt2J / numOfGroup);

i = j;len++;

}newLinePoints->len = len;

}

//// function : EstimateIndexedCorner// usage : EstimateIndexedCorner(hLinePoints, vLinePoints, cornerMap);//------------------------------------------------------------------------// this function estimates indexed corners with crossing of ordered// horizontal lines and ordered vertical lines.// the cornerMap array is stroed in order.//void EstimateIndexedCorner(PairPoints *hLinePoints, PairPoints *vLinePoints,

Points *cornerMap){

int i, j;int k = 0;


for(i = 0; i < hLinePoints->len; i++){float hPt1[3] = {hLinePoints->pt1J[i], hLinePoints->pt1I[i], 1};float hPt2[3] = {hLinePoints->pt2J[i], hLinePoints->pt2I[i], 1};float hLine[3] = {0, 0, 0}; // initialize// get a horiziontal line that passes through hPt1 and hPt2PassLineOf2Points(hPt1, hPt2, hLine);

for(j = 0; j < vLinePoints->len; j++){float vPt1[3] = {vLinePoints->pt1J[j], vLinePoints->pt1I[j], 1};float vPt2[3] = {vLinePoints->pt2J[j], vLinePoints->pt2I[j], 1};float vLine[3] = {0, 0, 0}; // initialize// get a vertical line that passes through vPt1 and vPt2PassLineOf2Points(vPt1, vPt2, vLine);

float pt[3] = {0, 0, 0}; // initialize// get a cross point of a horizontal and a vertical linesCrossPointOf2Lines(vLine, hLine, pt);

// store the cross point in the corner mapcornerMap->ptI[k] = cvRound(pt[1]/pt[2]);cornerMap->ptJ[k] = cvRound(pt[0]/pt[2]);k++;

}}cornerMap->len = k;

}

//// function : RefineIndexedCorner// usage: RefineIndexedCorner(inImage, tempCornerMap, cornerMap);//---------------------------------------------------------------------// this function refines indexed corners(corner map) with true corners.// true corners are obtained using Harris corner detection.//void RefineIndexedCorner(IplImage *inImage, Points *tempCornerMap,

Points *cornerMap){

int i, j, k;int initialDistance = 10000;float dist, minDist;int blockCenterI, blockCenterJ;int r = SEARCH_RANGE / 2;int height = inImage->height;int width = inImage->width;


IplImage *cornerImage = cvCreateImage(cvSize(width, height), IPL_DEPTH_8U, 1);uchar *cornerImageData = 0;int step = cornerImage->widthStep;cornerImageData = (uchar *)cornerImage->imageData;

HarrisCornerDectect(inImage, cornerImage, HARRIS_THRESHOLD);

for(k = 0; k < tempCornerMap->len; k++){blockCenterI = tempCornerMap->ptI[k];blockCenterJ = tempCornerMap->ptJ[k];

// find true corner that has the minimum distance between// the estimated corner and the true cornerminDist = initialDistance;for(i = blockCenterI - r; i <= blockCenterI + r; i++){

for(j = blockCenterJ - r; j <= blockCenterJ + r; j++){i = max(min(height - 1, i), 0);j = max(min(width - 1, j), 0);if(cornerImageData[i*step + j] == true){

dist = pow(blockCenterI - i, 2) + pow(blockCenterJ - j, 2);if(dist < minDist){

cornerMap->ptI[k] = i;cornerMap->ptJ[k] = j;minDist = dist;

}}

}}if(minDist == initialDistance){

printf("error in refine indexed corner: need larger search area\n");exit(0);

}}

// new number of corners should be the same as// number of initially estimated cornerscornerMap->len = tempCornerMap->len;

}

3.4 Program Header File : homography.h

//
// file : homography.h
//------------------------------------------------
// this file contains functions for homography
// estimation
//

#define T_SQUARE 36 // 6 * sigma^2 ; as sigma^2 = 6

void HomograhyEstimation(PairPoints *corspMap, CvMat *H);

void RansacHomograhyEstimation(PairPoints *corspMap, PairPoints *inlierMap,CvMat *H);

void ComputeHomography(float domainPosiitons[][2], float rangePosiitons[][2],int numOfCorresp, CvMat *H);

void DataNormalization(int numOfPositions, float positions[][2], CvMat *T);

void CalculateDistance(CvMat *H, PairPoints *corspMap, PairPoints *inlierMap);

bool IsGoodSample(float points[][2], int numOfPoints);

bool IsColinear(CvMat *A, CvMat *B, CvMat *C);

3.5 Program Function File : homography.cpp

//
// file : homography.cpp
//------------------------------------------------
// this file contains functions for homography
// estimation
//

#include <stdlib.h>
#include <stdio.h>
#include <math.h>
#include "cv.h"
#include "cxcore.h"
#include "highgui.h"
#include "utility.h"
#include "homography.h"

void HomograhyEstimation(PairPoints *corspMap, CvMat *H) {int i;int numOfCorresp = corspMap->len;;float domainPositions[numOfCorresp][2], rangePositions[numOfCorresp][2];

for(i = 0; i < numOfCorresp; i++){// I -> y, J -> xdomainPositions[i][1] = corspMap->pt1I[i];


domainPositions[i][0] = corspMap->pt1J[i];rangePositions[i][1] = corspMap->pt2I[i];rangePositions[i][0] = corspMap->pt2J[i];

}// compute the homographyComputeHomography(domainPositions, rangePositions, numOfCorresp, H);

}

void RansacHomograhyEstimation(PairPoints *corspMap, PairPoints *inlierMap,CvMat *H) {

int numOfCorresp = 4;float domainPositions[numOfCorresp][2], rangePositions[numOfCorresp][2];int i, sampleCount = 0;int pos;float p, e, outlierProb;int numOfInliers, totalNumOfPoints, maxNumOfInliers = 1;PairPoints tempInlierMap;CvMat *Htmp;Htmp = cvCreateMat(3, 3, CV_32FC1);

p = 0.99;e = 0.5;float N = 1000;while(N > sampleCount){

// pick 4 corresponding pointsfor(i = 0; i < numOfCorresp; i++){

pos = rand() % corspMap->len; // select random positions// I -> y, J -> xdomainPositions[i][1] = corspMap->pt1I[pos];domainPositions[i][0] = corspMap->pt1J[pos];rangePositions[i][1] = corspMap->pt2I[pos];rangePositions[i][0] = corspMap->pt2J[pos];

}

// check whether samples aree good or not.4// if the seleted samples are good, then do homography estimation// else reselect samples.if(IsGoodSample(domainPositions, numOfCorresp) &&

IsGoodSample(rangePositions, numOfCorresp)){// compute the homographyComputeHomography(domainPositions, rangePositions, numOfCorresp, Htmp);

// calculate the distance for each correspondences// compute the number of inliersInitializePairPoints(&tempInlierMap);CalculateDistance(Htmp, corspMap, &tempInlierMap);


// choose H with the largest number of inliearsnumOfInliers = tempInlierMap.len;if(numOfInliers > maxNumOfInliers){

maxNumOfInliers = numOfInliers;CopyPairPoints(inlierMap, &tempInlierMap);cvCopy(Htmp, H, 0);

}

// adaptive algorithm for determining the number of RANSAC samples// textbook algorithm 4.6totalNumOfPoints = corspMap->len;outlierProb = 1 - ((float)maxNumOfInliers / (float)totalNumOfPoints);e = (e < outlierProb ? e : outlierProb);N = log(1 - p) / log(1 - pow((1 - e), numOfCorresp));sampleCount += 1;

}}

}

//// function : ComputeHomography// usage : ComputeHomography(domainPositions, rangePositions,// numOfCorresp, H);// -----------------------------------------------------------// This function calculate the homography, H, using the set of// given pairs of correspodences.// Before computing the homography, data normalization will be// performed. Then, it solve Ah = 0 using SVD to get H.//void ComputeHomography(float domainPositions[][2], float rangePositions[][2],

int numOfCorresp, CvMat *H){

int column = 9, row;float x1, y1, w1, x2, y2, w2;int i, ii, jj;float h[9];

if(numOfCorresp == 4){row = 3; // eq 4.1 : make a thin matrix to solve SVD in opencv

}else if(numOfCorresp > 4){row = 2; //eq 4.3

}else{printf("Need more correpondence points! for computing H.\n");exit(0);

}


float a[row * column];

CvMat *A, *Htmp;CvMat *T1, *T2;CvMat *D, *V, *U;CvMat *invT2, *temp;

// normalizationT1 = cvCreateMat(3, 3, CV_32FC1);T2 = cvCreateMat(3, 3, CV_32FC1);DataNormalization(numOfCorresp, domainPositions, T1);DataNormalization(numOfCorresp, rangePositions, T2);

// set AA = cvCreateMat(numOfCorresp * row, column, CV_32FC1);

for(i = 0; i < numOfCorresp; i++){x1 = domainPositions[i][0];y1 = domainPositions[i][1];w1 = 1;

x2 = rangePositions[i][0];y2 = rangePositions[i][1];w2 = 1;

// set Ai// [0, 0, 0, -w2*x1, -w2*y1, -w2*w1, y2*x1, y2*y1, y2*w1]// [w2*x1, w2*y1, w2*w1, 0, 0, 0, -x2*x1, -x2*y1, -x2*w1]a[0] = 0; a[1] = 0; a[2] = 0;a[3] = -w2*x1; a[4] = -w2*y1; a[5] = -w2*w1;a[6] = y2*x1; a[7] = y2*y1; a[8] = y2*w1;

a[9] = w2*x1; a[10] = w2*y1; a[11] = w2*w1;a[12] = 0; a[13] = 0; a[14] = 0;a[15] = -x2*x1; a[16] = -x2*y1; a[17] = -x2*w1;

if(row == 3){a[18] = -y2*x1; a[19] = -y2*y1; a[20] = -y2*w1;a[21] = x2*x1; a[22] = x2*y1; a[23] = x2*w1;a[24] = 0; a[25] = 0; a[26] = 0;

}

// assemble Ai into a matrix Afor(ii = 0; ii < row; ii++){

for(jj = 0; jj < column; jj++){cvmSet(A, ii + i*row, jj, a[ii*column + jj]);

}


}}

// calculate HHtmp = cvCreateMat(3, 3, CV_32FC1);D = cvCreateMat(numOfCorresp*row, column, CV_32FC1);U = cvCreateMat(numOfCorresp*row, numOfCorresp*row, CV_32FC1);V = cvCreateMat(column, column, CV_32FC1);cvSVD(A, D, U, V, CV_SVD_U_T|CV_SVD_V_T); // : opencv setting// A = U^T D V in openCV : A = U’ D’ V’^T in text

// take last column of V’ : last row of Vfor(i = 0; i < column; i++){

h[i] = cvmGet(V, column-1, i);}

Array2CvMat(h, Htmp, 3, 3);

// denormalization : H = invT2 * Htmp * T1 <- Htmp = T2 * H * invT1invT2 = cvCreateMat(3, 3, CV_32FC1);temp = cvCreateMat(3, 3, CV_32FC1);cvInvert(T2, invT2);cvMatMul(invT2, Htmp, temp);cvMatMul(temp,T1, H);

// release matricescvReleaseMat(&T1); cvReleaseMat(&T2);cvReleaseMat(&A); cvReleaseMat(&Htmp);cvReleaseMat(&D); cvReleaseMat(&U); cvReleaseMat(&V);cvReleaseMat(&T1); cvReleaseMat(&temp);

}

//// function : DataNormalization// usage : DataNormalization(numOfx, x, T);// ------------------------------------------------------// Thsi function normalizes x and returns the similarity// transform, T.// The centroid of x will be transformed into (0,0).// The average distance of normalized x will be sqrt(2).//void DataNormalization(int numOfPositions, float positions[][2], CvMat *T) {

int i;float sumI = 0, sumJ = 0, meanI = 0, meanJ = 0;float squareDist = 0, sumDist = 0, meanDist = 0;float scale = 0;


float x, y, xx, yy, ww;

// calculate the centroidfor(i = 0; i < numOfPositions; i++){

sumI += positions[i][0];sumJ += positions[i][1];

}meanI = sumI / numOfPositions;meanJ = sumJ / numOfPositions;

// calculate the mean distancefor(i = 0; i < numOfPositions; i++){

squareDist = pow(positions[i][0] - meanI, 2)+ pow(positions[i][1] - meanJ, 2);

sumDist += pow(squareDist, 0.5);}meanDist = sumDist / numOfPositions;

// set the similarity transformscale = pow(1, 0.5) / meanDist;float t[9] = {scale, 0, -scale * meanI,

0, scale, -scale * meanJ,0, 0, 1};

Array2CvMat(t, T, 3, 3);

// data normalizationfor(i = 0; i < numOfPositions; i++){

x = positions[i][0];y = positions[i][1];

xx = t[0] * x + t[1] * y + t[2];yy = t[3] * x + t[4] * y + t[5];ww = t[6] * x + t[7] * y + t[8];

xx = xx / ww;yy = yy / ww;

positions[i][0] = xx;positions[i][1] = yy;

}}

//// function : CalculateDistance// usage : CalculateDistance(H, corspMap, inlierMap);// ---------------------------------------------------


// This function calculates distance of data using// symmetric transfer error. Then, compute inliears// that consist with H.//void CalculateDistance(CvMat *H, PairPoints *corspMap, PairPoints *inlierMap) {

int i;int x1, y1, x2, y2;float x1Trans, y1Trans, w1Trans, x2Trans, y2Trans, w2Trans;float dist2x1AndInvHx2, dist2x2AndHx1, dist2Trans;float tSquare = T_SQUARE;CvMat *invH = cvCreateMat(3, 3, CV_32FC1);

cvInvert(H, invH);

// use d^2_transfer as distance measurefor(i = 0; i < corspMap->len; i++){

// I -> y, J -> xx1 = corspMap->pt1J[i];y1 = corspMap->pt1I[i];x2 = corspMap->pt2J[i];y2 = corspMap->pt2I[i];

// calculate x_trans = H * xx2Trans = cvmGet(H, 0, 0) * x1 + cvmGet(H, 0, 1) * y1

+ cvmGet(H, 0, 2);y2Trans = cvmGet(H, 1, 0) * x1 + cvmGet(H, 1, 1) * y1

+ cvmGet(H, 1, 2);w2Trans = cvmGet(H, 2, 0) * x1 + cvmGet(H, 2, 1) * y1

+ cvmGet(H, 2, 2);x2Trans = x2Trans / w2Trans;y2Trans = y2Trans / w2Trans;

// calculate x’_trans = H^(-1) * x’x1Trans = cvmGet(invH, 0, 0) * x2 + cvmGet(invH, 0, 1) * y2

+ cvmGet(invH, 0, 2);y1Trans = cvmGet(invH, 1, 0) * x2 + cvmGet(invH, 1, 1) * y2

+ cvmGet(invH, 1, 2);w1Trans = cvmGet(invH, 2, 0) * x2 + cvmGet(invH, 2, 1) * y2

+ cvmGet(invH, 2, 2);x1Trans = x1Trans / w1Trans;y1Trans = y1Trans / w1Trans;

// calculate the square distance (symmetric transfer error)dist2x1AndInvHx2 = pow(x1 - x1Trans, 2) + pow(y1 - y1Trans, 2);dist2x2AndHx1 = pow(x2 - x2Trans, 2) + pow(y2 - y2Trans, 2);dist2Trans = dist2x1AndInvHx2 + dist2x2AndHx1;


if(dist2Trans < tSquare){UpdatePairPoints(inlierMap, y1, x1, y2, x2);

}}

// release matricescvReleaseMat(&invH);

}

//// function : IsGoodSample// usage : r = IsGoodSample(points, numOfPoints)// -------------------------------------------------// This function checks colinearity of all given points.//bool IsGoodSample(float points[][2], int numOfPoints) {

bool r;int i, j, k;

CvMat *A = cvCreateMat(3, 1, CV_32FC1);CvMat *B = cvCreateMat(3, 1, CV_32FC1);CvMat *C = cvCreateMat(3, 1, CV_32FC1);

i = 0;j = i + 1;k = j + 1;r = false;// check colinearity recursivelywhile(true){

// set point vectorscvmSet(A, 0, 0, points[i][0]);cvmSet(A, 1, 0, points[i][1]);cvmSet(A, 2, 0, 1);cvmSet(B, 0, 0, points[j][0]);cvmSet(B, 1, 0, points[j][1]);cvmSet(B, 2, 0, 1);cvmSet(C, 0, 0, points[k][0]);cvmSet(C, 1, 0, points[k][1]);cvmSet(C, 2, 0, 1);

// check linearityr = IsColinear(A, B, C) || r;

// update point indexif(k < numOfPoints - 1){

k += 1;}else{


if(j < numOfPoints - 2){j += 1;k = j + 1;

}else{if(i < numOfPoints - 3){

i += 1;j = i + 1;k = j + 1;

}else{break;

}}

}}

return(!r);}

//// function : IsColinear// usage : r = IsColinear(A, B, C);// --------------------------------------// This function checks the colinearity of// the given 3 points A, B, and C.// If these are colinear, it returns false.//bool IsColinear(CvMat *A, CvMat *B, CvMat *C) {

float d;CvMat *lineAB;

lineAB = cvCreateMat(3, 1, CV_32FC1);

cvCrossProduct(A, B, lineAB);d = cvDotProduct(lineAB, C);

// release matricescvReleaseMat(&lineAB);

if((d < EPS) && (d > -EPS)){return(true);

}else{return(false);

}}


3.6 Program Header File : calibration.h

//
// file : calibration.h
//------------------------------------------------
// this file contains functions for calibration
//

#define MAX_NUM_IMAGE 6
#define MAX_POINT_SIZE 500
#define NUM_OF_ITERATION 1000
#define LAMBDA 0.01

typedef struct{
    int len;
    // intrinsic parameters
    float alphaX;
    float alphaY;
    float x0;
    float y0;
    float skew;
    // extrinsic parameters
    float r1[3][MAX_NUM_IMAGE];
    float r2[3][MAX_NUM_IMAGE];
    float r3[3][MAX_NUM_IMAGE];
    float t[3][MAX_NUM_IMAGE];
    // radial distortion
    float k1;
    float k2;
}CameraParam;

typedef struct{
    int len;
    int numPointsRow;
    int numPointsColumn;
    float H[3][3][MAX_NUM_IMAGE];
}HomographySet;

typedef struct{
    int pointLen;
    int imageLen;
    int ptmI[MAX_POINT_SIZE][MAX_NUM_IMAGE];
    int ptmJ[MAX_POINT_SIZE][MAX_NUM_IMAGE];
    int ptiI[MAX_POINT_SIZE][MAX_NUM_IMAGE];
    int ptiJ[MAX_POINT_SIZE][MAX_NUM_IMAGE];
}PairPointSet;


void CameraCalibration(HomographySet *h, CameraParam *cp, PairPointSet*imagePt);

void CalculateIntrinsicMatrix(HomographySet *h, CvMat *K, CameraParam *cp);

void CalculateExtrinsicParameters(HomographySet *h, CvMat *K, CameraParam *cp);

void vijSetting(float *v, int i, int j, CvMat *H);

void CalculateRadialDistotion(HomographySet *h, CvMat *K, CameraParam *cp,PairPointSet *imagePt);

void RefineCameraParameters(CameraParam *cp, PairPointSet *imagePt);

double CalculateError(CameraParam *cp, PairPointSet *imagePt, CvMat *d);

void Rodrigues2R(float wx, float wy, float wz, CvMat *R);

void WriteParameters(CameraParam *cp, char *name);

3.7 Program Function File : calibration.cpp

//
// file : calibration.cpp
//------------------------------------------------
// this file contains functions for calibration
//

#include <stdlib.h>
#include <stdio.h>
#include <math.h>
#include "cv.h"
#include "cxcore.h"
#include "highgui.h"
#include "utility.h"
#include "calibration.h"
#include "jacobianCalculation.h"

void CameraCalibration(HomographySet *h, CameraParam *cp, PairPointSet *imagePt){

CvMat *K = cvCreateMat(3, 3, CV_32FC1); // intrinsic matrix

// direct calibration method (algebraic minimization)CalculateIntrinsicMatrix(h, K, cp);


CalculateExtrinsicParameters(h, K, cp);CalculateRadialDistotion(h, K, cp, imagePt);WriteParameters(cp, "InitialCalibration.txt");

// refinementRefineCameraParameters(cp, imagePt);WriteParameters(cp, "RefinedCalibration.txt");

}

void CalculateIntrinsicMatrix(HomographySet *h, CvMat *K, CameraParam *cp) {int i, n, m;int numOfImage = h->len;int numColumnVi = 6, numRowVi = 2;CvMat *H = cvCreateMat(3, 3, CV_64FC1);CvMat *Vi = cvCreateMat(numRowVi, numColumnVi, CV_64FC1);CvMat *V = cvCreateMat(numRowVi*numOfImage, numColumnVi, CV_64FC1);

// set matrix Vfor(i = 0; i < numOfImage; i++){

for(n = 0; n < 3; n++){for(m = 0; m < 3; m++){

cvmSet(H, n, m, h->H[n][m][i]);}

}

float v12[numColumnVi], v11[numColumnVi], v22[numColumnVi];vijSetting(v12, 1, 2, H);vijSetting(v11, 1, 1, H);vijSetting(v22, 2, 2, H);

// Vi = [ v12 ]// [ v11 - v22 ]for(m = 0; m < numColumnVi; m++){

cvmSet(Vi, 0, m, v12[m]);cvmSet(Vi, 1, m, v11[m] - v22[m]);

}

// V = [V1]// [V2]// [..]// [Vn]for(n = 0; n < numRowVi; n++){

for(m = 0; m < numColumnVi; m++){cvmSet(V, n + i*numRowVi, m, cvmGet(Vi, n, m));


}}

}

// calculate the absolute conic W : Vwdouble w[9];CvMat *Dsvd = cvCreateMat(numRowVi*numOfImage, numColumnVi, CV_64FC1);CvMat *Usvd = cvCreateMat(numRowVi*numOfImage, numRowVi*numOfImage, CV_64FC1);CvMat *Vsvd = cvCreateMat(numColumnVi, numColumnVi, CV_64FC1);cvSVD(V, Dsvd, Usvd, Vsvd, CV_SVD_U_T|CV_SVD_V_T);// V = Usvd^T Dsvd Vsvd in openCV : V = U’ D’ V’^T in text// take last column of V’ : last row of Vsvd// w = [w11 , w12 , w22 , w13 , w23 , w33]for(i = 0; i < numColumnVi; i++){

w[i] = cvmGet(Vsvd, numColumnVi-1, i);}

// y0 = (w12 * w13 - w11 * w23) / (w11 * w22 - w12^2)// lambda = w33 - (w13^2 + y0 * (w12 * w13 - w11 * w23)) / w11// alphaX = (lambda / w11)^.5// alphaY = (lambda * w11 / (w11 * w22 - w12^2))^.5// skew = -w12 * alphaX^2 * alphaY / lambda// x0 = skew * y0 / alphaY - w13 * alphaX^2 / lambdadouble y0 = (w[1]* w[3] - w[0] * w[4]) / (w[0] * w[2] - pow(w[1], 2));double lambda = w[5] - (pow(w[3], 2) + y0 * (w[1] * w[3] - w[0] * w[4])) / w[0];double alphaX = sqrt(lambda / w[0]);double alphaY = sqrt((lambda * w[0]) / (w[0] * w[2] - pow(w[1], 2)));double skew = -w[1] * pow(alphaX, 2) * alphaY / lambda;double x0 = (skew * y0) / alphaY - (w[3] * pow(alphaX, 2)) / lambda;

// get intrinsic matrix Kfloat kArr[9] = {alphaX, skew, x0,

0 , alphaY, y0,0 , 0 , 1};

Array2CvMat(kArr, K, 3, 3);

// store extrinsic parameterscp->alphaX = alphaX;cp->alphaY = alphaY;cp->x0 = x0;cp->y0 = y0;cp->skew = skew;

// release matricescvReleaseMat(&H); cvReleaseMat(&Vi); cvReleaseMat(&V);cvReleaseMat(&Dsvd); cvReleaseMat(&Usvd); cvReleaseMat(&Vsvd);


}

//// function : vijSetting// usage : vijSetting(vij, i, j, H);//--------------------------------------------------// this function sets vij vector for constructing V.//void vijSetting(float *v, int i, int j, CvMat *H) {

// following Zhang’s notation// hi : ith column vector (n.b. hij is not (H)ij but (H)ji)float hi1 = cvmGet(H, 0, i-1);float hi2 = cvmGet(H, 1, i-1);float hi3 = cvmGet(H, 2, i-1);float hj1 = cvmGet(H, 0, j-1);float hj2 = cvmGet(H, 1, j-1);float hj3 = cvmGet(H, 2, j-1);

float vij[6] = {hi1*hj1 , hi1*hj2 + hi2*hj1 , hi2*hj2 ,hi3*hj1 + hi1*hj3 , hi3*hj2 + hi2*hj3 , hi3*hj3};

for(int k = 0; k < 6; k++){v[k] = vij[k];

}}

void CalculateExtrinsicParameters(HomographySet *h, CvMat *K, CameraParam *cp){

int imageNumber, k;int numOfImages = h->len;CvMat *invK = cvCreateMat(3, 3, CV_32FC1);CvMat *h1 = cvCreateMat(3, 1, CV_32FC1);CvMat *h2 = cvCreateMat(3, 1, CV_32FC1);CvMat *h3 = cvCreateMat(3, 1, CV_32FC1);CvMat *r1 = cvCreateMat(3, 1, CV_32FC1);CvMat *r2 = cvCreateMat(3, 1, CV_32FC1);CvMat *r3 = cvCreateMat(3, 1, CV_32FC1);CvMat *t = cvCreateMat(3, 1, CV_32FC1);CvMat *D = cvCreateMat(3, 3, CV_64FC1);CvMat *U = cvCreateMat(3, 3, CV_64FC1);CvMat *V = cvCreateMat(3, 3, CV_64FC1);CvMat *Q = cvCreateMat(3, 3, CV_64FC1);CvMat *R = cvCreateMat(3, 3, CV_64FC1);CvMat *transU = cvCreateMat(3, 3, CV_64FC1);


cvInvert(K, invK);

for(imageNumber = 0; imageNumber < numOfImages ; imageNumber++){

for(k = 0; k < 3; k++){cvmSet(h1, k, 0, h->H[k][0][imageNumber]);cvmSet(h2, k, 0, h->H[k][1][imageNumber]);cvmSet(h3, k, 0, h->H[k][2][imageNumber]);

}

// calculate r1 r2 r3 and tcvMatMul(invK, h1, r1); // r1 = lambda * invK h1cvMatMul(invK, h2, r2); // r2 = lambda * invK h2cvMatMul(invK, h3, t); // t = lambda * invK h3// cvCrossProduct(r1, r2, r3); //r3 = r1 X r2

float squareSum = 0;// calculate normalization factor lambdafor(k = 0; k < 3; k++){

squareSum += pow(cvmGet(r1, k, 0), 2);}float lambda = 1 / sqrt(squareSum);

// normalizationfor(k = 0; k < 3; k++){

cvmSet(t, k, 0, lambda * cvmGet(t, k, 0));// followings are not accuratecvmSet(r1, k, 0, lambda * cvmGet(r1, k, 0));cvmSet(r2, k, 0, lambda * cvmGet(r2, k, 0));

}cvCrossProduct(r1, r2, r3); //r3 = r1 X r2

///////////////////////////////////////////////// refine rotation matrix //// estimate best possible R //// min||R - Q||_F, s.t R^TR = I /////////////////////////////////////////////////// Q : initially estimated rotation matrix, Q = [r1 r2 r3]for(k = 0; k < 3; k++){

cvmSet(Q, k, 0, cvmGet(r1, k, 0));cvmSet(Q, k, 1, cvmGet(r2, k, 0));cvmSet(Q, k, 2, cvmGet(r3, k, 0));

}

// calculate RcvSVD(Q, D, U, V, CV_SVD_U_T|CV_SVD_V_T);


// Q = U^T D V in openCV : Q = U’ D’ V’^T in text// R = U’V’^T : R = U^TVcvTranspose(U, transU);cvMatMul(transU, V, R);

// store extrinsic parametersfor(k = 0; k < 3; k++){

cp->r1[k][imageNumber] = cvmGet(R, k, 0);cp->r2[k][imageNumber] = cvmGet(R, k, 1);cp->r3[k][imageNumber] = cvmGet(R, k, 2);cp->t[k][imageNumber] = cvmGet(t, k, 0);

}

/* in this case, rotation matrix has errorfor(k = 0; k < 3; k++){

cp->r1[k][imageNumber] = cvmGet(r1, k, 0);cp->r2[k][imageNumber] = cvmGet(r2, k, 0);cp->r3[k][imageNumber] = cvmGet(r3, k, 0);cp->t[k][imageNumber] = cvmGet(t, k, 0);

}*/

}

// release matricescvReleaseMat(&invK);cvReleaseMat(&h1); cvReleaseMat(&h2); cvReleaseMat(&h3);cvReleaseMat(&r1); cvReleaseMat(&r2); cvReleaseMat(&r3); cvReleaseMat(&t);cvReleaseMat(&D); cvReleaseMat(&U); cvReleaseMat(&V);cvReleaseMat(&Q); cvReleaseMat(&R); cvReleaseMat(&transU);

}

void CalculateRadialDistotion(HomographySet *h, CvMat *K, CameraParam *cp,PairPointSet *imagePt) {

int imageNumber, i, m;int numOfImages = h->len;int numOfPoints = imagePt->pointLen;double x1, x2, x3, u, v, X, Y;

CvMat *A = cvCreateMat(3, 3, CV_32FC1);CvMat *H = cvCreateMat(3, 3, CV_32FC1);CvMat *Di = cvCreateMat(2, 2, CV_64FC1);CvMat *di = cvCreateMat(2, 1, CV_64FC1);CvMat *D = cvCreateMat(2 * numOfImages * numOfPoints, 2, CV_64FC1);CvMat *d = cvCreateMat(2 * numOfImages * numOfPoints, 1, CV_64FC1);


CvMat *kLS = cvCreateMat(2, 1, CV_64FC1);

for(imageNumber = 0; imageNumber < numOfImages; imageNumber++){//H = KA, where A = [r1 r2 t];for(i = 0; i < 3; i++){

cvmSet(A, i, 0, cp->r1[i][imageNumber]);cvmSet(A, i, 1, cp->r2[i][imageNumber]);cvmSet(A, i, 2, cp->t[i][imageNumber]);

}cvMatMul(K, A, H);

float h[9];CvMat2Array(H, h, 3, 3);

// get ideal pointsfor(i = 0; i < imagePt->pointLen; i++){

X = imagePt->ptmJ[i][imageNumber];Y = imagePt->ptmI[i][imageNumber];

x1 = h[0] * X + h[1] * Y + h[2];x2 = h[3] * X + h[4] * Y + h[5];x3 = h[6] * X + h[7] * Y + h[8];// get ideal pointsu = x1 / x3;v = x2 / x3;

// set Di and didouble tpU = (u - cp->x0);double tpV = (v - cp->y0);double r = pow(tpU / cp->alphaX , 2) + pow(tpV / cp->alphaY, 2);cvmSet(Di, 0, 0, tpU * r);cvmSet(Di, 0, 1, tpU * pow(r, 2));cvmSet(Di, 1, 0, tpV * r);cvmSet(Di, 1, 1, tpV * pow(r, 2));cvmSet(di, 0, 0, imagePt->ptiJ[i][imageNumber] - u);cvmSet(di, 1, 0, imagePt->ptiI[i][imageNumber] - v);

// D = [D1]// [D2]// [..]// [Dn]for(m = 0; m < 2; m++){

cvmSet(D, i*2 + imageNumber*numOfPoints*2, m, cvmGet(Di, 0, m));cvmSet(D, i*2 + 1 + imageNumber*numOfPoints*2, m, cvmGet(Di, 1, m));

}// d = [d1]// [d2]


// [..]// [dn]cvmSet(d, i*2 + imageNumber * numOfPoints*2, 0, cvmGet(di, 0, 0));cvmSet(d, i*2 + 1 + imageNumber * numOfPoints*2, 0, cvmGet(di, 1, 0));

}}

// kLS = inv(D’D)D’d// LeastSquare(D, d, kLS);cvSolve(D, d, kLS, CV_SVD);

cp->k1 = cvmGet(kLS, 0, 0);cp->k2 = cvmGet(kLS, 1, 0);

// release matricescvReleaseMat(&A); cvReleaseMat(&H);cvReleaseMat(&Di); cvReleaseMat(&di);cvReleaseMat(&D); cvReleaseMat(&d);cvReleaseMat(&kLS);

}

//// function : RefineCameraParameters// usage : RefineCameraParameters(cp, imagePt);//-----------------------------------------------------------------// this function refines camera parameters using LM algorithm// in this function, error(geometric distance) is calcualted using// whole image set, however, camera parameters are updated image by image// because Jacobian calculation for whole image set is somewhat not preferable//void RefineCameraParameters(CameraParam *cp, PairPointSet *imagePt) {

int numOfImages = imagePt->imageLen;int numOfData = 2 * imagePt->pointLen * numOfImages;int numOfParams = 7 + numOfImages * 6;// Jx & Jy size : 7 + numOfImage * 6int numOfIteration = NUM_OF_ITERATION;int numOfPoints = imagePt->pointLen;int iterationNum;double e, eNew;double lambda = LAMBDA;bool update = true;int i, m, k;double X, Y;double theta, a;double Jx[numOfParams], Jy[numOfParams];


    CvScalar tr;
    double au, av, u0, v0, sk, k1, k2;
    double wx[numOfImages], wy[numOfImages], wz[numOfImages];
    double tx[numOfImages], ty[numOfImages], tz[numOfImages];
    double wxNew, wyNew, wzNew;
    float stepSize = 1; // default stepSize = 1
    CvMat *R          = cvCreateMat(3, 3, CV_64FC1);
    CvMat *J          = cvCreateMat(numOfData, numOfParams, CV_64FC1);
    CvMat *Hessian    = cvCreateMat(numOfParams, numOfParams, CV_64FC1);
    CvMat *invHessian = cvCreateMat(numOfParams, numOfParams, CV_64FC1);
    CvMat *transJ     = cvCreateMat(numOfParams, numOfData, CV_64FC1);
    CvMat *Jerr       = cvCreateMat(numOfParams, 1, CV_64FC1);
    CvMat *dp         = cvCreateMat(numOfParams, 1, CV_64FC1);
    CvMat *d          = cvCreateMat(numOfData, 1, CV_64FC1);
    CvMat *dLM        = cvCreateMat(numOfData, 1, CV_64FC1);

    e = CalculateError(cp, imagePt, d);
    printf("initial error = %f\n", e);

    for(iterationNum = 0; iterationNum < numOfIteration; iterationNum++){

        if(update == true){
            for(k = 0; k < numOfImages; k++){

                // Rodrigues representation
                for(m = 0; m < 3; m++){
                    cvmSet(R, m, 0, cp->r1[m][k]);
                    cvmSet(R, m, 1, cp->r2[m][k]);
                    cvmSet(R, m, 2, cp->r3[m][k]);
                }
                tr = cvTrace(R); // trace(R) -> tr.val[0]
                theta = acos((tr.val[0] - 1) / 2.0);
                a = theta / (2 * sin(theta));
                wx[k] = a * (cvmGet(R, 2, 1) - cvmGet(R, 1, 2));
                wy[k] = a * (cvmGet(R, 0, 2) - cvmGet(R, 2, 0));
                wz[k] = a * (cvmGet(R, 1, 0) - cvmGet(R, 0, 1));
                tx[k] = cp->t[0][k];
                ty[k] = cp->t[1][k];
                tz[k] = cp->t[2][k];
                au = cp->alphaX;
                av = cp->alphaY;
                u0 = cp->x0;
                v0 = cp->y0;
                sk = cp->skew;


                k1 = cp->k1;
                k2 = cp->k2;

                for(i = 0; i < numOfPoints; i++){

                    // Evaluate the Jacobian at the current parameter values
                    // and the values of geometric distance
                    X = imagePt->ptmJ[i][k];
                    Y = imagePt->ptmI[i][k];

                    CalculateJocobian(Jx, Jy, k, numOfParams,
                                      X, Y, au, av, u0, v0, sk,
                                      wx[k], wy[k], wz[k], tx[k], ty[k], tz[k],
                                      k1, k2);

                    for(m = 0; m < numOfParams; m++){
                        cvmSet(J, i * 2     + k * numOfPoints * 2, m, Jx[m]);
                        cvmSet(J, i * 2 + 1 + k * numOfPoints * 2, m, Jy[m]);
                    }
                }
            }
            // compute the approximated Hessian matrix
            cvMulTransposed(J, Hessian, 1); // Hessian = J'J
        }

        // update camera parameters (cp)
        // apply the damping factor to the Hessian matrix
        for(m = 0; m < numOfParams; m++){
            double tmp = cvmGet(Hessian, m, m);
            cvmSet(Hessian, m, m, tmp + lambda);
        }

        // compute the updated parameters
        // dp = inv(Hessian)*(J'*d(:));
        cvTranspose(J, transJ);
        cvMatMul(transJ, d, Jerr);
        cvInvert(Hessian, invHessian, CV_SVD_SYM);
        cvMatMul(invHessian, Jerr, dp);

        cp->alphaX = au - cvmGet(dp, 0, 0) * stepSize;
        cp->alphaY = av - cvmGet(dp, 1, 0) * stepSize;
        cp->x0     = u0 - cvmGet(dp, 2, 0) * stepSize;
        cp->y0     = v0 - cvmGet(dp, 3, 0) * stepSize;
        cp->skew   = sk - cvmGet(dp, 4, 0) * stepSize;
        cp->k1     = k1 - cvmGet(dp, numOfParams - 2, 0) * stepSize;
        cp->k2     = k2 - cvmGet(dp, numOfParams - 1, 0) * stepSize;

        for(k = 0; k < numOfImages; k++){


            wxNew = wx[k] - cvmGet(dp, 5 + k * 6, 0) * stepSize;
            wyNew = wy[k] - cvmGet(dp, 6 + k * 6, 0) * stepSize;
            wzNew = wz[k] - cvmGet(dp, 7 + k * 6, 0) * stepSize;
            cp->t[0][k] = tx[k] - cvmGet(dp, 8 + k * 6, 0) * stepSize;
            cp->t[1][k] = ty[k] - cvmGet(dp, 9 + k * 6, 0) * stepSize;
            cp->t[2][k] = tz[k] - cvmGet(dp, 10 + k * 6, 0) * stepSize;

            // convert the 3-vector [wx wy wz] of the Rodrigues representation
            // into the 3x3 rotation matrix
            Rodrigues2R(wxNew, wyNew, wzNew, R);

            for(m = 0; m < 3; m++){
                cp->r1[m][k] = cvmGet(R, m, 0);
                cp->r2[m][k] = cvmGet(R, m, 1);
                cp->r3[m][k] = cvmGet(R, m, 2);
            }
        }

        // Evaluate the total geometric distance at the updated parameters
        eNew = CalculateError(cp, imagePt, dLM);

        // if the total geometric distance of the updated parameters is
        // less than the previous one, accept the updated parameters as the
        // current parameters and decrease the damping factor
        if(eNew < e){
            lambda = lambda / 10;
            e = CalculateError(cp, imagePt, d);
            update = true;
        }else{ // revert the updated data
            update = false;
            lambda = lambda * 20;
            cp->alphaX = au;
            cp->alphaY = av;
            cp->x0     = u0;
            cp->y0     = v0;
            cp->skew   = sk;
            cp->k1     = k1;
            cp->k2     = k2;
            for(k = 0; k < numOfImages; k++){

                cp->t[0][k] = tx[k];
                cp->t[1][k] = ty[k];
                cp->t[2][k] = tz[k];

                Rodrigues2R(wx[k], wy[k], wz[k], R);
                for(m = 0; m < 3; m++){
                    cp->r1[m][k] = cvmGet(R, m, 0);
                    cp->r2[m][k] = cvmGet(R, m, 1);


                    cp->r3[m][k] = cvmGet(R, m, 2);
                }
            }
        }
    }
    printf("refined error = %f\n", e);

    cvReleaseMat(&R);  cvReleaseMat(&J); cvReleaseMat(&Hessian);
    cvReleaseMat(&dp); cvReleaseMat(&d); cvReleaseMat(&dLM);
    cvReleaseMat(&invHessian); cvReleaseMat(&transJ); cvReleaseMat(&Jerr);

}
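//
// In matrix form, each iteration above applies a damped Gauss-Newton
// (Levenberg-Marquardt) step. With d the stacked residual vector and J its
// Jacobian, the update computed by the code is
//     dp = inv(J'J + lambda*I) J' d,    p <- p - dp * stepSize.
// If the reprojection error decreases, the step is kept and lambda is
// divided by 10; otherwise the previous parameters are restored and lambda
// is multiplied by 20.
//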

double CalculateError(CameraParam *cp, PairPointSet *imagePt, CvMat *d) {

    float au = cp->alphaX;
    float av = cp->alphaY;
    float u0 = cp->x0;
    float v0 = cp->y0;
    float sk = cp->skew;
    float k1 = cp->k1;
    float k2 = cp->k2;

    int k, i;
    int numOfImages = imagePt->imageLen;
    int numOfPoints = imagePt->pointLen;
    double x1, x2, x3, u, v, X, Y, ui, vi;
    double tpU, tpV, r;

    CvMat *A = cvCreateMat(3, 3, CV_64FC1);
    CvMat *K = cvCreateMat(3, 3, CV_64FC1);
    CvMat *H = cvCreateMat(3, 3, CV_64FC1);

    for(k = 0; k < numOfImages; k++){
        // H = KA, where A = [r1 r2 t]
        for(i = 0; i < 3; i++){
            cvmSet(A, i, 0, cp->r1[i][k]);
            cvmSet(A, i, 1, cp->r2[i][k]);
            cvmSet(A, i, 2, cp->t[i][k]);
        }

        float kArr[9] = {au, sk, u0,
                         0,  av, v0,
                         0,  0,  1};

        Array2CvMat(kArr, K, 3, 3);

        cvMatMul(K, A, H);

        float h[9];


        CvMat2Array(H, h, 3, 3);

        // get ideal points
        for(i = 0; i < numOfPoints; i++){

            X = imagePt->ptmJ[i][k];
            Y = imagePt->ptmI[i][k];

            x1 = h[0] * X + h[1] * Y + h[2];
            x2 = h[3] * X + h[4] * Y + h[5];
            x3 = h[6] * X + h[7] * Y + h[8];
            // get ideal points
            ui = x1 / x3;
            vi = x2 / x3;

            // radial distortion
            tpU = (ui - u0);
            tpV = (vi - v0);
            r = pow(tpU / au, 2) + pow(tpV / av, 2);

            u = ui + tpU * (k1 * r + k2 * pow(r, 2));
            v = vi + tpV * (k1 * r + k2 * pow(r, 2));

            cvmSet(d, i*2     + k*numOfPoints*2, 0, imagePt->ptiJ[i][k] - u);
            cvmSet(d, i*2 + 1 + k*numOfPoints*2, 0, imagePt->ptiI[i][k] - v);
        }
    }

    int numOfData = numOfPoints * numOfImages;
    double e = cvDotProduct(d, d) / numOfData;

    cvReleaseMat(&A); cvReleaseMat(&H); cvReleaseMat(&K);

    return(e);
}

void Rodrigues2R(float wx, float wy, float wz, CvMat *R) {

    CvMat *W    = cvCreateMat(3, 3, CV_64FC1);
    CvMat *Ws   = cvCreateMat(3, 3, CV_64FC1);
    CvMat *W2   = cvCreateMat(3, 3, CV_64FC1);
    CvMat *W2s  = cvCreateMat(3, 3, CV_64FC1);
    CvMat *Rtmp = cvCreateMat(3, 3, CV_64FC1);
    CvMat *Eye  = cvCreateMat(3, 3, CV_64FC1);


    // convert the 3-vector [wx wy wz] of the Rodrigues representation
    // into the 3x3 rotation matrix
    float norm2 = pow(wx, 2) + pow(wy, 2) + pow(wz, 2);
    float norm  = sqrt(norm2);
    float omega[9] = {  0, -wz,  wy,
                       wz,   0, -wx,
                      -wy,  wx,   0};
    Array2CvMat(omega, W, 3, 3);

    float a1 = sin(norm) / norm;
    float a2 = (1 - cos(norm)) / norm2;
    cvmScale(W, Ws, a1);
    cvMatMul(W, W, W2);
    cvmScale(W2, W2s, a2);

    float I[9] = {1, 0, 0,
                  0, 1, 0,
                  0, 0, 1};
    Array2CvMat(I, Eye, 3, 3);

    cvAdd(Eye, Ws, Rtmp); // R = Eye + Ws + W2s
    cvAdd(Rtmp, W2s, R);

    cvReleaseMat(&W);    cvReleaseMat(&W2);
    cvReleaseMat(&Ws);   cvReleaseMat(&W2s);
    cvReleaseMat(&Rtmp); cvReleaseMat(&Eye);

}
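//
// This is the standard Rodrigues formula: with w = (wx, wy, wz),
// theta = ||w||, and [w]x the skew-symmetric matrix omega built above,
//     R = I + (sin(theta)/theta) [w]x + ((1 - cos(theta))/theta^2) [w]x^2.
// The inverse mapping used in RefineCameraParameters recovers theta from
// trace(R) = 1 + 2 cos(theta) and w from the antisymmetric part of R.
//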

void WriteParameters(CameraParam *cp, char *name) {

    FILE *file;
    file = fopen(name, "w");

    fprintf(file, "intrinsic parameters\n");
    fprintf(file, "-------------------------\n");
    fprintf(file, "alphaX = %f\n", cp->alphaX);
    fprintf(file, "alphaY = %f\n", cp->alphaY);
    fprintf(file, "x0 = %f\n", cp->x0);
    fprintf(file, "y0 = %f\n", cp->y0);
    fprintf(file, "skew = %f\n", cp->skew);
    fprintf(file, "\n");

    fprintf(file, "radial distortion\n");
    fprintf(file, "-------------------------\n");
    fprintf(file, "k1 = %f\n", cp->k1);
    fprintf(file, "k2 = %f\n", cp->k2);
    fprintf(file, "\n");


    for(int k = 0; k < cp->len; k++){
        fprintf(file, "extrinsic parameters of image %d\n", k);
        fprintf(file, "-----------------------------------------\n");
        fprintf(file, "r1 = (%f, %f, %f)\n",
                cp->r1[0][k], cp->r1[1][k], cp->r1[2][k]);
        fprintf(file, "r2 = (%f, %f, %f)\n",
                cp->r2[0][k], cp->r2[1][k], cp->r2[2][k]);
        fprintf(file, "r3 = (%f, %f, %f)\n",
                cp->r3[0][k], cp->r3[1][k], cp->r3[2][k]);
        fprintf(file, "t = (%f, %f, %f)\n",
                cp->t[0][k], cp->t[1][k], cp->t[2][k]);
        fprintf(file, "\n");
    }
    fclose(file);

}

3.8 Program Header File : jacobianCalculation.h

//
// file : jacobianCalculation.h
//---------------------------------------------------
// this file contains a function that calculates
// the Jacobian form that is required in the LM algorithm
// for camera calibration
//

void CalculateJocobian(double *Jx, double *Jy, int imageNum, int len,
                       double X, double Y,
                       double au, double av, double u0, double v0, double sk,
                       double wx, double wy, double wz,
                       double tx, double ty, double tz, double k1, double k2);

double Jx1(double X, double Y, double au, double av, double u0, double v0,
           double sk, double wx, double wy, double wz,
           double tx, double ty, double tz, double k1, double k2);

...

double Jy13(double X, double Y, double au, double av, double u0, double v0,
            double sk, double wx, double wy, double wz,
            double tx, double ty, double tz, double k1, double k2);


3.9 Program Function File : jacobianCalculation.cpp

The functions Jx1, ..., Jy13 were obtained from Matlab. We do not present them here because they are too long.
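For reference, the rows produced by these functions are the partial derivatives of the two residual components for a single model point (X, Y). As a sketch consistent with the update p <- p - dp used in RefineCameraParameters, the residuals are

ex(p) = u_obs - u(p; X, Y),   ey(p) = v_obs - v(p; X, Y),   Jx_i = d ex / d p_i,   Jy_i = d ey / d p_i,

where (u_obs, v_obs) is the detected corner, (u, v) is the model point projected through K[r1 r2 t] (the rotation parameterized by the Rodrigues vector (wx, wy, wz)) and then radially distorted by k1 and k2, and p collects the five intrinsic parameters, the six extrinsic parameters of each image, and k1, k2.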

//
// file : jacobianCalculation.cpp
//---------------------------------------------------
// this file contains a function that calculates
// the Jacobian form that is required in the LM algorithm
// for camera calibration
//

#include <math.h>
#include "jacobianCalculation.h"

//
// function : CalculateJacobian
//------------------------------------------
// this function returns the Jacobian
// the Jacobian equations are obtained by
// Matlab symbolic function "jacobian"
//
//------------------------------------------
// X : model point x
// Y : model point y
//
// intrinsic parameters
//--------------------------
// au : alphaX
// av : alphaY
// u0 : x0
// v0 : y0
// sk : skew
//
// extrinsic parameters
//--------------------------
// wx, wy, wz : rotation
// tx, ty, tz : translation
//
// radial distortion
//-------------------------
// k1, k2
//
void CalculateJocobian(double *Jx, double *Jy, int imageNum, int len,
                       double X, double Y,
                       double au, double av, double u0, double v0, double sk,
                       double wx, double wy, double wz,


                       double tx, double ty, double tz, double k1, double k2) {

    // Jx & Jy size : 7 + numOfImage * 6
    // (7 intrinsic camera parameters including k1 & k2,
    //  6 extrinsic parameters depending on images)

    int i;
    int JSize = len; // 7 + numOfImage * 6

    Jx[0] = Jx1(X, Y, au, av, u0, v0, sk, wx, wy, wz, tx, ty, tz, k1, k2);
    Jx[1] = Jx2(X, Y, au, av, u0, v0, sk, wx, wy, wz, tx, ty, tz, k1, k2);
    Jx[2] = -1.0; // Jx3 = -1.0;
    Jx[3] = 0.0;  // Jx4 = 0.0;
    Jx[4] = Jx5(X, Y, au, av, u0, v0, sk, wx, wy, wz, tx, ty, tz, k1, k2);

    Jy[0] = Jy1(X, Y, au, av, u0, v0, sk, wx, wy, wz, tx, ty, tz, k1, k2);
    Jy[1] = Jy2(X, Y, au, av, u0, v0, sk, wx, wy, wz, tx, ty, tz, k1, k2);
    Jy[2] = Jy3(X, Y, au, av, u0, v0, sk, wx, wy, wz, tx, ty, tz, k1, k2);
    Jy[3] = Jy4(X, Y, au, av, u0, v0, sk, wx, wy, wz, tx, ty, tz, k1, k2);
    Jy[4] = Jy5(X, Y, au, av, u0, v0, sk, wx, wy, wz, tx, ty, tz, k1, k2);

    for(i = 5; i < JSize; i++){
        Jx[i] = 0;
        Jy[i] = 0;
    }

    Jx[5 + imageNum * 6]  = Jx6(X, Y, au, av, u0, v0, sk,
                                wx, wy, wz, tx, ty, tz, k1, k2);
    Jx[6 + imageNum * 6]  = Jx7(X, Y, au, av, u0, v0, sk,
                                wx, wy, wz, tx, ty, tz, k1, k2);
    Jx[7 + imageNum * 6]  = Jx8(X, Y, au, av, u0, v0, sk,
                                wx, wy, wz, tx, ty, tz, k1, k2);
    Jx[8 + imageNum * 6]  = Jx9(X, Y, au, av, u0, v0, sk,
                                wx, wy, wz, tx, ty, tz, k1, k2);
    Jx[9 + imageNum * 6]  = Jx10(X, Y, au, av, u0, v0, sk,
                                 wx, wy, wz, tx, ty, tz, k1, k2);
    Jx[10 + imageNum * 6] = Jx11(X, Y, au, av, u0, v0, sk,
                                 wx, wy, wz, tx, ty, tz, k1, k2);

    Jy[5 + imageNum * 6]  = Jy6(X, Y, au, av, u0, v0, sk,
                                wx, wy, wz, tx, ty, tz, k1, k2);
    Jy[6 + imageNum * 6]  = Jy7(X, Y, au, av, u0, v0, sk,
                                wx, wy, wz, tx, ty, tz, k1, k2);
    Jy[7 + imageNum * 6]  = Jy8(X, Y, au, av, u0, v0, sk,
                                wx, wy, wz, tx, ty, tz, k1, k2);
    Jy[8 + imageNum * 6]  = Jy9(X, Y, au, av, u0, v0, sk,
                                wx, wy, wz, tx, ty, tz, k1, k2);


    Jy[9 + imageNum * 6]  = Jy10(X, Y, au, av, u0, v0, sk,
                                 wx, wy, wz, tx, ty, tz, k1, k2);
    Jy[10 + imageNum * 6] = Jy11(X, Y, au, av, u0, v0, sk,
                                 wx, wy, wz, tx, ty, tz, k1, k2);

    Jx[JSize - 2] = Jx12(X, Y, au, av, u0, v0, sk,
                         wx, wy, wz, tx, ty, tz, k1, k2); // for k1
    Jx[JSize - 1] = Jx13(X, Y, au, av, u0, v0, sk,
                         wx, wy, wz, tx, ty, tz, k1, k2); // for k2
    Jy[JSize - 2] = Jy12(X, Y, au, av, u0, v0, sk,
                         wx, wy, wz, tx, ty, tz, k1, k2); // for k1
    Jy[JSize - 1] = Jy13(X, Y, au, av, u0, v0, sk,
                         wx, wy, wz, tx, ty, tz, k1, k2); // for k2

}

3.10 Program Header File : utility.h

//
// file : utility.h
//-----------------------
// this file contains utility functions to
// deal with general processes.
//

#define OFF 0
#define ON 1
#define EPS 0.5
#define IMPLEMENTATION 2
#define min(a, b) ((a <= b) ? a : b)
#define max(a, b) ((a >= b) ? a : b)
#define MAX_POINT_SIZE 500

typedef struct{
    int len;
    int pt1I[MAX_POINT_SIZE];
    int pt1J[MAX_POINT_SIZE];
    int pt2I[MAX_POINT_SIZE];
    int pt2J[MAX_POINT_SIZE];

}PairPoints;

typedef struct{
    int len;
    int ptI[MAX_POINT_SIZE];
    int ptJ[MAX_POINT_SIZE];


}Points;

void Array2CvMat(float *arr, CvMat *cvArr, int row, int column);

void CvMat2Array(CvMat *cvArr, float *arr, int row, int column);

void CvImageCopyFloat2Uchar(IplImage *src, IplImage *dst);

void CvImageCopyUchar2Float(IplImage *src, IplImage *dst);

void InitializeImage(IplImage *image);

void CombineTwoImages(IplImage *image1, IplImage *image2,
                      IplImage *outImage);

void WriteImage(IplImage *image, char *imageName);

void MakeImageBlock(IplImage *block, IplImage *image, int centPosI,
                    int centPosJ);

void TransformImage(IplImage *inImage, IplImage *outImage, CvMat *H);

void CrossPointOf2Lines(float *line1, float *line2, float *pt);

void PassLineOf2Points(float *pt1, float *pt2, float *line);

void InitializePairPoints(PairPoints *corspMap);

void CopyPairPoints(PairPoints *dst, PairPoints *src);

void UpdatePairPoints(PairPoints *corspMap, int domainPosI, int domainPosJ,
                      int rangePosI, int rangePosJ);

void LeastSquare(CvMat *A, CvMat*y, CvMat *xLS);

3.11 Program Function File : utility.cpp

//
// file : utility.cpp
//-----------------------
// this file contains utility functions to
// deal with general processes.
//

#include <stdlib.h>
#include <stdio.h>
#include <math.h>


#include "cv.h"
#include "cxcore.h"
#include "highgui.h"
#include "utility.h"

void Array2CvMat(float *arr, CvMat *cvArr, int row, int column) {
    int i, j;
    for(i = 0; i < row; i++){
        for(j = 0; j < column; j++){
            cvmSet(cvArr, i, j, arr[i*column + j]);
        }
    }
}

void CvMat2Array(CvMat *cvArr, float *arr, int row, int column) {
    int i, j;
    for(i = 0; i < row; i++){
        for(j = 0; j < column; j++){
            arr[i*column + j] = cvmGet(cvArr, i, j);
        }
    }
}

void CvImageCopyFloat2Uchar(IplImage *src, IplImage *dst) {
    int i, j, k;
    float pixel;
    int height   = src->height;
    int width    = src->width;
    int channels = src->nChannels;
    int step     = dst->widthStep;
    uchar *dstData = (uchar *)dst->imageData;
    float *srcData = (float *)src->imageData;

    // copy float precision image to uchar precision image
    for(i = 0; i < height; i++){
        for(j = 0; j < width; j++){
            for(k = 0; k < channels; k++){
                pixel = srcData[i*step + j*channels + k];
                pixel = (pixel > 255 ? 255 : pixel);
                pixel = (pixel < 0 ? 0 : pixel);
                dstData[i*step + j*channels + k] = (uchar)pixel;
            }


        }
    }
}

void CvImageCopyUchar2Float(IplImage *src, IplImage *dst) {
    int i, j, k;
    int height   = src->height;
    int width    = src->width;
    int channels = src->nChannels;
    int step     = src->widthStep;
    float *dstData = (float *)dst->imageData;
    uchar *srcData = (uchar *)src->imageData;

    // copy uchar precision image to float precision image
    for(i = 0; i < height; i++){
        for(j = 0; j < width; j++){
            for(k = 0; k < channels; k++){
                dstData[i*step + j*channels + k]
                    = (float)srcData[i*step + j*channels + k];
            }
        }
    }
}

void InitializeImage(IplImage *image) {
    int i, j, k;
    int height   = image->height;
    int width    = image->width;
    int channels = image->nChannels;
    int step     = image->widthStep;
    uchar *imageData = (uchar *)image->imageData;

    for(i = 0; i < height; i++){
        for(j = 0; j < width; j++){
            for(k = 0; k < channels; k++){
                imageData[i*step + j*channels + k] = 0;
            }
        }
    }
}

void CombineTwoImages(IplImage *image1, IplImage *image2,
                      IplImage *outImage)


{
    int i, j, k;
    uchar *outImageData = 0, *image1Data = 0, *image2Data = 0;

    int height = image1->height;
    int width  = image1->width;
    int step   = image1->widthStep;
    int channels  = image1->nChannels;
    int outWidth  = outImage->width;
    int outHeight = outImage->height;
    int outStep   = outImage->widthStep;

    if(outWidth == width * 2 && outHeight == height){
    }else if(outWidth == width && outHeight == height * 2){
    }else{
        printf("image combining error\n");
        exit(0);
    }

    outImageData = (uchar *)outImage->imageData;
    image1Data   = (uchar *)image1->imageData;
    image2Data   = (uchar *)image2->imageData;

    for(i = 0; i < outHeight; i++){
        for(j = 0; j < outWidth; j++){
            for(k = 0; k < channels; k++){
                if(i < height && j < width){
                    outImageData[i*outStep + j*channels + k]
                        = image1Data[i*step + j*channels + k];
                }else if((i >= height && j < width)){
                    outImageData[i*outStep + j*channels + k]
                        = image2Data[(i-height)*step + j*channels + k];
                }else if((i < height && j >= width)){
                    outImageData[i*outStep + j*channels + k]
                        = image2Data[i*step + (j-width)*channels + k];
                }else{
                    printf("there is no i > height & j > width \n");
                    exit(0);
                }
            }
        }
    }
}

void WriteImage(IplImage *image, char *imageName) {
    if(!cvSaveImage(imageName, image)){


        printf("Could not save: %s\n", imageName);
    }
}

//
// function : MakeImageBlock
// usage : MakeImageBlock(block, image, centPosI, centPosJ);
// ------------------------------------------------------------
// This function copies a block region of the image into a block.
// For example, if the block size is 3 by 3 and the position of the block
// is i, j on the image, the resultant block will be 3 by 3 and
// the block will be copied from image(i-1, j-1) ... image(i+1, j+1).
//
void MakeImageBlock(IplImage *block, IplImage *image, int centPosI,
                    int centPosJ) {

    uchar *blockData = 0, *imageData = 0;
    int blockHeight, blockWidth, imageHeight, imageWidth;
    int blockStep, channels, imageStep;
    int i, j, k, posI, posJ;

    blockHeight = block->height;
    blockWidth  = block->width;
    imageHeight = image->height;
    imageWidth  = image->width;
    channels    = block->nChannels;
    blockStep   = block->widthStep;
    imageStep   = image->widthStep;
    blockData   = (uchar *)block->imageData;
    imageData   = (uchar *)image->imageData;

    for(i = 0; i < blockHeight; i++){
        for(j = 0; j < blockWidth; j++){
            for(k = 0; k < channels; k++){
                posI = centPosI + i - blockHeight / 2;
                posJ = centPosJ + j - blockWidth / 2;
                posI = min(max(posI, 0), imageHeight - 1);
                posJ = min(max(posJ, 0), imageWidth - 1);

                blockData[i*blockStep + j*channels + k]
                    = imageData[posI*imageStep + posJ*channels + k];
            }
        }
    }
}

//


// function : TransformImage
// usage : TransformImage(inImage, outImage, H);
// ---------------------------------------------
// This function transforms the input image using H
//
void TransformImage(IplImage *inImage, IplImage *outImage, CvMat *H) {

    uchar *inData;
    uchar *outData;
    int height, width, step, channels;
    int i, j, k;

    // get the input image data
    height   = inImage->height;
    width    = inImage->width;
    step     = inImage->widthStep;
    channels = inImage->nChannels;
    inData   = (uchar *)inImage->imageData;
    outData  = (uchar *)outImage->imageData;

    // apply the transform to get the target image
    // -----------------------------------------------------
    // out(x') = in(x) : has 2 implementation forms
    // case 1 : out(Hx) = in(x)
    // case 2 : out(x') = in(inv(H)x')
    CvMat *invH = cvCreateMat(3, 3, CV_32FC1);
    cvInvert(H, invH);
    float h[9];
    if(IMPLEMENTATION == 1){ // case 1 : out(Hx) = in(x)
        CvMat2Array(H, h, 3, 3);
    }else{ // case 2 : x = inv(H)x'
        CvMat2Array(invH, h, 3, 3);
    }

    int ii, jj;
    float x1, x2, x3;
    for(i = 0; i < height; i++){    // case 1 : i, j : x,  ii, jj : x', x' = Hx
        for(j = 0; j < width; j++){ // case 2 : i, j : x', ii, jj : x,  x = inv(H)x'
            for(k = 0; k < channels; k++){ // x : domain, x' : range
                x1 = h[0] * j + h[1] * i + h[2];
                x2 = h[3] * j + h[4] * i + h[5];
                x3 = h[6] * j + h[7] * i + h[8];
                ii = min(height - 1, max(0, (int)(x2 / x3)));
                jj = min(width - 1, max(0, (int)(x1 / x3)));
                if(IMPLEMENTATION == 1){ // case 1 : out(Hx) = in(x)
                    outData[ii*step + jj*channels + k]
                        = inData[i*step + j*channels + k];


                }else{ // case 2 : out(x') = in(inv(H)x')
                    if(ii == 0 || ii == height - 1 || jj == 0 || jj == width - 1){
                        outData[i*step + j*channels + k] = 0;
                    }else{
                        outData[i*step + j*channels + k]
                            = inData[ii*step + jj*channels + k];
                    }
                }
            }
        }
    }
}

//
// function : CrossPointOf2Lines
// usage : CrossPointOf2Lines(line1, line2, pt);
//-------------------------------------------------------------
// this function calculates the cross point of line1 and line2
//
void CrossPointOf2Lines(float *line1, float *line2, float *pt) {

    CvMat *cvLine1 = cvCreateMat(3, 1, CV_32FC1);
    CvMat *cvLine2 = cvCreateMat(3, 1, CV_32FC1);
    CvMat *cvPt    = cvCreateMat(3, 1, CV_32FC1);
    Array2CvMat(line1, cvLine1, 3, 1);
    Array2CvMat(line2, cvLine2, 3, 1);

    cvCrossProduct(cvLine1, cvLine2, cvPt);
    CvMat2Array(cvPt, pt, 3, 1);

}

//
// function : PassLineOf2Points
// usage : PassLineOf2Points(pt1, pt2, line);
//-----------------------------------------------------------------
// this function calculates the line that passes through pt1 and pt2
//
void PassLineOf2Points(float *pt1, float *pt2, float *line) {

    // this function is the dual of CrossPointOf2Lines
    CrossPointOf2Lines(pt1, pt2, line);

}
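//
// The duality used above is the usual one in homogeneous coordinates: the
// intersection of two lines and the line through two points are both given
// by a cross product,
//     x = l1 x l2,    l = p1 x p2,
// which is why PassLineOf2Points simply calls CrossPointOf2Lines.
//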

void InitializePairPoints(PairPoints *corspMap) {

for(int i = 0; i < MAX_POINT_SIZE; i++){


        corspMap->pt1I[i] = 0;
        corspMap->pt1J[i] = 0;
        corspMap->pt2I[i] = 0;
        corspMap->pt2J[i] = 0;
    }

    corspMap->len = 0;
}

void CopyPairPoints(PairPoints *dst, PairPoints *src) {
    for(int i = 0; i < MAX_POINT_SIZE; i++){
        dst->pt1I[i] = src->pt1I[i];
        dst->pt1J[i] = src->pt1J[i];
        dst->pt2I[i] = src->pt2I[i];
        dst->pt2J[i] = src->pt2J[i];
    }

    dst->len = src->len;
}

void UpdatePairPoints(PairPoints *corspMap, int domainPosI, int domainPosJ,
                      int rangePosI, int rangePosJ)
{
    int len;

    len = corspMap->len;
    if(corspMap->len >= MAX_POINT_SIZE){
        printf("UpdateCorrespMap called on a full corspMap\n");
        printf("Next positions of correspondences will be overwritten\n");
        printf("in the current correspondence\n");
        len = MAX_POINT_SIZE - 1;
    }

    corspMap->pt2I[len] = rangePosI;
    corspMap->pt2J[len] = rangePosJ;
    corspMap->pt1I[len] = domainPosI;
    corspMap->pt1J[len] = domainPosJ;
    corspMap->len = len + 1;
}

//
// function : LeastSquare
// usage : LeastSquare(A, y, xLS);
//-------------------------------------------------


// this function returns the least square solution of
// Ax = y ; xLS = inv(A'A)A'y
//
void LeastSquare(CvMat *A, CvMat *y, CvMat *xLS) {

    int n, m; // A is n by m
    CvSize size = cvGetSize(A);

    m = size.width;
    n = size.height;
    CvMat *B      = cvCreateMat(m, m, CV_64FC1);
    CvMat *invB   = cvCreateMat(m, m, CV_64FC1);
    CvMat *transA = cvCreateMat(m, n, CV_64FC1);
    CvMat *C      = cvCreateMat(m, n, CV_64FC1);

    cvMulTransposed(A, B, 1);
    cvInvert(B, invB, CV_SVD);
    cvTranspose(A, transA);
    cvMatMul(invB, transA, C);
    cvMatMul(C, y, xLS);

    // release matrices
    cvReleaseMat(&B);
    cvReleaseMat(&invB);
    cvReleaseMat(&transA);
    cvReleaseMat(&C);

}
