Globally Optimal Estimates for Geometric Reconstruction Problems


Globally Optimal Estimates for Geometric Reconstruction Problems

Tom Gilat, Adi Lakritz

Advanced Topics in Computer Vision Seminar

Faculty of Mathematics and Computer Science

Weizmann Institute

3 June 2007

outline

• Motivation and Introduction
• Background: Positive SemiDefinite matrices (PSD); Linear Matrix Inequalities (LMI); SemiDefinite Programming (SDP)
• Relaxations: Sum Of Squares (SOS) relaxation; Linear Matrix Inequalities (LMI) relaxation
• Application in vision: Finding optimal structure; Partial relaxation and Schur’s complement

Motivation

Geometric Reconstruction Problems

Polynomial optimization problems (POPs)

Triangulation problem in L2 norm

Given: 3×4 camera matrices P1, ..., Pn and corresponding image points x1, x2, ..., xn.

Goal: estimation of X, the source in the world of the image points.

Multi view – optimization; 2-views – exact solution.

[Figure: a scene point X and its projections x1, x2, x3 in several views.]

Triangulation problem in L2 norm

[Figure: perspective camera i – camera center, image plane (x, y, z axes), scene point X, its reprojection, the measured point x_i, and the reprojection error err_i.]

λ_i·x_i = P_i·X,  λ_i > 0

Given: 3×4 camera matrices P1, ..., Pn and corresponding image points x1, x2, ..., xn.

Goal: estimation of X, the source in the world of the image points.

Triangulation problem in L2 norm

Minimize the reprojection error in all cameras.

error function: err(X) = Σ_{i=1..n} d(x_i, P_i X)² – non-convex

domain is all points in front of all cameras: λ_i(X) > 0 – convex

min err(X) subject to λ_i(X) > 0

A polynomial minimization problem. Non-convex.
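Before moving on, a small numerical sketch of this objective (my own Python/NumPy illustration, not part of the talk); the cameras `P1`, `P2` and the measurements are hypothetical:

```python
import numpy as np

def reprojection_error(X, P_list, x_list):
    """err(X) = sum_i d(x_i, proj(P_i X))^2, plus the depths lambda_i."""
    X_h = np.append(X, 1.0)              # homogeneous coordinates of the 3D point
    err, depths = 0.0, []
    for P, x in zip(P_list, x_list):
        p = P @ X_h                      # p = lambda_i * (u, v, 1)
        depths.append(p[2])              # lambda_i > 0 means "in front of the camera"
        err += np.sum((x - p[:2] / p[2]) ** 2)
    return err, depths

# two hypothetical cameras observing X = (0, 0, 5)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
x1, x2 = np.array([0.01, 0.0]), np.array([-0.2, 0.02])   # noisy measurements
print(reprojection_error(np.array([0.0, 0.0, 5.0]), [P1, P2], [x1, x2]))
```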

More computer vision problems

• Reconstruction problem: known cameras, known corresponding points; find 3D points that minimize the projection error of the given image points – similar to triangulation, for many points and cameras

• Calculating homography: given 3D points on a plane and corresponding image points, calculate the homography

• Many more problems

Optimization problems

Introduction to optimization problems

minimize f_0(x) – objective function
subject to f_i(x) ≤ b_i, i = 1, ..., m – constraints

where x ∈ R^n and f_i : R^n → R, i = 0, ..., m.

feasible set = all vectors that satisfy the constraints

x̃ is optimal if x̃ is in the feasible set and f_0(x̃) is smallest among all vectors in the feasible set

optimization problems

Can the sequence a_1, ..., a_n be partitioned? Or: are there (x_1, ..., x_n) ∈ {±1}^n s.t. Σ_i a_i x_i = 0?

Equivalently: is min f(x) = (Σ_i a_i x_i)² + Σ_i (x_i² − 1)² equal to 0?

NP-complete
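A brute-force sanity check of this reduction on a tiny instance (added here for illustration; the instance `a` is hypothetical):

```python
import numpy as np
from itertools import product

def f(x, a):
    # (sum_i a_i x_i)^2 + sum_i (x_i^2 - 1)^2 == 0 exactly when x is a
    # +/-1 sign vector splitting a into two equal-sum halves
    return np.dot(a, x) ** 2 + np.sum((x ** 2 - 1) ** 2)

a = np.array([3.0, 1.0, 1.0, 1.0])   # hypothetical instance: {3} vs {1, 1, 1}
best = min(f(np.array(s), a) for s in product([-1.0, 1.0], repeat=len(a)))
print(best)   # 0.0, so a partition exists
```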

optimization problems

[Diagram: optimization splits into convex – e.g. Linear Programming (LP) and SemiDefinite Programming (SDP) – and non-convex.]

For convex problems solutions exist: interior point methods.
For non-convex problems: local optima, or high computational cost.

non convex optimization

[Figure: level curves of f with an initialization; iterates descend into a nearby local minimum.]

Many algorithms get stuck in local minima.

[Figure: a non-convex feasible set, with local min/max marked.]


global optimization – algorithms that converge to the optimal solution, via relaxation of the problem

outline

• Motivation and Introduction
• Background: Positive SemiDefinite matrices (PSD); Linear Matrix Inequalities (LMI); SemiDefinite Programming (SDP)
• Relaxations: Sum Of Squares (SOS) relaxation; Linear Matrix Inequalities (LMI) relaxation
• Application in vision: Finding optimal structure; Partial relaxation and Schur’s complement

positive semidefinite (PSD) matrices

Definition: a matrix M in R^{n×n} is PSD, denoted M ⪰ 0, if

1. M is symmetric: M = M^T

2. x^T M x ≥ 0 for all x ∈ R^n

M ⪰ 0 iff M can be decomposed as M = A·A^T (Cholesky).

Proof: if M = A·A^T then x^T M x = x^T A A^T x = (A^T x)^T (A^T x) = ‖A^T x‖² ≥ 0; the converse follows from the spectral theorem and the nonnegative eigenvalues.

If M is of rank 1, then M = v·v^T for some v ∈ R^n.
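These facts are easy to confirm numerically (an added illustration, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A @ A.T                                   # M = A A^T is PSD by construction

print(np.allclose(M, M.T))                    # True: symmetric
print(np.linalg.eigvalsh(M).min() >= -1e-12)  # True: nonnegative spectrum

v = rng.standard_normal(4)
print(np.linalg.matrix_rank(np.outer(v, v)))  # 1: a rank-one PSD matrix v v^T
```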

principal minors

The kth order principal minors of an n×n symmetric matrix M are the determinants of the k×k matrices obtained by deleting n − k rows and the corresponding n − k columns of M.

For a 3×3 symmetric matrix

M = [ M11  M12  M13
      M21  M22  M23
      M31  M32  M33 ]

first order: the elements on the diagonal
second order: the determinants of the 2×2 submatrices obtained by keeping two rows and the corresponding two columns

A matrix M ⪰ 0 iff all the principal minors of M are nonnegative.

first order: elements on the diagonal
second order: determinants of 2×2 submatrices
third order: det(M)

Set of PSD matrices in 2D

[ x  y
  y  z ] ⪰ 0   ⟺   x ≥ 0, z ≥ 0, xz − y² ≥ 0

[Figure: the cone of 2×2 PSD matrices.]

This set is convex.

Proof: if M ⪰ 0 and t ≥ 0 then x^T(tM)x = t·x^T M x ≥ 0, so tM ⪰ 0. If M ⪰ 0 and L ⪰ 0 then x^T(M + L)x = x^T M x + x^T L x ≥ 0, so M + L ⪰ 0. Hence tM + (1 − t)L ⪰ 0 for 0 ≤ t ≤ 1.

LMI – linear matrix inequality

A(x) = A0 + x1·A1 + ... + xn·An ⪰ 0

where x = (x1, ..., xn) ∈ R^n and A0, ..., An are k×k symmetric matrices.

The feasible set K = {x ∈ R^n | A(x) ⪰ 0} is convex.

Proof: take x, y ∈ K and 0 ≤ t ≤ 1. Then A(x) ⪰ 0 and A(y) ⪰ 0, so t·A(x) + (1 − t)·A(y) ⪰ 0. But A(t·x + (1 − t)·y) = t·A(x) + (1 − t)·A(y) ⪰ 0, hence t·x + (1 − t)·y ∈ K.

LMI

example: find the feasible set of the 2D LMI

A(x) = A0 + x1·A1 + x2·A2 ⪰ 0

A0 = [ 1 0 0      A1 = [ 1 1 1      A2 = [ 0 1 0
       0 2 0             1 0 0             1 1 0
       0 0 1 ]           1 0 0 ]           0 0 1 ]

so that

A(x) = [ 1 + x1    x1 + x2   x1
         x1 + x2   2 + x2    0
         x1        0         1 + x2 ]

reminder: a matrix M ⪰ 0 iff all the principal minors of M are nonnegative

LMI

1st order principal minors of A(x):

1 + x1 ≥ 0,   2 + x2 ≥ 0,   1 + x2 ≥ 0

LMI

2nd order principal minors of A(x):

(1 + x1)(2 + x2) − (x1 + x2)² ≥ 0,
(2 + x2)(1 + x2) ≥ 0,
(1 + x1)(1 + x2) − x1² ≥ 0

LMI

3rd order principal minor of A(x):

det A(x) = (1 + x2)·[(1 + x1)(2 + x2) − (x1 + x2)²] − (2 + x2)·x1² ≥ 0

The feasible set of the LMI is the intersection of all these inequalities.

[Figure: the resulting convex region in the (x1, x2) plane.]
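Assuming the reconstruction of A(x) above, a quick grid check (my sketch) that the principal-minor test agrees with an eigenvalue test of A(x) ⪰ 0:

```python
import numpy as np

def A(x1, x2):
    return np.array([[1 + x1, x1 + x2, x1],
                     [x1 + x2, 2 + x2, 0.0],
                     [x1, 0.0, 1 + x2]])

def psd_by_minors(x1, x2):
    first = min(1 + x1, 2 + x2, 1 + x2)
    second = min((1 + x1) * (2 + x2) - (x1 + x2) ** 2,
                 (2 + x2) * (1 + x2),
                 (1 + x1) * (1 + x2) - x1 ** 2)
    return min(first, second, np.linalg.det(A(x1, x2))) >= -1e-9

grid = np.linspace(-2.0, 2.0, 41)
ok = all((np.linalg.eigvalsh(A(a, b)).min() >= -1e-9) == psd_by_minors(a, b)
         for a in grid for b in grid)
print(ok)   # True: eigenvalue test and minor test agree on the grid
```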

Semidefinite Programming (SDP) = LMI

an extension of LP:

LP:  minimize c^T·x  subject to  A·x ≤ b
SDP: minimize c^T·x  subject to  A(x) ⪰ 0,  B·x = b
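A minimal sketch of an SDP in this LMI form, written with the CVXPY modeling library (a tool choice of mine, not one named in the talk; the objective vector `c` is hypothetical), reusing the example LMI above:

```python
import cvxpy as cp
import numpy as np

A0 = np.array([[1.0, 0, 0], [0, 2, 0], [0, 0, 1]])
A1 = np.array([[1.0, 1, 1], [1, 0, 0], [1, 0, 0]])
A2 = np.array([[0.0, 1, 0], [1, 1, 0], [0, 0, 1]])
c = np.array([1.0, 1.0])                      # hypothetical objective

x = cp.Variable(2)
Z = cp.Variable((3, 3), symmetric=True)       # Z stands for A(x)
constraints = [Z == A0 + x[0] * A1 + x[1] * A2, Z >> 0]
prob = cp.Problem(cp.Minimize(c @ x), constraints)
prob.solve()
print(x.value, prob.value)
```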

outline

• Motivation and Introduction
• Background: Positive SemiDefinite matrices (PSD); Linear Matrix Inequalities (LMI); SemiDefinite Programming (SDP)
• Relaxations: Sum Of Squares (SOS) relaxation; Linear Matrix Inequalities (LMI) relaxation
• Application in vision: Finding optimal structure; Partial relaxation and Schur’s complement

Sum Of Squares relaxation (SOS)

Unconstrained polynomial optimization problem (POP): minimize a polynomial f(x) over x, where “unconstrained” means the feasible set is R^n.

H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.

Sum Of Squares relaxation (SOS)

Define:

N = { f | f(x) ≥ 0 for all x ∈ R^n, f(x) a polynomial }

SOS = { f | there are polynomials g_1(x), ..., g_m(x) s.t. f(x) = Σ_i g_i(x)² }

SOS ⊆ N, SOS ≠ N – but a nonnegative polynomial f(x) ∈ N \ SOS is rare. (A classical example is the Motzkin polynomial x1⁴x2² + x1²x2⁴ − 3x1²x2² + 1, which is nonnegative but not a sum of squares.)

SOS relaxation for unconstrained polynomials

P: find min_x f(x)        P′: find max p s.t. f(x) − p ∈ N

Proof that P and P′ agree: if p* = min_x f(x), then f(x) − p* ≥ 0 for all x, and for any p larger than p* this no longer holds; so max { p : f(x) − p ≥ 0 } = min_x f(x).

SOS relaxation for unconstrained polynomials

P: find min_x f(x)        P′: find max p s.t. f(x) − p ∈ N

P″: find max q s.t. f(x) − q ∈ SOS – a relaxation

[Figure: the graph of f(x); the SOS bound q lies at or below the true minimum p.]

q* ≤ p*, so q* guarantees a bound on p*; and P″ can be solved by SDP.

monomial basis

v_r(x) = [1, x1, x2, x1², x1x2, x2², ...]^T – all monomials of degree ≤ r

dim(v_r(x)) = d(r) = C(n + r, r)

If f(x) is of degree r then f(x) = a^T·v_r(x) for some a ∈ R^{d(r)}.

example: f(x) = 1 + 2x1 + 3x2 + 5x1² + 4x1x2 + 6x2² = (1, 2, 3, 5, 4, 6)·(1, x1, x2, x1², x1x2, x2²)^T
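A throwaway helper (my addition) enumerating the monomial exponents and confirming dim v_r(x) = C(n + r, r):

```python
from itertools import combinations_with_replacement
from math import comb

def monomial_exponents(n, r):
    """Exponent tuples of all monomials of degree <= r in n variables."""
    basis = [(0,) * n]
    for d in range(1, r + 1):
        for combo in combinations_with_replacement(range(n), d):
            e = [0] * n
            for i in combo:
                e[i] += 1
            basis.append(tuple(e))
    return basis

print(len(monomial_exponents(2, 2)), comb(2 + 2, 2))   # 6 6
```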

SOS relaxation to SDP

set of SOS polynomials of degree ≤ 2r
= { g(x) | g(x) = Σ_i g_i(x)², deg(g_i) ≤ r }
= { Σ_i (a_i^T·v_r(x))² | a_i ∈ R^{d(r)} }
= { v_r(x)^T·(Σ_i a_i·a_i^T)·v_r(x) }
= { v_r(x)^T·V·v_r(x) | V ⪰ 0 }

SDP_r: max q subject to f(x) − q = v_r(x)^T·V·v_r(x), V ⪰ 0

SOS relaxation to SDP

example: f(x) = 2x1 − 3x2 + 4x1²x2²

Write f(x) − q = [1, x1, x2, x1², x1x2, x2²]·V·[1, x1, x2, x1², x1x2, x2²]^T with V = (V_ij) a symmetric 6×6 matrix.

Matching coefficients monomial by monomial gives the SDP:

max q subject to
V_11 = −q,  2V_12 = 2,  2V_13 = −3,  2V_46 + V_55 = 4,
all remaining coefficient sums equal 0,  V ⪰ 0
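The same coefficient-matching recipe, end to end in CVXPY (my sketch, not the talk's code). As reconstructed above, f is unbounded below, so its SOS program has no feasible q; I therefore use a bounded stand-in f(x) = 2 − 2x1 + x1² + x2², whose minimum is 1, with the basis v1(x) = [1, x1, x2]:

```python
import cvxpy as cp

# max q  s.t.  f(x) - q = v1(x)^T V v1(x),  V symmetric PSD,
# for the stand-in f(x) = 2 - 2*x1 + x1^2 + x2^2 and v1(x) = [1, x1, x2]
q = cp.Variable()
V = cp.Variable((3, 3), symmetric=True)
constraints = [
    V >> 0,
    V[0, 0] == 2 - q,     # constant term
    2 * V[0, 1] == -2,    # x1
    2 * V[0, 2] == 0,     # x2
    V[1, 1] == 1,         # x1^2
    2 * V[1, 2] == 0,     # x1*x2
    V[2, 2] == 1,         # x2^2
]
cp.Problem(cp.Maximize(q), constraints).solve()
print(q.value)            # ~1.0, the true minimum of f
```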

SOS for constrained POPs

It is possible to extend this method to constrained POPs by use of the generalized Lagrange dual.

SOS relaxation summary

So we know how to solve a POP that is an SOS, and we have a bound on a POP that is not an SOS.

H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.

relaxations

SOS: POP → SOS problem → SOS relaxation → SDP → global estimate

LMI: POP → linear & LMI problem → LMI relaxation → SDP (+ convergence) → global estimate

outline

• Motivation and Introduction
• Background: Positive SemiDefinite matrices (PSD); Linear Matrix Inequalities (LMI); SemiDefinite Programming (SDP)
• Relaxations: Sum Of Squares (SOS) relaxation; Linear Matrix Inequalities (LMI) relaxation
• Application in vision: Finding optimal structure; Partial relaxation and Schur’s complement

LMI relaxations

Constraints are handled

Convergence to optimum is guaranteed

Applies to all polynomials, non-SOS ones as well

A maximization problem

Note that: (a) the feasible set is non-convex; (b) the constraints are quadratic.

max g0(x) = x2
s.t. g1(x) = 3 − 2x2 − x1² − x2² ≥ 0
     g2(x) = −x1 − x2 − x1x2 ≥ 0
     g3(x) = 1 + x1x2 ≥ 0

[Figure: the non-convex feasible set in the (x1, x2) plane.]

LMI – linear matrix inequality, a reminder

A(x) = A0 + x1·A1 + ... + xn·An ⪰ 0, where x = (x1, ..., xn) ∈ R^n and the A_i are k×k symmetric matrices.

The feasible set K = {x ∈ R^n | A(x) ⪰ 0} is convex.

Goal

Motivation – an SDP:

min c^T·x  s.t.  A0 + x1·A1 + ... + xn·An ⪰ 0,  M·x ≤ b

What is it good for? SDP problems can be solved much more efficiently than general optimization problems.

So we want to pass from a Polynomial Optimization Problem to an SDP with solution close to the global optimum of the original problem.

LMI: POP → linear + LMI + rank constraints → SDP

LMI relaxation is an iterative process:

Step 1: introduce new variables

Step 2: relax constraints

Apply higher

order

relaxations

LMI relaxation – step 1

Replace monomials by “lifting variables”.

Rule (the R² case): x1^i · x2^j → y_ij

Example: x1 → y10, x2 → y01, x1² → y20, x1x2 → y11, x2² → y02

g2(x) = −x1 − x2 − x1x2 ≥ 0  →  g2(y) = −y10 − y01 − y11 ≥ 0

Introducing lifting variables:

max g0(x) = x2
s.t. g1(x) = 3 − 2x2 − x1² − x2² ≥ 0
     g2(x) = −x1 − x2 − x1x2 ≥ 0
     g3(x) = 1 + x1x2 ≥ 0

becomes

max g0(y) = y01
s.t. g1(y) = 3 − 2y01 − y20 − y02 ≥ 0
     g2(y) = −y10 − y01 − y11 ≥ 0
     g3(y) = 1 + y11 ≥ 0

Lifting

max g0(y) = y01
s.t. g1(y) = 3 − 2y01 − y20 − y02 ≥ 0
     g2(y) = −y10 − y01 − y11 ≥ 0
     g3(y) = 1 + y11 ≥ 0

The new problem is linear, in particular convex – but it is not equivalent to the original problem, because the lifting variables are not independent in the original problem:

y10 = x1, y01 = x2, y11 = x1x2, ...  ⇒  y11 = y10·y01

Goal, more specifically

linear problem (obtained by lifting) + “relations constraints” on the lifting variables → SDP

(For example, we demand y11 = y10·y01.)

Relaxation

Question: how do we guarantee that the relations between the lifting variables hold?

y11 = y10·y01,  y20 = y10²,  and so on...

LMI relaxation – step 2

Take the basis of the degree ≤ 1 polynomials, v1(x) = [1, x1, x2]^T. Note that:

v1(x)·v1(x)^T = [ 1     x1      x2
                  x1    x1²     x1x2
                  x2    x1x2    x2²  ]  ⪰ 0

(because A = v·v^T ⇒ A ⪰ 0, i.e. A is positive semidefinite).

Apply lifting and get:

M = [ 1     y10   y01
      y10   y20   y11
      y01   y11   y02 ]

If the relations constraints hold, then M ⪰ 0. This is because we can then decompose M as

M = [1, y10, y01]^T · [1, y10, y01]

(assuming the relations y11 = y10·y01, y20 = y10², and so on, hold), and then rank M = 1.

We’ve seen:

relations constraints hold  ⇒  M ⪰ 0 and rank M = 1

What about the opposite:

M ⪰ 0 and rank M = 1  ⇒  relations constraints hold?

This is true as well, by the following:

LMI relaxation – step 2, continued

M ⪰ 0 and rank M = 1  ⇒  M = v·v^T for some v  ⇒  the relations constraints hold,

since all the relations equalities appear among the entry-wise equalities [M]_ij = [v·v^T]_ij.

Conclusion of the analysis

In the lifted problem

max g0(y) = y01
s.t. g1(y) = 3 − 2y01 − y20 − y02 ≥ 0
     g2(y) = −y10 − y01 − y11 ≥ 0
     g3(y) = 1 + y11 ≥ 0

the “y feasible set” – the subset of the feasible set where the relations constraints hold – is exactly the set where, in addition, M ⪰ 0 and rank M = 1.

The original problem is therefore equivalent to:

max g0(y) = y01
s.t. g1(y) = 3 − 2y01 − y20 − y02 ≥ 0
     g2(y) = −y10 − y01 − y11 ≥ 0
     g3(y) = 1 + y11 ≥ 0
     M1(y) ⪰ 0

together with the additional constraint rank M1(y) = 1, where

M1(y) = [ 1     y10   y01
          y10   y20   y11
          y01   y11   y02 ]

Relaxation, at last

We call M1(y) the moment matrix of order 1. Relax by dropping the non-convex constraint rank M1(y) = 1. This gives the LMI relaxation of order 1.

LMI relaxation of order 1:

max y01
s.t. g1(y) = 3 − 2y01 − y20 − y02 ≥ 0
     g2(y) = −y10 − y01 − y11 ≥ 0
     g3(y) = 1 + y11 ≥ 0
     M1(y) ⪰ 0

[Figure: its (convex) feasible set.]
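Here is this order-1 relaxation as a CVXPY program (my sketch; it inherits the sign conventions of the reconstructed example above):

```python
import cvxpy as cp

M = cp.Variable((3, 3), symmetric=True)       # moment matrix M1(y), rows/cols ~ [1, x1, x2]
y10, y01 = M[0, 1], M[0, 2]
y20, y11, y02 = M[1, 1], M[1, 2], M[2, 2]

constraints = [
    M[0, 0] == 1,
    M >> 0,                                   # rank M1(y) = 1 has been dropped
    3 - 2 * y01 - y20 - y02 >= 0,             # lifted g1
    -y10 - y01 - y11 >= 0,                    # lifted g2
    1 + y11 >= 0,                             # lifted g3
]
prob = cp.Problem(cp.Maximize(y01), constraints)
prob.solve()
print(prob.value)                             # upper bound on the original maximum
```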

Rank constrained LMI vs. unconstrained

[Figure: the rank-constrained set (the lifted original feasible set) sitting inside the larger convex feasible set of the relaxed LMI.]

LMI relaxations of higher order

It turns out that we can do better:

Apply LMI relaxations of higher order

A tighter SDP

Relaxations of higher order incorporate the inequality constraints as LMIs

• We show relaxation of order 2

• It is possible to continue and apply relaxations

• Theory guarantees convergence to global optimum

LMI relaxations of second order

Let v2(x) = [1, x1, x2, x1², x1x2, x2²]^T be the basis of the polynomials of degree ≤ 2. Again v2(x)·v2(x)^T ⪰ 0, and lifting it gives the second-order moment matrix constraint M2(y) ⪰ 0. Again, we will relax by dropping the rank constraint.

Inequality constraints to LMI: replace our constraints by LMIs and obtain a tighter relaxation.

linear constraint:  g3(x) = 1 + x1x2 ≥ 0   → (lifting) →   1 + y11 ≥ 0
LMI constraint:     g3(x)·v1(x)·v1(x)^T ⪰ 0   → (lifting) →   M1(g3 y) ⪰ 0

For example, the (1,1) entry of M1(g3 y) is the lifting of g3(x)·1 = 1 + x1x2, namely 1 + y11.

This procedure brings a new SDP.

LMI relaxations of order 2

The feasible set of the second SDP is included in the feasible set of the first SDP. Similarly, we can continue and apply higher order relaxations.

The order-1 relaxation

max y01
s.t. lifted g1, g2, g3 ≥ 0,  M1(y) ⪰ 0

is replaced by the tighter order-2 relaxation

max y01
s.t. M1(g1 y) ⪰ 0,  M1(g2 y) ⪰ 0,  M1(g3 y) ⪰ 0,  M2(y) ⪰ 0

Theoretical basis for the LMI relaxations

If the feasible set defined by the constraints g_i(x) ≥ 0 is compact, then, under mild additional assumptions, Lasserre proved in 2001 that there is an asymptotic convergence guarantee:

lim_{k→∞} p_k = g

where p_k is the solution of the k’th relaxation and g is the solution of the original problem (finding a maximum).

Moreover, convergence is fast: p_k is very close to g already for small k.

Lasserre, J.B. (2001). “Global optimization with polynomials and the problem of moments.” SIAM J. Optimization 11, pp. 796–817.

Checking global optimality

The method provides a certificate of global optimality. An important experimental observation:

rank M_n(y) = 1   ⇒   the SDP solution is the global optimum

Minimizing trace(M_n(y)) promotes a low-rank moment matrix, so we add to the objective function the trace of the moment matrix, weighted by a sufficiently small positive scalar:

max y01 − λ·trace(M2(y))
s.t. M1(g1 y) ⪰ 0,  M1(g2 y) ⪰ 0,  M1(g3 y) ⪰ 0,  M2(y) ⪰ 0
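Continuing the CVXPY sketch of the order-1 relaxation, the trace-weighted certificate objective looks like this (shown on M1(y) for brevity, while the slide uses M2(y); `lam` is my choice of the small positive weight):

```python
import cvxpy as cp
import numpy as np

M = cp.Variable((3, 3), symmetric=True)
y10, y01 = M[0, 1], M[0, 2]
y20, y11, y02 = M[1, 1], M[1, 2], M[2, 2]
constraints = [M[0, 0] == 1, M >> 0,
               3 - 2 * y01 - y20 - y02 >= 0,
               -y10 - y01 - y11 >= 0,
               1 + y11 >= 0]

lam = 1e-3    # "sufficiently small positive scalar" (my choice)
prob = cp.Problem(cp.Maximize(y01 - lam * cp.trace(M)), constraints)
prob.solve()
print(np.linalg.matrix_rank(M.value, tol=1e-6))   # rank 1 => certified global optimum
```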

Application: LMI relaxations in vision

outline

• Motivation and Introduction
• Background: Positive SemiDefinite matrices (PSD); Linear Matrix Inequalities (LMI); SemiDefinite Programming (SDP)
• Relaxations: Sum Of Squares (SOS) relaxation; Linear Matrix Inequalities (LMI) relaxation
• Application in vision: Finding optimal structure; Partial relaxation and Schur’s complement

Finding optimal structure

A perspective camera: the relation between a point U in 3D space and a point u in the image plane is given by

λ·u = P·U

where P is the camera matrix and λ is the depth. u is the reprojected point; ũ is the measured point.

[Figure: camera center, image plane, U, u and ũ.]

The measured image points ũ_i, i = 1, ..., N are corrupted by independent Gaussian noise. We want to minimize the least squares error between measured and reprojected points. We therefore have the following optimization problem:

min Σ_{i=1..N} d(ũ_i, u_i(x))²   s.t.   λ_i(x) > 0

where d(·,·) is the Euclidean distance and x is the set of unknowns.

Each term in the cost function can be written as

d(ũ_i, u_i(x))² = ( f_i1(x)² + f_i2(x)² ) / λ_i(x)²

where f_i1(x), f_i2(x), λ_i(x) are polynomials. Our objective is therefore to minimize a sum of rational functions.

How can we turn this into a polynomial optimization problem?

Suppose that each term in Σ_{i=1..N} d(ũ_i, u_i(x))² has an upper bound γ_i, i.e.

( f_i1(x)² + f_i2(x)² ) / λ_i(x)² ≤ γ_i.

Then our optimization problem is equivalent to the following:

min γ_1 + ... + γ_N
s.t. f_i1(x)² + f_i2(x)² ≤ γ_i·λ_i(x)²,   λ_i(x) > 0,   i = 1, ..., N

This is a polynomial optimization problem, for which we apply LMI relaxations.

Note that we introduced many new variables – one for each term.

Problem: an SDP with a large number of variables can be computationally demanding.

A large number of variables can arise from:

LMI relaxations of high order

Introduction of new variables as we’ve seen

This is where partial relaxations come in.

For that we introduce Schur’s complement.

Partial Relaxations

Schur’s complement

Set

A_i(x) = [ λ_i(x)   0
           0        λ_i(x) ],    B_i(x) = [ f_i1(x)
                                            f_i2(x) ],    C_i(x) = γ_i·λ_i(x)

Schur’s complement: for A ≻ 0,

[ A    B
  B^T  C ] ⪰ 0   ⟺   C − B^T·A⁻¹·B ⪰ 0

Applying Schur’s complement with these blocks: for λ_i(x) > 0,

f_i1(x)² + f_i2(x)² ≤ γ_i·λ_i(x)²

⟺

[ λ_i(x)    0         f_i1(x)
  0         λ_i(x)    f_i2(x)
  f_i1(x)   f_i2(x)   γ_i·λ_i(x) ] ⪰ 0

Derivation of the right side: C − B^T·A⁻¹·B ⪰ 0 here reads γ_i·λ_i(x) − (f_i1(x)² + f_i2(x)²)/λ_i(x) ≥ 0, which is the original inequality after multiplying through by λ_i(x) > 0.

Schur’s complement allows us to state our optimization problem as follows:

min γ_1 + ... + γ_N
s.t. [ λ_i(x)    0         f_i1(x)
       0         λ_i(x)    f_i2(x)
       f_i1(x)   f_i2(x)   γ_i·λ_i(x) ] ⪰ 0,   λ_i(x) > 0,   i = 1, ..., N
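A random spot check (an added illustration) that the 3×3 LMI above is equivalent to the scalar bound whenever λ_i(x) > 0:

```python
import numpy as np

# spot check: the 3x3 LMI is equivalent to f1^2 + f2^2 <= gamma * lam^2
# whenever lam > 0, for randomly drawn values of the polynomials
rng = np.random.default_rng(1)
agree = 0
for _ in range(1000):
    lam = rng.uniform(0.1, 2.0)
    f1, f2 = rng.normal(size=2)
    gam = rng.uniform(0.0, 4.0)
    L = np.array([[lam, 0.0, f1], [0.0, lam, f2], [f1, f2, gam * lam]])
    lmi_ok = np.linalg.eigvalsh(L).min() >= -1e-9
    scalar_ok = f1**2 + f2**2 <= gam * lam**2 + 1e-9
    agree += (lmi_ok == scalar_ok)
print(agree, "of 1000 samples agree")
```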

Partial relaxations

The only non-linear entries are the products γ_i·λ_i(x). We can therefore apply LMI relaxations only to x = [x1, x2, ...]^T and leave the γ_i, i = 1, ..., N, as they are. If we were to apply full relaxations to all variables, the problem would become intractable even for small N.

Partial relaxations

Disadvantage of partial relaxations: we are not able to ensure asymptotic convergence to the global optimum.

However, we have a numerical certificate of global optimality just as in the case of full relaxations:

the moment matrix of the relaxed variables is of rank one  ⇒  the solution of the partially relaxed problem is the global optimum

Full relaxation vs. partial relaxation

Application: Triangulation, 3 cameras

Goal: find the optimal 3D point. Camera matrices are known, measured point is assumed to be in the origin of each view.

Camera matrices: [three 3×4 matrices shown on the slide]

Summary

• Geometric vision problems to POPs
  – triangulation and reconstruction problems
• Relaxations of POPs
  – Sum Of Squares (SOS) relaxation: guarantees a bound on the optimal solution; usually the solution is optimal
  – Linear Matrix Inequalities (LMI) relaxation: first order LMI relaxation – lifting, dropping the rank constraint; higher order LMI relaxations – linear constraints become LMIs; guarantee of convergence (Lasserre); certificate of global optimality
• Application in vision
  – finding optimal structure
  – partial relaxation and Schur’s complement
  – triangulation problem, benefit of partial relaxations

References

F. Kahl and D. Henrion. Globally Optimal Estimates for Geometric Reconstruction Problems. Accepted, IJCV.

H. Waki, S. Kim, M. Kojima, and M. Muramatsu. Sums of squares and semidefinite programming relaxations for polynomial optimization problems with structured sparsity. SIAM J. Optimization, 17(1):218–242, 2006.

J. B. Lasserre. Global optimization with polynomials and the problem of moments. SIAM J. Optimization, 11:796–817, 2001.

S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.

R. I. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge University Press, second edition, 2004.