
Constrained optimization: direct methods (cont.)

Jussi Hakanen
Post-doctoral researcher
[email protected]

spring 2014 TIES483 Nonlinear optimization


Direct methods

Also known as methods of feasible directions

Idea
– at a point $x^h$, generate a feasible search direction in which the objective function value can be improved
– use a line search along that direction to get $x^{h+1}$

Methods differ in
– how the feasible direction is chosen and
– what is assumed about the constraints (linear/nonlinear, equality/inequality)


Examples of direct methods

Projected gradient method

Active set method

Sequential Quadratic Programming (SQP) method


Active set method

Consider the problem
$$\min f(x) \quad \text{s.t.} \quad Ax \le b,$$
where $A$ is an $m \times n$ matrix and $b \in \mathbb{R}^m$
– $x^* \in S$ if and only if $Ax^* \le b$
– the $i$th constraint is $a_i^T x \le b_i$

Idea
– At $x^h$, the constraints are divided into active ($i \in I$) and inactive constraints
– Inactive constraints are not taken into account when the search direction $d^h$ is determined
– Inactive constraints matter only when computing the optimal step length $\alpha^h$


Feasible directions

For $i \in I$, $a_i^T x^h = b_i$

If $d^h$ is feasible at $x^h$, then $x^h + \alpha d^h \in S$ for some $\alpha > 0$, so
$$a_i^T(x^h + \alpha d^h) = a_i^T x^h + \alpha\, a_i^T d^h \le b_i$$

Since $a_i^T x^h = b_i$, this gives $a_i^T d^h \le 0$ for a feasible $d^h$; the constraint remains active if $a_i^T d^h = 0$
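As a concrete illustration (the numbers here are mine, not from the slides): suppose the constraint $x_1 + x_2 \le 1$ is active at $x^h = (0.5, 0.5)^T$, so $a_i = (1, 1)^T$. Then $d = (1, -1)^T$ gives $a_i^T d = 0$ and keeps the constraint active, $d = (-1, 0)^T$ gives $a_i^T d = -1 < 0$ and moves into the interior of $S$, while $d = (1, 0)^T$ gives $a_i^T d = 1 > 0$ and immediately leaves $S$.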


Optimality conditions

Assume that $t$ constraints are active at $x^h$ ($I = \{i_1, \dots, i_t\}$), denote by $\bar{A}$ the $t \times n$ matrix consisting of the rows of the active constraints, and let $Z$ be a matrix whose columns span $\{d \mid \bar{A} d = 0\}$

Necessary: If $x^* \in S$ (with $\bar{A} x^* = \bar{b} = (b_{i_1}, \dots, b_{i_t})^T$) is a local minimizer, then
1) $Z^T \nabla f(x^*) = 0$, that is, $\nabla f(x^*) + \bar{A}^T \mu^* = 0$ for some $\mu^*$,
2) $\mu_i^* \ge 0$ for all $i \in I$, and
3) the matrix $Z^T H(x^*) Z$ is positive semidefinite

Sufficient: If at $x^* \in S$
1) $Z^T \nabla f(x^*) = 0$,
2) $\mu_i^* > 0$ for all $i \in I$, and
3) the matrix $Z^T H(x^*) Z$ is positive definite,
then $x^*$ is a local minimizer
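These reduced (null-space) conditions are easy to check numerically. Below is a minimal sketch, assuming the active constraint rows are stacked in a matrix; the function name and tolerance are mine, and since the constraints are linear, $H$ is just the Hessian of $f$:

```python
import numpy as np
from scipy.linalg import null_space

def check_reduced_conditions(grad, hess, A_act):
    """Check conditions 1) and 3) at a candidate point.

    grad  : gradient of f at the point, shape (n,)
    hess  : Hessian of f at the point, shape (n, n)
    A_act : active constraint rows (the t x n matrix above), shape (t, n)
    """
    Z = null_space(A_act)                                   # columns span {d | A_act d = 0}
    stationary = np.allclose(Z.T @ grad, 0.0, atol=1e-8)    # condition 1)
    min_eig = np.linalg.eigvalsh(Z.T @ hess @ Z).min()      # condition 3)
    return stationary, min_eig

# Example: min x1^2 + x2^2 with x1 + x2 <= 1 active at x = (0.5, 0.5)
x = np.array([0.5, 0.5])
print(check_reduced_conditions(2 * x, 2 * np.eye(2), np.array([[1.0, 1.0]])))
# (True, 2.0): stationary and positive definite in the null space
```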


On active constraints

An optimization problem with inequality constraints is more difficult than one with only equality constraints, since the active set at a local minimizer is not known in advance

If it were known, it would be enough to solve the corresponding equality constrained problem

In that case, if the remaining constraints were satisfied at the solution and all the Lagrange multipliers were non-negative, the solution would also be a solution to the original problem


Using the active set

At each iteration, a working set is considered, which consists of the constraints active at $x^h$

The direction $d^h$ is determined so that it is a descent direction within the working set
– E.g. Rosen's projected gradient method can be used (see the sketch below)
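A minimal sketch of that projection step, assuming the working-set rows are stacked in a matrix with full row rank (the helper name `A_w` is mine): the negative gradient is projected onto the null space of the working set, so the resulting direction keeps the working-set constraints active.

```python
import numpy as np

def projected_gradient_direction(grad, A_w):
    """Rosen's projection: d = -P grad, with P = I - A_w^T (A_w A_w^T)^{-1} A_w."""
    P = np.eye(grad.size) - A_w.T @ np.linalg.solve(A_w @ A_w.T, A_w)
    return -P @ grad   # satisfies A_w d = 0; a descent direction whenever nonzero
```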


Algorithm
1) Choose a starting point $x^1$. Determine the active set $I_1$ and the corresponding matrices $\bar{A}_1$ and $Z_1$ at $x^1$. Set $h = 1$.
2) Compute a feasible direction $d^h$ in the subspace defined by the active constraints (e.g. by using the projected gradient).
3) If $d^h = 0$, go to 6). Otherwise, find $\alpha_{\max} = \max\{\alpha \mid x^h + \alpha d^h \in S\}$ and solve $\min f(x^h + \alpha d^h)$ s.t. $0 \le \alpha \le \alpha_{\max}$. Let the solution be $\alpha^h$. Set $x^{h+1} = x^h + \alpha^h d^h$.
4) If $\alpha^h < \alpha_{\max}$, go to 7). (Otherwise a new constraint has become active and the active set changes.)
5) Addition to the active set: add the constraint that first becomes active to $I_h$. Update $\bar{A}_h$ and $Z_h$ correspondingly → $I_{h+1}$, $\bar{A}_{h+1}$ and $Z_{h+1}$. Go to 7).
6) Removal from the active set: compute $\mu_i$, $i \in I_h$. If $\mu_i \ge 0$ for all $i \in I_h$, $x^h$ is optimal; stop. If $\mu_i < 0$ for some $i$, remove the corresponding constraint from the active set (the one with the smallest $\mu_i$). Update $\bar{A}_h$ and $Z_h$ correspondingly → $I_{h+1}$, $\bar{A}_{h+1}$ and $Z_{h+1}$. Set $x^{h+1} = x^h$.
7) Set $h = h + 1$ and go to 2).
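A compact Python sketch of this loop (the helper names, tolerances, and the bounded scalar search for step 3 are my own scaffolding around the steps above; the active set is refreshed from feasibility after each step rather than tracked one constraint at a time as in step 5):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def active_set_method(f, grad, A, b, x, tol=1e-8, max_iter=100):
    """Sketch of the algorithm above for min f(x) s.t. A x <= b."""
    act = set(np.where(A @ x >= b - 1e-9)[0])             # step 1: initial active set
    for _ in range(max_iter):
        idx = sorted(act)
        A_act = A[idx]
        P = np.eye(x.size)                                 # step 2: project -grad(x)
        if idx:                                            # onto {d | A_act d = 0}
            P -= A_act.T @ np.linalg.solve(A_act @ A_act.T, A_act)
        d = -P @ grad(x)
        if np.linalg.norm(d) < tol:                        # step 6: multiplier check
            if not idx:
                return x                                   # unconstrained stationary point
            mu = -np.linalg.solve(A_act @ A_act.T, A_act @ grad(x))
            if mu.min() >= -tol:
                return x                                   # all mu_i >= 0: optimal
            act.remove(idx[int(np.argmin(mu))])            # drop most negative mu_i
            continue                                       # x^{h+1} = x^h
        # step 3: largest feasible step, then a 1-D search on [0, alpha_max]
        Ad, slack = A @ d, b - A @ x
        blocking = [slack[i] / Ad[i] for i in range(len(b)) if Ad[i] > tol]
        alpha_max = min(blocking) if blocking else 1e6     # cap an unbounded ray
        res = minimize_scalar(lambda a: f(x + a * d), bounds=(0, alpha_max),
                              method='bounded')
        # take the boundary if it is at least as good (steps 4-5)
        alpha = alpha_max if f(x + alpha_max * d) <= res.fun else res.x
        x = x + alpha * d
        act = set(np.where(A @ x >= b - 1e-6)[0])          # refresh active set
    return x

# Example: min (x1-2)^2 + (x2-2)^2 s.t. x1 + x2 <= 2 -> optimum (1, 1)
f = lambda x: (x[0] - 2) ** 2 + (x[1] - 2) ** 2
print(active_set_method(f, lambda x: 2 * (x - 2),
                        np.array([[1.0, 1.0]]), np.array([2.0]), np.zeros(2)))
```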


How to compute $\mu$

At each iteration, a problem
$$\min f(x) \quad \text{s.t.}\quad \bar{A} x = \bar{b}$$
is solved

Lagrangian function: $L(x, \mu) = f(x) + \mu^T(\bar{A} x - \bar{b})$

At a minimizer $x^*$, $\nabla_x L(x^*, \mu) = \nabla f(x^*) + \bar{A}^T \mu = 0$
$\implies \nabla f(x^*) = -\bar{A}^T \mu$
$\implies \bar{A}\, \nabla f(x^*) = -(\bar{A}\bar{A}^T)\mu$
$\implies \mu = -(\bar{A}\bar{A}^T)^{-1} \bar{A}\, \nabla f(x^*)$

This is evaluated at $x^h$ to approximate $\mu$
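As a one-function sketch (using a least-squares solve instead of an explicit inverse is my choice, to tolerate a nearly singular $\bar{A}\bar{A}^T$):

```python
import numpy as np

def multiplier_estimate(A_act, grad):
    """mu = -(A A^T)^{-1} A grad f, evaluated with grad = grad f(x^h)."""
    return -np.linalg.lstsq(A_act @ A_act.T, A_act @ grad, rcond=None)[0]
```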


Example: Active set method + Rosen's projected gradient

[Figure: worked iterations of the example. From Miettinen: Nonlinear optimization, 2007 (in Finnish).]

Example (cont.)

[Figure continues. From Miettinen: Nonlinear optimization, 2007 (in Finnish).]

Example (cont.)

[Figure continues. From Miettinen: Nonlinear optimization, 2007 (in Finnish).]

SQP method

Sequential Quadratic Programming

Idea is to generate a sequence of quadratic optimization problems whose solutions approach the solution of the original problem

The quadratic subproblems are based on applying the KKT conditions to the original problem
– Minimize a quadratic approximation of the Lagrangian function subject to a linear approximation of the constraints
– Also referred to as the projected Lagrangian method


Approximation of the constraints

Consider a problem
$$\min f(x) \quad \text{s.t.}\quad h_i(x) = 0,\ i = 1, \dots, l,$$
where all the functions are twice continuously differentiable

Taylor series (with $d = x^* - x^h$):
$$h_i(x^h + d) \approx h_i(x^h) + \nabla h_i(x^h)^T d$$

Since $h_i(x^*) = 0$ for all $i$,
$$\nabla h(x^h)\, d = -h(x^h)$$
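For instance (an illustrative constraint, not from the slides): with $h(x) = x_1^2 + x_2^2 - 1$ at $x^h = (1, 1)^T$ we get $h(x^h) = 1$ and $\nabla h(x^h) = (2, 2)^T$, so the linearized condition on the step is $2d_1 + 2d_2 = -1$: any admissible step must move back toward the surface $h(x) = 0$.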


Approximation of the Lagrangian

$$L(x^h + d, \nu^*) \approx L(x^h, \nu^*) + d^T \nabla_x L(x^h, \nu^*) + \tfrac{1}{2} d^T \nabla^2_{xx} L(x^h, \nu^*)\, d$$

A quadratic subproblem:
$$\min_d\ d^T \nabla f(x^h) + \tfrac{1}{2} d^T E(x^h, \nu^h)\, d \quad \text{s.t.}\quad \nabla h(x^h)\, d = -h(x^h),$$
where $E(x^h, \nu^h)$ is either the Hessian of the Lagrangian or some approximation of it
– An approximation is used if the second derivatives are not available

It can be shown (under some assumptions) that the solutions of the subproblems approach $x^*$ and the Lagrange multiplier estimates approach $\nu^*$


Algorithm

1) Choose a starting point $x^1$. Compute $E_1$ and set $h = 1$.
2) Solve the quadratic subproblem
$$\min_d\ d^T \nabla f(x^h) + \tfrac{1}{2} d^T E_h d \quad \text{s.t.}\quad \nabla h(x^h)\, d = -h(x^h).$$
Let the solution be $d^h$.
3) Set $x^{h+1} = x^h + d^h$. Stop if $x^{h+1}$ satisfies the optimality conditions.
4) Compute an estimate for $\nu^{h+1}$ and compute $E_{h+1} = E(x^{h+1}, \nu^{h+1})$.
5) Set $h = h + 1$ and go to 2).
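A minimal sketch of this loop for equality constraints (all names below are mine; the exact Lagrangian Hessian plays the role of $E_h$). As the notes on the next slide point out, the subproblem in step 2 is solved here as the KKT linear system of the QP rather than as an optimization problem:

```python
import numpy as np

def sqp_equality(grad_f, hess_L, h, jac_h, x, nu, tol=1e-10, max_iter=50):
    """Minimal SQP sketch for min f(x) s.t. h(x) = 0.

    grad_f : x -> gradient of f, shape (n,)
    hess_L : (x, nu) -> Hessian of the Lagrangian (the E_h of the slides), (n, n)
    h      : x -> constraint values, shape (l,)
    jac_h  : x -> constraint Jacobian, shape (l, n)
    """
    for _ in range(max_iter):
        g, H, c, J = grad_f(x), hess_L(x, nu), h(x), jac_h(x)
        n, l = x.size, c.size
        # Step 2 as a linear system: the KKT conditions of the QP subproblem
        K = np.block([[H, J.T], [J, np.zeros((l, l))]])
        sol = np.linalg.solve(K, -np.concatenate([g, c]))
        d, nu = sol[:n], sol[n:]          # step d^h and new multiplier estimate
        x = x + d                         # step 3
        if np.linalg.norm(d) < tol and np.linalg.norm(c) < tol:
            return x, nu
    return x, nu

# Example: min x1^2 + x2^2 s.t. x1 + x2 - 1 = 0 -> optimum (0.5, 0.5)
x, nu = sqp_equality(lambda x: 2 * x,
                     lambda x, nu: 2 * np.eye(2),
                     lambda x: np.array([x[0] + x[1] - 1.0]),
                     lambda x: np.array([[1.0, 1.0]]),
                     np.array([2.0, 0.0]), np.zeros(1))
print(x)   # approx [0.5, 0.5]
```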


Notes

Quadratic subproblems are typically not solved as optimization problems but are converted into a system of equations and solved with suitable methods; for the subproblem above, the KKT conditions give the linear system
$$\begin{pmatrix} E_h & \nabla h(x^h)^T \\ \nabla h(x^h) & 0 \end{pmatrix} \begin{pmatrix} d \\ \nu \end{pmatrix} = \begin{pmatrix} -\nabla f(x^h) \\ -h(x^h) \end{pmatrix}$$

If $E = \nabla^2_{xx} L$, then $(x^h, \nu^h) \to (x^*, \nu^*)$ quadratically

If $E$ is an approximation, then convergence is superlinear

Convergence is only local, that is, the starting point must be 'close' to the optimum

Inequality constraints are handled with active set strategies or included explicitly as $\nabla g_i(x^h)^T d \le -g_i(x^h)$
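Written out with both constraint types (this combined form is standard; the slide shows only the inequality rows), the subproblem becomes
$$\min_d\ d^T \nabla f(x^h) + \tfrac{1}{2} d^T E_h d \quad \text{s.t.}\quad \nabla h(x^h)\, d = -h(x^h),\quad \nabla g_i(x^h)^T d \le -g_i(x^h) \text{ for all } i$$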
