Optimization Part II
G.Anuradha
Review of previous lecture: Steepest Descent

Choose the next step so that the function decreases:

$F(\mathbf{x}_{k+1}) < F(\mathbf{x}_k)$

For small changes in $\mathbf{x}$ we can approximate $F(\mathbf{x})$:

$F(\mathbf{x}_{k+1}) = F(\mathbf{x}_k + \Delta\mathbf{x}_k) \approx F(\mathbf{x}_k) + \mathbf{g}_k^T \Delta\mathbf{x}_k$

where

$\mathbf{g}_k \equiv \nabla F(\mathbf{x})\big|_{\mathbf{x}=\mathbf{x}_k}$

If we want the function to decrease:

$\mathbf{g}_k^T \Delta\mathbf{x}_k = \alpha_k \mathbf{g}_k^T \mathbf{p}_k < 0$

We can maximize the decrease by choosing:

$\mathbf{p}_k = -\mathbf{g}_k$

$\mathbf{x}_{k+1} = \mathbf{x}_k - \alpha_k \mathbf{g}_k$
Example

$F(\mathbf{x}) = x_1^2 + 2x_1x_2 + 2x_2^2 + x_1, \qquad \mathbf{x}_0 = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix}$

$\nabla F(\mathbf{x}) = \begin{bmatrix} \partial F/\partial x_1 \\ \partial F/\partial x_2 \end{bmatrix} = \begin{bmatrix} 2x_1 + 2x_2 + 1 \\ 2x_1 + 4x_2 \end{bmatrix}, \qquad \mathbf{g}_0 = \nabla F(\mathbf{x})\big|_{\mathbf{x}=\mathbf{x}_0} = \begin{bmatrix} 3 \\ 3 \end{bmatrix}$

$\alpha = 0.1$

$\mathbf{x}_1 = \mathbf{x}_0 - \alpha\mathbf{g}_0 = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix} - 0.1\begin{bmatrix} 3 \\ 3 \end{bmatrix} = \begin{bmatrix} 0.2 \\ 0.2 \end{bmatrix}$

$\mathbf{x}_2 = \mathbf{x}_1 - \alpha\mathbf{g}_1 = \begin{bmatrix} 0.2 \\ 0.2 \end{bmatrix} - 0.1\begin{bmatrix} 1.8 \\ 1.2 \end{bmatrix} = \begin{bmatrix} 0.02 \\ 0.08 \end{bmatrix}$
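The two iterations above can be reproduced with a short loop (a minimal sketch in Python/NumPy; the function, starting point, and step size are taken from the example):

```python
import numpy as np

def grad_F(x):
    # Gradient of F(x) = x1^2 + 2*x1*x2 + 2*x2^2 + x1
    return np.array([2*x[0] + 2*x[1] + 1,
                     2*x[0] + 4*x[1]])

alpha = 0.1                      # learning rate from the example
x = np.array([0.5, 0.5])         # x_0
for k in range(2):
    x = x - alpha * grad_F(x)    # x_{k+1} = x_k - alpha * g_k
    print(f"x_{k+1} = {x}")      # x_1 = [0.2 0.2], then x_2 = [0.02 0.08]
```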
Plot: steepest descent trajectory on the contours of $F(\mathbf{x})$ (both axes from $-2$ to $2$).
Necessary and sufficient conditions for a function with single variable
Functions with two variables: necessary conditions, sufficient conditions
Stationary Points
Effect of learning rate

$\mathbf{x}_{k+1} = \mathbf{x}_k - \alpha_k \mathbf{g}_k$

The larger the learning rate, the more oscillatory the trajectory becomes. This can make the algorithm unstable. For quadratic functions, an upper limit on the learning rate can be determined.
Stable Learning Rates (Quadratic)

$F(\mathbf{x}) = \frac{1}{2}\mathbf{x}^T A \mathbf{x} + \mathbf{d}^T\mathbf{x} + c$

$\nabla F(\mathbf{x}) = A\mathbf{x} + \mathbf{d}$

$\mathbf{x}_{k+1} = \mathbf{x}_k - \alpha\mathbf{g}_k = \mathbf{x}_k - \alpha(A\mathbf{x}_k + \mathbf{d}) \;\Rightarrow\; \mathbf{x}_{k+1} = [I - \alpha A]\mathbf{x}_k - \alpha\mathbf{d}$

Stability is determined by the eigenvalues of the matrix $[I - \alpha A]$. With $\lambda_i$ an eigenvalue of $A$ and $\mathbf{z}_i$ the corresponding eigenvector:

$[I - \alpha A]\mathbf{z}_i = \mathbf{z}_i - \alpha A\mathbf{z}_i = \mathbf{z}_i - \alpha\lambda_i\mathbf{z}_i = (1 - \alpha\lambda_i)\mathbf{z}_i$

Stability requirement:

$|1 - \alpha\lambda_i| < 1 \;\Rightarrow\; \alpha < \frac{2}{\lambda_i} \;\Rightarrow\; \alpha < \frac{2}{\lambda_{\max}}$
Example

$A = \begin{bmatrix} 2 & 2 \\ 2 & 4 \end{bmatrix}$

$\lambda_1 = 0.764, \quad \mathbf{z}_1 = \begin{bmatrix} 0.851 \\ -0.526 \end{bmatrix}, \qquad \lambda_2 = 5.24, \quad \mathbf{z}_2 = \begin{bmatrix} 0.526 \\ 0.851 \end{bmatrix}$

$\alpha < \frac{2}{\lambda_{\max}} = \frac{2}{5.24} = 0.38$
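The eigenvalues and the resulting stability limit can be checked numerically (a minimal sketch in Python/NumPy; $A$ is the Hessian from the example):

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [2.0, 4.0]])

# Eigenvalues of the symmetric Hessian, in ascending order
eigvals = np.linalg.eigvalsh(A)
lam_max = eigvals.max()

# Stable learning rates satisfy alpha < 2 / lambda_max
alpha_limit = 2.0 / lam_max
print(eigvals)      # approx [0.764, 5.236]
print(alpha_limit)  # approx 0.382
```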
Plots: trajectories for $\alpha = 0.37$ (stable) and $\alpha = 0.39$ (unstable), both axes from $-2$ to $2$.
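The two behaviours can be verified by iterating the steepest-descent update on the quadratic (a sketch; writing the example function in the form $\frac{1}{2}\mathbf{x}^T A\mathbf{x} + \mathbf{d}^T\mathbf{x}$ gives $A$ as above and $\mathbf{d} = [1, 0]^T$):

```python
import numpy as np

A = np.array([[2.0, 2.0], [2.0, 4.0]])
d = np.array([1.0, 0.0])

def run(alpha, steps=100):
    # Steepest descent on F(x) = 0.5 x^T A x + d^T x from x_0 = [0.5, 0.5]
    x = np.array([0.5, 0.5])
    for _ in range(steps):
        x = x - alpha * (A @ x + d)
    return x

print(run(0.37))  # settles near the minimum at [-1, 0.5]
print(run(0.39))  # oscillates with growing amplitude (diverges)
```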
Newton's Method

$F(\mathbf{x}_{k+1}) = F(\mathbf{x}_k + \Delta\mathbf{x}_k) \approx F(\mathbf{x}_k) + \mathbf{g}_k^T\Delta\mathbf{x}_k + \frac{1}{2}\Delta\mathbf{x}_k^T A_k \Delta\mathbf{x}_k$

Take the gradient of this second-order approximation and set it equal to zero to find the stationary point:

$\mathbf{g}_k + A_k\Delta\mathbf{x}_k = 0$

$\Delta\mathbf{x}_k = -A_k^{-1}\mathbf{g}_k$

$\mathbf{x}_{k+1} = \mathbf{x}_k - A_k^{-1}\mathbf{g}_k$
Example

$F(\mathbf{x}) = x_1^2 + 2x_1x_2 + 2x_2^2 + x_1, \qquad \mathbf{x}_0 = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix}$

$\nabla F(\mathbf{x}) = \begin{bmatrix} 2x_1 + 2x_2 + 1 \\ 2x_1 + 4x_2 \end{bmatrix}, \qquad \mathbf{g}_0 = \nabla F(\mathbf{x})\big|_{\mathbf{x}=\mathbf{x}_0} = \begin{bmatrix} 3 \\ 3 \end{bmatrix}$

$A = \begin{bmatrix} 2 & 2 \\ 2 & 4 \end{bmatrix}$

$\mathbf{x}_1 = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix} - \begin{bmatrix} 2 & 2 \\ 2 & 4 \end{bmatrix}^{-1}\begin{bmatrix} 3 \\ 3 \end{bmatrix} = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix} - \begin{bmatrix} 1 & -0.5 \\ -0.5 & 0.5 \end{bmatrix}\begin{bmatrix} 3 \\ 3 \end{bmatrix} = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix} - \begin{bmatrix} 1.5 \\ 0 \end{bmatrix} = \begin{bmatrix} -1 \\ 0.5 \end{bmatrix}$
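Because $F$ is quadratic, its Hessian equals $A$ everywhere and a single Newton step lands exactly on the stationary point. A minimal sketch (Python/NumPy, values from the example; solving the linear system rather than forming the inverse):

```python
import numpy as np

def grad_F(x):
    # Gradient of F(x) = x1^2 + 2*x1*x2 + 2*x2^2 + x1
    return np.array([2*x[0] + 2*x[1] + 1,
                     2*x[0] + 4*x[1]])

A = np.array([[2.0, 2.0],
              [2.0, 4.0]])        # Hessian of F (constant for a quadratic)

x = np.array([0.5, 0.5])          # x_0
# Newton step: x_{k+1} = x_k - A^{-1} g_k
x = x - np.linalg.solve(A, grad_F(x))
print(x)  # [-1.   0.5], the stationary point of F
```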
Plot: Newton's method trajectory on the contours of $F(\mathbf{x})$, both axes from $-2$ to $2$.
• The following are used for line minimization and its stopping criteria:
  – Initial bracketing
  – Line searches
    • Newton's method
    • Secant method
    • Sectioning method
Initial Bracketing
• Helps find the range that contains the relative minimum
• The assumed minimum must be bracketed within the starting interval
• Two schemes are used for this purpose
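The transcript does not name the two schemes; one common approach is a step-doubling search that expands the interval until the function turns upward. A minimal sketch (the test function, starting point, and step sizes are illustrative assumptions, not from the slides):

```python
def bracket_minimum(f, x0=0.0, step=0.1, grow=2.0, max_iter=50):
    """Expand the step until three points a < b < c satisfy
    f(b) < f(a) and f(b) < f(c), bracketing a relative minimum."""
    a, fa = x0, f(x0)
    b, fb = x0 + step, f(x0 + step)
    if fb > fa:                      # walk downhill in the other direction
        a, b, fa, fb = b, a, fb, fa
        step = -step
    for _ in range(max_iter):
        step *= grow                 # double the step each time
        c, fc = b + step, f(b + step)
        if fc > fb:                  # function turned upward: bracket found
            return (a, b, c) if a < c else (c, b, a)
        a, b, fa, fb = b, c, fb, fc
    raise RuntimeError("no bracket found")

# Illustrative quadratic with its minimum at t = 1.5
a, b, c = bracket_minimum(lambda t: (t - 1.5)**2)
print(a, b, c)  # a < b < c with f(b) below both endpoints
```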
Sectioning methods
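A widely used sectioning scheme is golden-section search, which shrinks a bracket around the minimum of a unimodal function while reusing one interior evaluation per step. A minimal sketch (the test function, interval, and tolerance are illustrative assumptions):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Shrink [a, b] about the minimum of a unimodal f, keeping the
    two interior points in the golden ratio so one is reused per step."""
    inv_phi = (math.sqrt(5) - 1) / 2           # ~0.618
    c = b - inv_phi * (b - a)                  # left interior point
    d = a + inv_phi * (b - a)                  # right interior point
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                            # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                                  # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2

# Illustrative quadratic with its minimum at t = 1.5
x_min = golden_section(lambda t: (t - 1.5)**2, 0.7, 3.1)
print(x_min)  # approx 1.5
```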