Unconstrained optimization: gradient-based algorithms (steepest descent, conjugate gradients, Newton and quasi-Newton), population-based algorithms (Nelder-Mead's sequential simplex), and stochastic algorithms

Slide 1
Title slide.

Slide 2: Unconstrained optimization
- Gradient-based algorithms: steepest descent; conjugate gradients; Newton and quasi-Newton.
- Population-based algorithms: Nelder-Mead's sequential simplex; stochastic algorithms.

Slide 3: Unconstrained local minimization
The necessity for one-dimensional searches: the most intuitive choice of the search direction s_k is the direction of steepest descent. This choice, however, is very poor. Most methods are based on the dictum that all functions of interest are locally quadratic. (A line-search sketch appears after the slides.)

Slide 4: Conjugate gradients
[Figure with unlabeled axes.] What are the unlabeled axes?

Slide 5: Newton and quasi-Newton methods
Newton's method uses the exact Hessian of the objective. Quasi-Newton methods use successive evaluations of the gradient to obtain an approximation to the Hessian or its inverse; the earliest was DFP, and currently the best known is BFGS. Like conjugate gradients, they are guaranteed to converge in n steps or fewer for a quadratic function. (A BFGS update sketch appears after the slides.)

Slide 6: Matlab fminunc
X = FMINUNC(FUN,X0,OPTIONS) minimizes with the default optimization parameters replaced by values in the structure OPTIONS, an argument created with the OPTIMSET function. See OPTIMSET for details. Used options are Display, TolX, TolFun, DerivativeCheck, Diagnostics, FunValCheck, GradObj, HessPattern, Hessian, HessMult, HessUpdate, InitialHessType, InitialHessMatrix, MaxFunEvals, MaxIter, DiffMinChange and DiffMaxChange, LargeScale, MaxPCGIter, PrecondBandWidth, TolPCG, TypicalX.

Slide 7: Rosenbrock banana function
Vanderplaats's version and my version (the latter, used in the Matlab examples below: f(x) = 100(x2 - x1^2)^2 + (1 - x1)^2).

Slide 8: Matlab output
[x,fval,exitflag,output] = fminunc(@banana,[-1.2, 1])
Warning: Gradient must be provided for trust-region algorithm; using line-search algorithm instead.
Local minimum found. Optimization completed because the size of the gradient is less than the default value of the function tolerance.
x = 1.0000  1.0000
fval = 2.8336e-011
exitflag = 1
output = iterations: 36, funcCount: 138, algorithm: 'medium-scale: Quasi-Newton line search'
How would we reduce the number of iterations? (See the gradient-supplied sketch after the slides.)

Slide 9: Sequential simplex method (Section 4.2.1)
In n-dimensional space, start with n+1 points at the vertices of a regular (e.g., equilateral) simplex. Reflect the worst point about the centroid (c.g.) of the remaining points. Read about expansion and contraction. (A reflection-step sketch appears after the slides.)

Slide 10: Matlab commands
function [y]=banana(x)
global z1
global z2
global yg
global count
y=100*(x(2)-x(1)^2)^2+(1-x(1))^2;
z1(count)=x(1);
z2(count)=x(2);
yg(count)=y;
count=count+1;

>> global z1
>> global z2
>> global yg
>> global count
>> count = 1;
>> options=optimset('MaxFunEvals',20)
>> [x,fval] = fminsearch(@banana,[-1.2, 1],options)
>> mat=[z1;z2;yg]
mat =
  Columns 1 through 8
   -1.200   -1.260   -1.200   -1.140   -1.080   -1.080   -1.020   -0.960
    1.000    1.000    1.050    1.050    1.075    1.125    1.1875   1.150
   24.20    39.64    20.05    10.81     5.16     4.498    6.244    9.058
  Columns 9 through 16
   -1.020   -1.020   -1.065   -1.125   -1.046   -1.031   -1.007   -1.013
    1.125    1.175    1.100    1.100    1.119    1.094    1.078    1.113
    4.796    5.892    4.381    7.259    4.245    4.218    4.441    4.813

Slide 11: fminsearch on the banana function
[Figure: the first simplex points plotted in the (x1, x2) plane, labeled with their function values 39.6, 24.2, 20.05, 10.81, 5.16.]

Slide 12: Next iteration
[Figure: the next simplex iteration.]

Slide 13: Completed search
[x,fval,exitflag,output] = fminsearch(@banana,[-1.2, 1])
x = 1.0000  1.0000
fval = 8.1777e-010
exitflag = 1
output = iterations: 85, funcCount: 159, algorithm: 'Nelder-Mead simplex direct search'
Why is the number of iterations large compared to the number of function evaluations (36 iterations and 138 evaluations for fminunc)?
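
Sketch for Slide 3: the one-dimensional searches can be made concrete with a minimal steepest-descent loop. This is only a sketch; the handles f and grad_f, the bracket [0, 1] passed to fminbnd, and the tolerance are my choices, not part of the slides.

function x = steepest_descent(f, grad_f, x0, tol, maxit)
% Minimal steepest descent with a one-dimensional line search (sketch).
x = x0;
for k = 1:maxit
    s = -grad_f(x);                      % steepest-descent direction
    if norm(s) < tol, break; end         % stop when the gradient is small
    phi = @(alpha) f(x + alpha*s);       % restriction of f to the line through x along s
    alpha = fminbnd(phi, 0, 1);          % one-dimensional search for the step length
    x = x + alpha*s;
end
end

On the banana function this direction zig-zags across the curved valley, which is why Slide 3 calls it a very poor choice of s_k.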
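
Sketch for Slide 5: quasi-Newton methods build the Hessian approximation from successive gradients. Here is one BFGS update written out; the variable names are mine, and safeguards (e.g., skipping the update when y'*s <= 0) are omitted.

% One BFGS update of the Hessian approximation B (sketch).
s = x_new - x_old;                       % step just taken
y = g_new - g_old;                       % change in the gradient over that step
B = B - (B*s)*(B*s)'/(s'*B*s) + (y*y')/(y'*s);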
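
Sketch for Slide 8: one answer to the question is to supply the analytic gradient, so fminunc does not have to estimate it by finite differences. This is a sketch, with the gradient differentiated from the banana function of Slide 10; the name banana_grad and the option settings are mine, not the slides'.

function [y, g] = banana_grad(x)
% Rosenbrock banana function and its analytic gradient.
y = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));   % df/dx1
      200*(x(2) - x(1)^2)];                      % df/dx2

>> options = optimset('GradObj','on');   % tell fminunc that the function returns its gradient
>> [x,fval,exitflag,output] = fminunc(@banana_grad,[-1.2, 1],options)

With exact gradients the evaluations spent on finite-difference gradient estimates disappear, so funcCount drops sharply and the iteration count typically falls as well.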
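
Sketch for Slide 9: reflecting the worst vertex of a simplex through the centroid of the remaining ones. The function name reflect_worst and the one-vertex-per-row layout are assumptions; expansion and contraction are omitted, as the slide suggests reading about them separately.

function X = reflect_worst(X, f)
% X: (n+1)-by-n matrix of simplex vertices, one per row; f: objective handle.
fvals = zeros(size(X,1), 1);
for i = 1:size(X,1)
    fvals(i) = f(X(i,:));                % evaluate each vertex
end
[~, worst] = max(fvals);                 % index of the worst (largest) vertex
others = X([1:worst-1, worst+1:end], :); % the remaining n vertices
cg = mean(others, 1);                    % their centroid (the "c.g." on the slide)
X(worst, :) = 2*cg - X(worst, :);        % reflect the worst vertex through the centroid
end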