
Evolutionary Computation: Constraint Handling
Shan He, School for Computational Science, University of Birmingham
Modules 02411 and 22313: EC


Page 1

Evolutionary Computation

Evolutionary Computation: Constraint Handling

Shan He

School for Computational Science, University of Birmingham

Modules 02411 and 22313: EC

Page 2

What we have learned and what we will learn

I What we have learned:
  I Evolutionary Algorithms for optimisation
  I Binary Genetic Algorithm
  I Continuous (real-coded) GA
  I Selection schemes
  I Tutorial: Parameter estimation using GA
  I Swarm Intelligence and Particle Swarm Optimiser

I What we will learn this week:
  I Constraint handling
  I Multi-objective optimisation

Page 3

Outline

Outline of Topics

Constrained Optimisation

Constrained Optimisation Techniques
  Penalty Function
  Stochastic Ranking
  Repair Approach

Page 4

Constrained Optimisation

What Is Constrained Optimisation?

I The general problem we consider here can be described as:

$$\min_{x} f(x)$$

subject to

$$g_i(x) \le 0, \quad i = 1, \cdots, m$$
$$h_j(x) = 0, \quad j = 1, \cdots, p$$

where $x$ is the $n$-dimensional vector $x = (x_1, x_2, \cdots, x_n)$; $f(x)$ is the objective function; $g_i(x)$ are the inequality constraints; and $h_j(x)$ are the equality constraints.

I Denote the whole search space as $S$ and the feasible space as $F$, with $F \subseteq S$. It is important to note that the global optimum in $F$ might not be the same as that in $S$.
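To make the notation concrete, here is a minimal Python sketch of this general form for a toy problem; the names `objective`, `inequality_constraints`, `equality_constraints` and `is_feasible` are mine, not from the slides.

```python
import numpy as np

# Toy problem in the general form above:
# minimise f(x) subject to g_i(x) <= 0 and h_j(x) = 0.

def objective(x):
    return float(np.sum(x ** 2))               # f(x)

inequality_constraints = [
    lambda x: 1.0 - x[0] - x[1],               # g_1(x) <= 0, i.e. x1 + x2 >= 1
]
equality_constraints = [
    lambda x: x[0] - 2.0 * x[1],               # h_1(x) = 0
]

def is_feasible(x, eps=1e-6):
    """x lies in the feasible space F if every g_i(x) <= 0 and every |h_j(x)| <= eps."""
    return (all(g(x) <= 0.0 for g in inequality_constraints)
            and all(abs(h(x)) <= eps for h in equality_constraints))

x = np.array([0.7, 0.35])
print(objective(x), is_feasible(x))            # 0.6125 True
```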

Page 5

Constrained Optimisation

Different Types of Constraints

Constraints
  I Linear: equality and inequality
  I Non-linear: equality and inequality

I Linear constraints are relatively easy to deal with

I Non-linear constraints can be hard to deal with

Page 6

Constrained Optimisation

Constrained Optimisation Example: Spring design

I Aims to minimize the weight of a tension/compression spring. All design variables are continuous.

I There are four constraints which relate to minimum deflection, shear stress, surge frequency, and limits on outside diameter and design variables.

I Let the wire diameter d = x1, the mean coil diameter D = x2 and the number of active coils N = x3.

(Figure: a tension/compression spring with the wire diameter d, mean coil diameter D, free length and displacement labelled.)

Page 7

Constrained Optimisation

Constrained Optimisation Example: Spring design

I Minimize
$$f(X) = (x_3 + 2) x_2 x_1^2 \qquad (1)$$

subject to:

$$g_1(X) = 1 - \frac{x_2^3 x_3}{71785 x_1^4} \le 0 \qquad (2)$$

$$g_2(X) = \frac{4x_2^2 - x_1 x_2}{12566(x_2 x_1^3 - x_1^4)} + \frac{1}{5108 x_1^2} - 1 \le 0 \qquad (3)$$

$$g_3(X) = 1 - \frac{140.45 x_1}{x_2^2 x_3} \le 0 \qquad (4)$$

$$g_4(X) = \frac{x_2 + x_1}{1.5} - 1 \le 0 \qquad (5)$$
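As a rough illustration, the objective (1) and constraints (2)-(5) can be coded directly; a minimal sketch assuming the variable order x = (x1, x2, x3) = (d, D, N) from the previous slide (the function names and the example design values are mine):

```python
import numpy as np

def spring_weight(x):
    """Objective (1): weight of the tension/compression spring."""
    x1, x2, x3 = x                              # d, D, N
    return (x3 + 2.0) * x2 * x1 ** 2

def spring_constraints(x):
    """Constraints (2)-(5), each written as g_i(x) <= 0."""
    x1, x2, x3 = x
    g1 = 1.0 - (x2 ** 3 * x3) / (71785.0 * x1 ** 4)
    g2 = ((4.0 * x2 ** 2 - x1 * x2) / (12566.0 * (x2 * x1 ** 3 - x1 ** 4))
          + 1.0 / (5108.0 * x1 ** 2) - 1.0)
    g3 = 1.0 - (140.45 * x1) / (x2 ** 2 * x3)
    g4 = (x2 + x1) / 1.5 - 1.0
    return np.array([g1, g2, g3, g4])

# Evaluate an arbitrary candidate design (values purely illustrative):
# a negative entry means the corresponding constraint is satisfied.
x = np.array([0.05, 0.40, 10.0])
print(spring_weight(x), spring_constraints(x))
```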

Page 8

Constrained Optimisation Techniques

Constraint Handling Techniques

I The penalty function approach converts a constrained problem into an unconstrained one by introducing a penalty function into the objective function.

I The repair approach maps (repairs) an infeasible solution into a feasible one.

I The purist approach rejects all infeasible solutions during the search.

I The separatist approach considers the objective function and constraints separately.

I The hybrid approach mixes two or more different constraint handling techniques.

Page 9

Constrained Optimisation Techniques

Penalty Function

Penalty Function Approach: Introduction

I New Objective Function = Original Objective Function + Penalty Coefficient * Degree of Constraint Violation

I The general form of the exterior penalty function method:

$$f'(x) = f(x) + \sum_{i=1}^{m} r_i G_i(x) + \sum_{j=1}^{p} c_j H_j(x)$$

where $f'(x)$ is the new objective function to be minimised, $f(x)$ is the original objective function, $r_i$ and $c_j$ are penalty factors (coefficients), and

$$G_i(x) = (\max\{0, g_i(x)\})^{\beta}, \qquad H_j(x) = |h_j(x)|^{\gamma},$$

where $\beta$ and $\gamma$ are usually chosen as 1 or 2.
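A minimal sketch of this exterior penalty transformation on a toy problem, with β = γ = 2; the helper name `penalised` and the example constraint are my own, not from the lecture:

```python
def penalised(f, gs, hs, r, c, beta=2.0, gamma=2.0):
    """Build f'(x) = f(x) + sum_i r_i * G_i(x) + sum_j c_j * H_j(x)."""
    def f_prime(x):
        G = sum(ri * max(0.0, g(x)) ** beta for ri, g in zip(r, gs))
        H = sum(cj * abs(h(x)) ** gamma for cj, h in zip(c, hs))
        return f(x) + G + H
    return f_prime

# Toy problem: minimise x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0.
f = lambda x: x ** 2
f_prime = penalised(f, gs=[lambda x: 1.0 - x], hs=[], r=[100.0], c=[])

print(f_prime(0.5))   # infeasible point: 0.25 + 100 * 0.5^2 = 25.25
print(f_prime(1.2))   # feasible point: penalty term is zero, so 1.44
```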

Page 10

Constrained Optimisation Techniques

Penalty Function

Penalty Function Approach: Techniques

I Static Penalties: The penalty function is pre-defined and fixed during evolution.

I Dynamic Penalties: The penalty function changes according to a pre-defined sequence, which often depends on the generation number.

I Adaptive and Self-Adaptive Penalties: The penalty function changes adaptively. There is no fixed sequence to follow.

Page 11

Constrained Optimisation Techniques

Penalty Function

Static Penalty Functions

I Static penalty functions general form:

$$f'(x) = f(x) + \sum_{i=1}^{m} r_i (G_i(x))^2$$

where the $r_i$ are predefined and fixed.

I Equality constraints can be converted into inequality ones:

$$h_j(x) = 0 \implies |h_j(x)| - \varepsilon \le 0$$

where $\varepsilon > 0$ is a small number.

I Simple and easy to implement.

I Requires rich domain knowledge to set the $r_i$'s.

I The $r_i$ ($i = 1, \cdots, m + p$) can be divided into a number of different levels. When to use which is determined by a set of heuristic rules.
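A small sketch of this static variant, including the ε-conversion of an equality constraint into an inequality; the toy problem, coefficient value and helper names are illustrative assumptions:

```python
# Convert h(x) = 0 into the inequality |h(x)| - eps <= 0, then apply a
# static squared penalty with a predefined, fixed coefficient r.
eps = 1e-4
h = lambda x: x[0] + x[1] - 1.0                 # equality constraint h(x) = 0
g_from_h = lambda x: abs(h(x)) - eps            # derived inequality constraint

def static_penalised(x, f, gs, r):
    return f(x) + sum(ri * max(0.0, g(x)) ** 2 for ri, g in zip(r, gs))

f = lambda x: x[0] ** 2 + x[1] ** 2
print(static_penalised([0.8, 0.1], f, [g_from_h], r=[1000.0]))
```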

Page 12

Constrained Optimisation Techniques

Penalty Function

Dynamic Penalty Functions

I Dynamic Penalties general form:

$$f'(x) = f(x) + r(t) \sum_{i=1}^{m} G_i^2(x) + c(t) \sum_{j=1}^{p} H_j^2(x)$$

where $r(t)$ and $c(t)$ are two penalty coefficients.

I General principle for penalty coefficients: The larger the generation number, the larger the penalty coefficient. Why?

I Polynomials: $r(t) = \sum_{k=1}^{N} a_{k-1} t^{k-1}$, $c(t) = \sum_{k=1}^{N} b_{k-1} t^{k-1}$, where $a_k$ and $b_k$ are user-defined parameters.

I Exponentials: $r(t) = e^{at}$, $c(t) = e^{bt}$, where $a$ and $b$ are user-defined parameters.

I Hybrid: $r(t) = e^{\sum_{k=1}^{N} a_{k-1} t^{k-1}}$, $c(t) = e^{\sum_{k=1}^{N} b_{k-1} t^{k-1}}$
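A brief sketch of how such schedules might look in code; the particular constants and the helper names are illustrative assumptions rather than values from the slides:

```python
import math

def r_polynomial(t, a=(1.0, 0.5, 0.05)):
    """Polynomial schedule r(t) = sum_k a_{k-1} t^{k-1} (quadratic here)."""
    return sum(ak * t ** k for k, ak in enumerate(a))

def r_exponential(t, a=0.05):
    """Exponential schedule r(t) = e^{a t}."""
    return math.exp(a * t)

def dynamic_penalised(f_val, G_sq_sum, H_sq_sum, t, a=0.05, b=0.05):
    """f'(x) = f(x) + r(t) * sum_i G_i^2(x) + c(t) * sum_j H_j^2(x),
    with exponential schedules r(t) = e^{a t} and c(t) = e^{b t}."""
    return f_val + math.exp(a * t) * G_sq_sum + math.exp(b * t) * H_sq_sum

# Penalties grow with the generation number t, so late infeasibility costs more.
for t in (1, 50, 100):
    print(t, round(r_polynomial(t), 2), round(r_exponential(t), 2),
          round(dynamic_penalised(1.0, 0.2, 0.0, t), 2))
```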

Page 13

Constrained Optimisation Techniques

Penalty Function

Fitness Function and Selection

I Let $\Phi(x) = f(x) + rG(x)$, where $G(x) = \sum_{i=1}^{m} G_i(x)$ and $G_i(x) = \max\{0, g_i(x)\}$.

I Given two individuals $x_1$ and $x_2$, their fitness values will be determined by $\Phi(x)$.

I Because fitness values are used primarily in selection: changing fitness $\iff$ changing selection probabilities.

I $\Phi(x_1) < \Phi(x_2)$ means $f(x_1) + rG(x_1) < f(x_2) + rG(x_2)$:
  I $f(x_1) \le f(x_2)$ and $G(x_1) \le G(x_2)$: $r$ has no impact on the comparison.
  I $f(x_1) < f(x_2)$ and $G(x_1) > G(x_2)$: increasing $r$ will eventually change the comparison.
  I $f(x_1) > f(x_2)$ and $G(x_1) < G(x_2)$: decreasing $r$ will eventually change the comparison.

I In essence, different $r$ lead to different rankings of the individuals in the population, as the small example below illustrates.
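A tiny numeric illustration of this last point: two made-up individuals whose ordering under Φ flips as r grows.

```python
# x1 has the better objective but violates a constraint; x2 is feasible but worse.
f1, G1 = 1.0, 0.5          # f(x1), G(x1)
f2, G2 = 2.0, 0.0          # f(x2), G(x2)

for r in (0.1, 1.0, 10.0):
    phi1, phi2 = f1 + r * G1, f2 + r * G2
    print(r, phi1 < phi2)  # True: x1 ranked ahead of x2; flips to False as r grows
```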

Page 14

Constrained Optimisation Techniques

Penalty Function

Penalties and Fitness Landscape Transformation

I Different penalty functions lead to different new objective functions. Is transformation $\Phi_2$ a good choice?

(Figure from Xin Yao's notes, "What Would be an Appropriate Penalty": the original objective f(x) and three penalised objectives Φ1(x), Φ2(x) and Φ3(x) plotted against x, with the infeasible regions and the global feasible optimum marked. Is Φ2(x) a good choice? Why or why not?)

The balance between objective function and penalties is important.

Page 15

Constrained Optimisation Techniques

Penalty Function

Penalty Functions Demystified

I Penalty function essentially performs:
  I Fitness transformation
  I Rank change (selection change)

I Why not change the rank directly in an EA?

Page 16

Constrained Optimisation Techniques

Stochastic Ranking

Stochastic Ranking: Self-adaptive

I Stochastic ranking is a special rank-based selection scheme that handles constraints. There is no need to use any penalty functions. It's self-adaptive since there are few parameters to set.

Stochastic Ranking Algorithm

DO the following until there is no more change in the ranking:
  FOR i := 1 TO M − 1 DO
    u := U(0, 1)  // u is a uniformly distributed random number
    IF G(xi) = G(xi+1) = 0 OR u ≤ Pf THEN
      IF f(xi) > f(xi+1) THEN swap(xi, xi+1)
    ELSE
      IF G(xi) > G(xi+1) THEN swap(xi, xi+1)

I M is the number of individuals and Pf indicates the probability of using the objective function for comparison in ranking.
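A minimal Python sketch that follows the pseudocode above, ranking the indices of a population from precomputed objective values and total constraint violations; the function name and the example values are mine:

```python
import random

def stochastic_ranking(f, G, Pf=0.45):
    """Rank a population by stochastic ranking, following the pseudocode above.

    f : objective values, G : total constraint violations (0 means feasible).
    Returns the indices of the individuals ordered from best to worst.
    """
    idx = list(range(len(f)))
    M = len(idx)
    changed = True
    while changed:                           # stop when a full sweep makes no swap
        changed = False
        for i in range(M - 1):
            a, b = idx[i], idx[i + 1]
            u = random.random()
            if (G[a] == 0 and G[b] == 0) or u <= Pf:
                swap = f[a] > f[b]           # compare on the objective function
            else:
                swap = G[a] > G[b]           # compare on the constraint violation
            if swap:
                idx[i], idx[i + 1] = b, a
                changed = True
    return idx

f_vals = [3.0, 1.0, 2.0, 0.5]
G_vals = [0.0, 0.0, 0.4, 1.2]
print(stochastic_ranking(f_vals, G_vals))    # ordering varies run to run
```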

Page 17

Constrained Optimisation Techniques

Stochastic Ranking

The role of Pf

I Pf > 0.5: Most comparisons are based on f(x) only. Infeasible solutions are likely to occur.

I Pf < 0.5: Most comparisons are based on G(x) only. Infeasible solutions are less likely to occur, but the solutions might be poor.

I In practice, Pf is often set between 0.45 and 0.5.

Page 18

Constrained Optimisation Techniques

Repair Approach

Repair Approach to Constraint Handling

Instead of modifying an EA or fitness function, infeasible individuals can be repaired into feasible ones.

(Figure from Xin Yao's notes: the infeasible individual Is lies in the infeasible region and is "repaired" towards a feasible individual Ir; the repaired individual lies inside a feasible region.)

Page 19

Constrained Optimisation Techniques

Repair Approach

Repairing Infeasible Individuals

Let Is be an infeasible individual and Ir a feasible individual.

(Figure from Xin Yao's notes: Is is drawn from the population of evolving individuals (feasible or infeasible), while Ir is drawn from a population of feasible reference individuals (changing but not evolving).)

Page 20

Constrained Optimisation Techniques

Repair Approach

Repairing Algorithm

Repairing Algorithm

Select a reference individual Ir.
DO the following until the candidate individual zi is feasible:
  Create a sequence of candidate individuals zi between Is and Ir: zi = ai Is + (1 − ai) Ir, where 0 < ai < 1.
Let this zi be the repaired individual of Is.
Its objective function value will be used as Is's fitness value.
Replace Is by zi with probability Pr.
Let Is's fitness value be the fitness value of zi even if Is is not replaced by zi.
If zi is better than Ir, replace Ir by zi.
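A minimal Python sketch of one reading of this repair step; the feasibility test, the uniform choice of ai, the max_tries cap and the toy problem are my own assumptions, and the final "replace Ir if zi is better" update is omitted for brevity:

```python
import random
import numpy as np

def repair(Is, Ir, is_feasible, f, Pr=0.2, max_tries=100):
    """Repair the infeasible Is towards the feasible reference Ir.

    Samples candidates z = a*Is + (1-a)*Ir until one is feasible; z's objective
    value is then used as Is's fitness whether or not Is is replaced by z.
    """
    for _ in range(max_tries):
        a = random.random()                        # 0 < a < 1, chosen uniformly here
        z = a * Is + (1.0 - a) * Ir
        if is_feasible(z):
            fitness = f(z)                          # assign z's objective value to Is
            kept = z if random.random() < Pr else Is   # replace Is with probability Pr
            return kept, fitness
    return Ir, f(Ir)                                # fallback if no feasible z found

# Toy example: feasible region is x >= 1 component-wise; minimise the sum of squares.
is_feasible = lambda x: bool(np.all(x >= 1.0))
f = lambda x: float(np.sum(x ** 2))
Is, Ir = np.array([0.2, 0.5]), np.array([2.0, 2.0])
print(repair(Is, Ir, is_feasible, f))
```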

Page 21

Constrained Optimisation Techniques

Repair Approach

Repairing Infeasible Individuals

Let Is be an infeasible individual and Ir a feasible individual.

(Figure from Xin Yao's notes, "Repairing Algorithm: Visualisation": candidate individuals z1, z2, ..., zi are generated on the line between the infeasible Is and the feasible reference Ir; a feasible zi inside the feasible region becomes the repaired individual.)

Page 22

Constrained Optimisation Techniques

Repair Approach

Repairing Algorithm: Implementation Issues

I How to find initial reference individuals?
  I Preliminary exploration
  I Human knowledge

I How to select Ir
  I Uniformly at random
  I According to the fitness of Ir
  I According to the distance between Ir and Is

I How to determine ai
  I Uniformly at random between 0 and 1
  I Using a fixed sequence, e.g., 1/2, 1/4, ...

I How to choose Pr: a small number, usually < 0.5

Page 23

Constrained Optimisation Techniques

Repair Approach

Conclusion

I Adding a penalty term to the objective function is equivalent to changing the fitness function, which is in turn equivalent to changing selection probabilities.

I It is easier and more effective to change the selection probabilities directly and explicitly. Stochastic ranking enables us to do this.

I There are other constraint handling techniques than the penalty method.

I We have covered numerical constraints only. We have not dealt with constraints in a combinatorial space.

Page 24

Constrained Optimisation Techniques

Repair Approach

Further reading

I T. P. Runarsson, X. Yao, Stochastic ranking for constrained evolutionary optimization, IEEE Transactions on Evolutionary Computation, 4(3):284-294, September 2000.

I T. Runarsson and X. Yao, Search Bias in Constrained Evolutionary Optimization, IEEE Transactions on Systems, Man, and Cybernetics, Part C, 35(2):233-243, May 2005.

I S. He, E. Prempain and Q. H. Wu, An improved particle swarm optimizer for mechanical design optimization problems, Engineering Optimization, 36(5):585-605, 2004.

I E. Mezura-Montes, C. A. Coello Coello, Constraint-handling in nature-inspired numerical optimization: Past, present and future, Swarm and Evolutionary Computation, 2011.