Chapter 4: Evolutionary Computation Implementations.


Transcript of Chapter 4: Evolutionary Computation Implementations.

Page 1: Chapter 4: Evolutionary Computation Implementations.

Chapter 4: Evolutionary Computation Implementations

Page 2: Chapter 4: Evolutionary Computation Implementations.

Evolutionary Computation Implementations: Outline

•Genetic Algorithm
 - Mainly a canonical version
 - Crossover: one-point, two-point, uniform
 - Selection: roulette wheel, tournament, ranking
 - Five benchmark functions

•Particle Swarm Optimization
 - Global and local versions
 - Multiple-swarm capability
 - Same benchmark functions as GA, plus three for constraint satisfaction

Page 3: Chapter 4: Evolutionary Computation Implementations.
Page 4: Chapter 4: Evolutionary Computation Implementations.

EC Implementation Issues (Generic)

•Homogeneous vs. heterogeneous representation

•Online adaptation vs. offline adaptation

•Static adaptation versus dynamic adaptation

•Flowcharts versus finite state machines

Page 5: Chapter 4: Evolutionary Computation Implementations.

Homogeneous vs. Heterogeneous Representation

• Homogeneous representation

 - Used traditionally
 - Simple; can use existing EC operators
 - Binary is the traditional coding for GAs; it's simple and general
 - Use integer representation for discrete-valued parameters
 - Use real values to represent real-valued parameters if possible

• Heterogeneous representation
 - Most natural way to represent the problem
 - Real values represent real parameters; integers or binary strings represent discrete parameters
 - Complexity of evolutionary operators increases
 - Representation-specific operators needed

Page 6: Chapter 4: Evolutionary Computation Implementations.

Binary Representations

•Advantages
 - Simple and popular
 - Use standard operators

•Disadvantages
 - Can result in long chromosomes
 - Can introduce inaccuracies

Page 7: Chapter 4: Evolutionary Computation Implementations.

Final Thoughts on Representation

•The best representation is usually problem-dependent.

•Representation is often a major part of solving a problem.

•In general, represent a problem the way it appears in the system implementation.

Page 8: Chapter 4: Evolutionary Computation Implementations.

Population Adaptation Versus Individual Adaptation

•Individual: Most commonly used. Pittsburgh approach; each chromosome represents the entire problem. Performance of each candidate solution is proportional to the fitness of its representation.

•Population: Used when system can’t be evaluated offline. Michigan approach: entire population represents one solution. (Only one system evaluated each generation.) Cooperation and competition among all components of the system.

Page 9: Chapter 4: Evolutionary Computation Implementations.

Static Adaptation Versus Dynamic Adaptation

•Static: Most commonly used. Algorithms have fixed (or pre-determined) values.

•Adaptive: Can be done at
 - Environment level
 - Population level (most common, if done)
 - Individual level
 - Component level

Balance exploration and exploitation.

Page 10: Chapter 4: Evolutionary Computation Implementations.

Flowcharts Versus Finite State Machines

•Flowcharts: Easy to understand and use. Traditionally used; best for simpler systems

•Finite State Machine Diagrams: Used for systems with frequent user interaction, and for more complex systems. More suited to structured systems, and when multi-tasking is involved.

Page 11: Chapter 4: Evolutionary Computation Implementations.

Handling Multiple Similar Cases

•If two possibilities, use if-then

•If three or more, use switch (with cases); or function pointer (order is critical)

Page 12: Chapter 4: Evolutionary Computation Implementations.
Page 13: Chapter 4: Evolutionary Computation Implementations.

Allocating and Freeing Memory Space

•Arrays and vectors should be dynamically configured

•Allocate memory: calloc

•Release memory: free

Page 14: Chapter 4: Evolutionary Computation Implementations.

Error Checking

•Use frequently

•Use to debug

•Can use assert() [remove when program debugged]

Page 15: Chapter 4: Evolutionary Computation Implementations.

Genetic Algorithm Implementation

•Essentially a canonical GA that utilizes crossover and mutation

•Uses binary representation

•Searches for optima with real value parameters

•Several benchmark functions are included

Page 16: Chapter 4: Evolutionary Computation Implementations.
Page 17: Chapter 4: Evolutionary Computation Implementations.

Data Types

Enumeration data type used for selection types, crossover types, and to select the test function.

C has no data type for a 'bit', so the unsigned character type is used for the population. A chromosome bit can be stored either as a packed bit or as a whole byte; the computational complexity trade-offs of each choice must be addressed.

Page 18: Chapter 4: Evolutionary Computation Implementations.
Page 19: Chapter 4: Evolutionary Computation Implementations.
Page 20: Chapter 4: Evolutionary Computation Implementations.

The GA main() Routine

The GA_Start_Up routine:
 - Reads problem-related parameters, such as the number of bits per parameter, from the input file
 - Allocates memory
 - Initializes the population

The GA_Main_Loop routine runs the GA algorithm:
 - Evaluation
 - Selection
 - Crossover
 - Mutation

The GA_Clean_Up routine:
 - Stores results in an output file
 - De-allocates memory

Page 21: Chapter 4: Evolutionary Computation Implementations.

GA Selection Mechanisms

•All use elitism

•Proportional selection – roulette wheel that uses fitness shifting and keeps fitnesses positive

•Binary tournament selection – better of two randomly-selected individuals

•Ranking selection – evenly-spaced fitness values; then like roulette wheel

In ga_selection() routine

Page 22: Chapter 4: Evolutionary Computation Implementations.

Mutate According to Bit Position Flag

 - When 0, bit-by-bit consideration
 - When 1, mutation is an approximation of a Gaussian

Probability of mutation m_b varies with bit position:

m_b = (m_0 / √(2π)) e^(−b²/2)

where b = 0 for the least significant bit, 1 for the next, etc., and m_0 is the value in the run file.

Bit position is calculated for each variable. The mutation rate for the first bit (b = 0) is thus about 0.4 times the value in the run file.

(This mutation is similar to that carried out in EP and ES, i.e., Gaussian.)

Page 23: Chapter 4: Evolutionary Computation Implementations.

Crossover Flag

0: One-point crossover

1: Uniform crossover

2: Two-point crossover

Page 24: Chapter 4: Evolutionary Computation Implementations.

GA.RUN (annotated):

result.dat   result file name
10           dimension
4            function type (0: F6; 1: PARABOLIC; 2: ROSENBROCK; 3: RASTRIGRIN; 4: GRIEWANK)
15000        maximum number of iterations
16           bits per parameter
20           population size
0.75         rate of crossover
0.005        rate of mutation
0.02         termination criterion (not used in this implementation, but must be present)
0            mutation flag (0: base mutation; 1: bit-position mutation)
2            crossover operator (0: one-point; 1: uniform; 2: two-point)
1            selection operator (0: roulette; 1: binary tournament; 2: ranking)

To run the implementation, from the directory containing ga.exe and the run file:

C:\> ga ga.run

Page 25: Chapter 4: Evolutionary Computation Implementations.

Result file: part 1 of 2

resultFile .......................... result
function type ....................... 4
input dim ........................... 10
max. No. generation ................. 15000
bits for each Para .................. 16
boundary value ...................... 600.000000
popu_size ........................... 20
individual length ................... 160
crossover rate ...................... 0.750000
mutation rate ....................... 0.005000
term. criterion ..................... 0.020000
flag_m (1: bit position; 0: cons) ... 0
c_type (0: one, 1: unif, 2: two) .... 2
selection type ...................... 1

generation: 15000
best fitness: -0.067105
variance: 22.179015

Page 26: Chapter 4: Evolutionary Computation Implementations.

fitness values:
fit[ 0]: -0.067105    fit[ 1]: -3.640442    fit[ 2]: -0.423313    fit[ 3]: -0.067105
fit[ 4]: -0.067105    fit[ 5]: -0.153248    fit[ 6]: -1.761599    fit[ 7]: -0.067105
fit[ 8]: -3.241397    fit[ 9]: -0.089210    fit[10]: -0.935671    fit[11]: -0.935671
fit[12]: -1.987072    fit[13]: -1.390572    fit[14]: -0.279645    fit[15]: -23.843609
fit[16]: -1.497647    fit[17]: -1.263834    fit[18]: -90.743202   fit[19]: -51.928169

parameters:
para[ 0]: 3.140307    para[ 1]: 4.440375    para[ 2]: 5.410849    para[ 3]: 0.009155
para[ 4]: -7.003891   para[ 5]: -0.009155   para[ 6]: 8.194095    para[ 7]: 0.009155
para[ 8]: 9.365988    para[ 9]: -0.009155

begin time at: Mon Oct 01 08:35:14 2001
finish time at: Mon Oct 01 08:36:14 2001

Result file: part 2 of 2

Page 27: Chapter 4: Evolutionary Computation Implementations.

PSO Implementation

•Basic PSO as previously described is implemented first

•A multi-swarm version (co-evolutionary PSO) is also implemented

•The implementation is based on a state machine
 - Arrows represent transitions
 - Transition labels indicate the trigger for each transition

•Can initialize symmetrically or asymmetrically

Page 28: Chapter 4: Evolutionary Computation Implementations.

PSO Attributes

•Symmetrical or nonsymmetrical initialization

•Minimize or maximize

•Choice of five functions

•Inertia weight can be constant, linearly decreasing, or noisy

•Choose population size

•Specify number of dimensions (variables)

Page 29: Chapter 4: Evolutionary Computation Implementations.

PSO State Machine

•Nine states

•A state handler performs action until state transition

•State machine runs until it reaches PSOS_DONE

Page 30: Chapter 4: Evolutionary Computation Implementations.

PSO State Diagram

Page 31: Chapter 4: Evolutionary Computation Implementations.

Definitions of States and Data Types

Page 32: Chapter 4: Evolutionary Computation Implementations.

Definitions of States and Data Types, Cont'd.

Page 33: Chapter 4: Evolutionary Computation Implementations.

State Handling Routines

•State handling routine called depends on current state

•The routine runs until its exit condition is met, e.g., the maximum population index is reached

Page 34: Chapter 4: Evolutionary Computation Implementations.

PSO main() Routine

•Simple

•Startup: reads parameters, and allocates memory to dynamic variables

•Cleanup: stores results and de-allocates memory

Page 35: Chapter 4: Evolutionary Computation Implementations.

The Co-Evolutionary PSO

•Can use for problems with multiple constraints

•Uses augmented Lagrangian method to convert problem into min and max problems

 - One solves the min problem with the max problem as a fixed environment
 - The other solves the max problem with the min problem as a fixed environment

Page 36: Chapter 4: Evolutionary Computation Implementations.

Co-Evolutionary PSO Procedure

1. Initialize two PSOs

2. Run first PSO for max_gen_1 generations

3. If not first cycle, evaluate the pbest values for second PSO

4. Run second PSO for max_gen_2 generations

5. Re-evaluate pbest values for first PSO

6. Loop to step 2 until termination criterion is met

Page 37: Chapter 4: Evolutionary Computation Implementations.

Augmented Lagrangian

u = F(x, y, z) + Σi λi Gi(x, y, z)

where F(x, y, z) is the function to be minimized, the λi are Lagrange's multipliers, and the Gi are the constraints.

Page 38: Chapter 4: Evolutionary Computation Implementations.

Method of Lagrange Multipliers (Constrained Optimization)

Example: Suppose a nuclear reactor is to have the shape of a cylinder of radius R and height H. Neutron diffusion theory tells us that such a reactor must satisfy the constraint

(2.405/R)² + (π/H)² = k,  where k is a constant.

We would like to minimize the volume of the reactor,

V = πR²H

By using the equations above, setting the partial derivatives of the Lagrangian L = πR²H − λ[(2.405/R)² + (π/H)²] to zero gives

2πRH + 2λ(2.405)²/R³ = 0
πR² + 2λπ²/H³ = 0

By multiplying the first equation by R/2 and the second by H, you should obtain

πR²H = −λ(2.405)²/R² = −2λπ²/H²,  so  H = (√2 π / 2.405) R

Page 39: Chapter 4: Evolutionary Computation Implementations.

Co-Evolutionary PSO Example

1st PSO: each population member is a vector of elements (variables); run as a minimization problem.

2nd PSO: each population member is a vector of λ values in [0, 1]; run as a maximization problem.

Process:
1. Run the first PSO for max_gen_1 generations (e.g., 10); the fitness of a particle is the maximum obtained with any λ vector (λ values are fixed).
2. If not the first cycle, re-calculate the pbests for the 2nd PSO.
3. Run the second PSO for max_gen_2 generations; optimize with respect to the λ values in the 2nd population; variable values are fixed.
4. Recalculate pbest values for the first PSO.
5. Increment the cycle count and go to step 1 if not at max cycles.

Page 40: Chapter 4: Evolutionary Computation Implementations.

Benchmark Problems

•For all benchmark problems, population sizes set to 40 and 30

•10 generations per PSO per cycle

•Different numbers of cycles tested: 40, 80, and 120

•In book, linearly decreasing inertia weight used

•50 runs (to max number of cycles) done for each combination of settings

Page 41: Chapter 4: Evolutionary Computation Implementations.

State Machine for Multi-PSO Version

typedef enum PSO_State_Tag {
    PSO_UPDATE_INERTIA_WEIGHT,   // Update inertia weight
    PSO_EVALUATE,                // Evaluate particles
    PSO_UPDATE_GLOBAL_BEST,      // Update global best
    PSO_UPDATE_LOCAL_BEST,       // Update local best
    PSO_UPDATE_VELOCITY,         // Update particle's velocity
    PSO_UPDATE_POSITION,         // Update particle's position
    PSO_GOAL_REACH_JUDGE,        // Judge whether the goal is reached
    PSO_NEXT_GENERATION,         // Move to the next generation
    PSO_UPDATE_PBEST_EACH_CYCLE, // Update pbest each cycle for co-PSO,
                                 //   due to the changed environment
    PSO_NEXT_PSO,                // Move to the next PSO in the same cycle,
                                 //   or the first PSO in the next cycle
    PSOS_DONE,                   // Finish one cycle of PSOs
    NUM_PSO_STATES               // Total number of PSO states
} PSO_State_Type;

Page 42: Chapter 4: Evolutionary Computation Implementations.

Multi-PSOs State Diagram

Page 43: Chapter 4: Evolutionary Computation Implementations.

PSO-Evaluate for Multi-PSOs

For the co-evolutionary PSO, each PSO passes its function type to the evaluate_functions() routine to call its corresponding function to evaluate the PSO’s performance. For example, if the problem to be solved is the G7 problem, one PSO for solving the minimization problem calls G7_MIN(), and the other PSO for solving maximization problem will call G7_MAX().

Page 44: Chapter 4: Evolutionary Computation Implementations.

G1 Problem

Minimize

f(x) = 5x1 + 5x2 + 5x3 + 5x4 − 5 Σ(i=1..4) xi² − Σ(i=5..13) xi

subject to

2x1 + 2x2 + x10 + x11 ≤ 10
2x1 + 2x3 + x10 + x12 ≤ 10
2x2 + 2x3 + x11 + x12 ≤ 10
−8x1 + x10 ≤ 0
−8x2 + x11 ≤ 0
−8x3 + x12 ≤ 0
−2x4 − x5 + x10 ≤ 0
−2x6 − x7 + x11 ≤ 0
−2x8 − x9 + x12 ≤ 0

where

0 ≤ xi ≤ 1, i = 1, ..., 9
0 ≤ xi ≤ 100, i = 10, 11, 12
0 ≤ x13 ≤ 1

The global minimum is known to be x* = (1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 3, 3, 1), with f(x*) = -15.

Page 45: Chapter 4: Evolutionary Computation Implementations.

For G1 Problem

There are 13 variables, x1, ..., x13, so the dimension of swarm #1 is 13.

There are 9 constraints, so there are 9 multipliers λ1, ..., λ9, and the dimension of swarm #2 is 9.

Note the constraints on the xi's, and that 0 ≤ λi ≤ 1.

For both swarms, the function that is evaluated is the augmented Lagrangian.

Page 46: Chapter 4: Evolutionary Computation Implementations.

Sample PSOS Run File, Part 1

2      # of PSOs
1      update pbest each cycle flag
300    total number of cycles to run
0      optimization type (0 = min, 1 = max)
0      function type (G1_min)
1      inertia update method (1 = linearly decreasing)
1      initialization (1 = asymmetric)
0.0    left initialization
50.0   right initialization
10     max velocity
100    max position
100    max generations per cycle
30     population size
13     dimensions
0.9    initial inertia weight
1      boundary flag (1 = enabled)
0 1.0  lower and upper boundaries for parameters (13 pairs for G1)
0 1.0
0 1.0
0 1.0
0 1.0
0 1.0
0 1.0
0 1.0
0 1.0
0.0 100.0
0.0 100.0
0.0 100.0
0.0 1.0

Page 47: Chapter 4: Evolutionary Computation Implementations.

Sample PSOS Run File, Part 2

Values for the second swarm, as in Part 1:

1        optimization type (1 = max)
1        function type (G1_max)
1        inertia update method
1        initialization
0.0      left initialization
1.0      right initialization
0.5      max velocity
1        max position
70       max generations per cycle
20       population size
9        dimensions
0.9      initial inertia weight
1        boundary flag
0.0 1.0  lower and upper boundaries for the λ's (9 pairs)
0.0 1.0
0.0 1.0
0.0 1.0
0.0 1.0
0.0 1.0
0.0 1.0
0.0 1.0
0.0 1.0

Page 48: Chapter 4: Evolutionary Computation Implementations.

Single PSO Run File (annotated)

1      num of PSOs
0      pso_update_pbest_each_cycle_flag (only for multiple swarms)
40     total cycles of running PSOs
0      optimization type: 0 = min, 1 = max
6      evaluation function (F6)
1      inertia weight update method: 1 = linearly decreasing
1      initialization type: 0 = sym, 1 = asym
-10.0  left initialization range
50.0   right initialization range
40     maximum velocity
100    maximum position
50     max number of generations per cycle
30     population size
2      dimension
0.9    initial inertia weight
0      boundary flag: 0 = disabled, 1 = enabled

(boundaries follow here if the boundary flag is 1)

Evaluation functions: 0: G1_MIN; 1: G1_MAX; 2: G7_MIN; 3: G7_MAX; 4: G9_MIN; 5: G9_MAX; 6: F6; 7: SPHERE; 8: ROSENBROCK; 9: RASTRIGRIN; 10: GRIEWANK