Transcript of Evolution Programs (insert catchy subtitle here).

Page 1: Evolution Programs (insert catchy subtitle here).

Evolution Programs

(insert catchy subtitle here)

Page 2: Evolution Programs (insert catchy subtitle here).

Evolution Programs

• Basic idea: use principles of evolution and heredity to “evolve” solutions to a problem

Page 3: Evolution Programs (insert catchy subtitle here).

Rabbits

– At any given time, there is a population of rabbits. Some are smarter/faster than the others.

– Foxes eat most of the slower, dumber rabbits (but not all).

– More of the fast, smart rabbits reproduce.

– “Wild hare” mutations possible (think Monty Python here).

– The next generation will be, on average, faster and smarter than the original population.

Page 4: Evolution Programs (insert catchy subtitle here).

How do we apply this?

• This is NOT DNA-based computing as practiced in this department.

• Ideas?

– Battlebots? (but how do robots breed?)

– Blade Runner?

Page 5: Evolution Programs (insert catchy subtitle here).

Outline of evolution program

• Maintain a population of individuals P(t)

– Each individual represents a potential solution to the problem at hand

• Evaluate each solution for fitness.

• Form a new population P(t+1)

– Select the most fit individuals

– Some members undergo transformations by means of “genetic operators” to form new solutions

Page 6: Evolution Programs (insert catchy subtitle here).

Basic evolution program

t <- 0

Initialize P(t)

Evaluate P(t)

while not (termination condition) do

t <- t+1

select P(t) from P(t-1)

recombine P(t)

evaluate P(t)

end
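Rendered as a Python sketch, the same loop might look like the following. The callables (initialize, evaluate, select, recombine, terminated) are illustrative parameters standing in for the problem-specific pieces; they are not names used on the slides.

def evolution_program(initialize, evaluate, select, recombine, terminated):
    # Generic loop from the slide: the callables supply the problem-specific
    # pieces (representation, fitness function, selection, genetic operators).
    t = 0
    population = initialize()                           # Initialize P(t)
    fitness = [evaluate(ind) for ind in population]     # Evaluate P(t)
    while not terminated(t, population, fitness):
        t += 1
        population = select(population, fitness)        # select P(t) from P(t-1)
        population = recombine(population)              # recombine P(t)
        fitness = [evaluate(ind) for ind in population] # evaluate P(t)
    return population, fitness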

Page 7: Evolution Programs (insert catchy subtitle here).

Genetic Algorithms

• Subset of evolution programs

• Solutions represented as bit strings

• Each solution is a chromosome

• Each bit is a gene

Page 8: Evolution Programs (insert catchy subtitle here).

Genetic Algorithms

• Components:

– Genetic representation for potential solutions to the problem

– Method for creating initial population

– Evaluation function to rate relative “fitness”

– Genetic operators that alter composition of solutions during reproduction

– Values for parameters that the algorithm uses (population size, probability of applying operators)

Page 9: Evolution Programs (insert catchy subtitle here).

Genetic Operators

• Mutation

– Arbitrary alteration of one or more genes of a selected chromosome by random change, with probability equal to the mutation rate

– Intuition: introduce some extra variability into the population
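A minimal sketch of mutation for a bit-string chromosome, assuming genes are stored as 0/1 integers (the names mutate and mutation_rate are illustrative, not from the slides):

import random

def mutate(chromosome, mutation_rate):
    # Flip each gene (bit) independently with probability equal to the
    # mutation rate.
    return [1 - gene if random.random() < mutation_rate else gene
            for gene in chromosome]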

Page 10: Evolution Programs (insert catchy subtitle here).

Genetic Operators

• Crossover

– Combine the features of two parent chromosomes to form two similar offspring by swapping corresponding segments of the parents

– Intuition: information exchange between different potential solutions
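A sketch of one-point crossover on bit-string chromosomes (the variant used in the worked example a few slides below); the function name and the random cut point are illustrative:

import random

def crossover(parent1, parent2):
    # One-point crossover: pick a random cut point and swap the tails.
    point = random.randint(1, len(parent1) - 1)
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])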

Page 11: Evolution Programs (insert catchy subtitle here).

Example: Function optimization

• Problem: Maximize the function

– f(x) = x · sin(10πx) + 1.0

over the domain [-1..2].

• Desired accuracy is six decimal places

Page 12: Evolution Programs (insert catchy subtitle here).

Example: Function optimization

• Representation

– Use a binary vector as a chromosome to represent real values of x.

– The domain has length 3 and six decimal places of precision are required, so we need 3 × 10^6 distinct values.

– Since 2^21 < 3 × 10^6 ≤ 2^22, a 22-bit vector suffices.
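A sketch of one common way to decode such a 22-bit chromosome into a real x in [-1, 2] (a simple linear mapping; the decode name is illustrative):

def decode(bits):
    # Interpret the 22 bits as an integer in [0, 2**22 - 1] and map it
    # linearly onto the domain [-1, 2].
    value = int("".join(str(b) for b in bits), 2)
    return -1.0 + value * 3.0 / (2**22 - 1)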

Page 13: Evolution Programs (insert catchy subtitle here).

Example: Function optimization

• Initial population

– For a population of size n, create n 22-bit chromosomes with randomly initialized bit values.

• Evaluation function

– eval(v) = f(x)
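Continuing the sketch, random initialization and the evaluation function could look like this, using the illustrative decode helper above:

import math
import random

def initial_population(n):
    # n chromosomes of 22 random bits each.
    return [[random.randint(0, 1) for _ in range(22)] for _ in range(n)]

def evaluate(chromosome):
    # eval(v) = f(x), with f(x) = x * sin(10*pi*x) + 1.0
    x = decode(chromosome)
    return x * math.sin(10 * math.pi * x) + 1.0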

Page 14: Evolution Programs (insert catchy subtitle here).

Example: Function optimization

• Genetic operators

– Use classical operators: mutation and crossover.

– Mutation

• flip a bit with probability equal to the mutation rate

– Crossover

• randomly select a crossover point

• A crossover after the 5th bit of 00101|100 and 11010|011 yields two children: 00101|011 and 11010|100

Page 15: Evolution Programs (insert catchy subtitle here).

Example: Function optimization

• Parameters

– population size = 50

– probability of crossover = 0.25

– probability of mutation = 0.01
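Putting the illustrative pieces above together with these parameter values, a complete run might look like the following sketch. Roulette-wheel selection and the pairing scheme for crossover are assumptions made here for concreteness; the slides do not specify them.

import random

def roulette_select(population, fitness):
    # Fitness-proportional (roulette-wheel) selection -- an assumption, since
    # the slides do not name a selection scheme. Fitness values are shifted
    # so that all weights are positive.
    low = min(fitness)
    weights = [f - low + 1e-6 for f in fitness]
    return random.choices(population, weights=weights, k=len(population))

def run(generations=150, pop_size=50, p_crossover=0.25, p_mutation=0.01):
    population = initial_population(pop_size)
    for _ in range(generations):
        fitness = [evaluate(ind) for ind in population]
        population = roulette_select(population, fitness)
        # Pair up neighbours and apply crossover with probability 0.25.
        for i in range(0, pop_size - 1, 2):
            if random.random() < p_crossover:
                population[i], population[i + 1] = crossover(
                    population[i], population[i + 1])
        # Mutate every individual with per-bit probability 0.01.
        population = [mutate(ind, p_mutation) for ind in population]
    return max(population, key=evaluate)

run() returns the best chromosome found; decode it to recover the corresponding x.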

Page 16: Evolution Programs (insert catchy subtitle here).

Function optimization: Results

Generation Number    Evaluation Function

1 1.441942

6 2.250003

8 2.250283

9 2.250284

10 2.250363

12 2.329077

39 2.344251

40 2.345087

51 2.738930

99 2.849246

137 2.850217

145 2.850227

Page 17: Evolution Programs (insert catchy subtitle here).

Example: Prisoner’s dilemma

• Problem:

– Two prisoners are held in separate cells, unable to communicate with each other.

– Each can choose either to defect and betray the other prisoner, or cooperate with the other prisoner by maintaining silence.

– Rewards

• If both cooperate, moderate rewards for both

• If only one defects, the defector is rewarded and the cooperator is punished

• If both defect, both are tortured

Page 18: Evolution Programs (insert catchy subtitle here).

Reward structure for PD

Player 1     Player 2     R1   R2   Comment

Defect       Defect       1    1    Punishment for mutual defection

Defect       Cooperate    5    0    Temptation to defect, and sucker’s payoff

Cooperate    Defect       0    5    Sucker’s payoff, and temptation to defect

Cooperate    Cooperate    3    3    Reward for mutual cooperation

Page 19: Evolution Programs (insert catchy subtitle here).

Representation for PD

• Population of “players”, each of whom has a particular strategy

• Initial strategy chosen at random

• Players play games against each other, scores are recorded

Page 20: Evolution Programs (insert catchy subtitle here).

Representing the strategy for PD

• Deterministic strategy

• Use outcomes from the 3 previous moves to determine the next move

• 4 outcomes per move, 3 moves -> 64 possible histories

• 64-bit string to represent the choice for each possible history

• Plus 6 bits to encode the three moves preceding the start of the game
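One way such a 70-bit chromosome could drive play, sketched under illustrative assumptions about the bit layout and outcome encoding (the slides do not fix these details):

def next_move(chromosome, last_three):
    # chromosome: 70 bits; bits 0..63 give the player's move for each of the
    # 64 possible 3-round histories, and bits 64..69 encode the assumed
    # history for the rounds before the game starts.
    # last_three: outcomes of the 3 most recent rounds, each an int in 0..3
    # (one of CC, CD, DC, DD).
    index = last_three[0] * 16 + last_three[1] * 4 + last_three[2]
    return 'C' if chromosome[index] == 0 else 'D'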

Page 21: Evolution Programs (insert catchy subtitle here).

Outline of algorithm

• Randomly initialize population

• Test each player by playing games. Score is average over all games played.

• Select players to breed.

• Mate players to produce offspring.

Page 22: Evolution Programs (insert catchy subtitle here).

Results for PD

• Random start -- produced populations whose median member is as successful as the best known heuristic

• Patterns evolved:

– Don’t rock the boat: (CC)(CC)(CC) -> C

– Be provokable: (CC)(CC)(CD) -> D

– Accept apology: (CD)(DC)(CC) -> C

– Forget: (DC)(CC)(CC) -> C

– Accept a rut: (DD)(DD)(DD) -> D

Page 23: Evolution Programs (insert catchy subtitle here).

A shot at TSP

• Traveling Salesman Problem

– The salesperson must visit every city on the route exactly once and return to the start. Minimize the cost of travel over the entire tour.

– Problem: how do we represent the problem? As a bit string? As an integer vector?

• If a bit string, genetic operators may produce illegal children

• Choose an integer vector

Page 24: Evolution Programs (insert catchy subtitle here).

Traveling Salesperson Problem

• Representation:

– Integer vector

– Example: (3 6 4 8 2 1 5 7)

• Initialization:

– random

– output from greedy TSP

• Evaluation:

– calculate cost of tour
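A sketch of the tour-cost evaluation, assuming an illustrative distance lookup dist indexed by city number:

def tour_cost(tour, dist):
    # Sum the edge costs along the tour, including the edge that returns
    # from the last city to the first. dist[a][b] is assumed to give the
    # cost of travelling from city a to city b.
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))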

Page 25: Evolution Programs (insert catchy subtitle here).

Traveling Salesperson Problem

• Genetic operator

– Must preserve legality

– Want to exploit similarities

– Choose a subsequence of one parent and preserve the relative order of cities from the other parent

• Parents

– (1 2 3 4 5 6 7 8 9 10 11 12)

– (7 3 1 11 4 12 5 2 10 9 6 8)

• Subsequence: (4 5 6 7)

• Offspring: (1 11 12 4 5 6 7 2 10 9 8 3)
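The operator described here is order crossover (OX). A sketch of how one offspring can be built, assuming the open positions are filled starting just after the copied segment and wrapping around (this assumption reproduces the offspring shown above):

def order_crossover(parent1, parent2, start, end):
    # One offspring: keep parent1[start:end] in place, then fill the open
    # positions -- starting just after the segment and wrapping around --
    # with the remaining cities in the order they appear in parent2.
    n = len(parent1)
    child = [None] * n
    child[start:end] = parent1[start:end]
    kept = set(parent1[start:end])
    fill = [parent2[(end + i) % n] for i in range(n)
            if parent2[(end + i) % n] not in kept]
    open_positions = [(end + i) % n for i in range(n)
                      if child[(end + i) % n] is None]
    for pos, city in zip(open_positions, fill):
        child[pos] = city
    return child

# Reproduces the offspring on this slide:
# order_crossover([1,2,3,4,5,6,7,8,9,10,11,12],
#                 [7,3,1,11,4,12,5,2,10,9,6,8], start=3, end=7)
# -> [1, 11, 12, 4, 5, 6, 7, 2, 10, 9, 8, 3]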

Page 26: Evolution Programs (insert catchy subtitle here).

TSP: Results

• Results averaged over 20 random runs

• Applied to 100 randomly generated cities

• After 20,000 generations, the value of the whole tour is 9.4% above optimal

Page 27: Evolution Programs (insert catchy subtitle here).

Characterization

• Genetic algorithms/evolution programs can be viewed as a form of search

• Like hill-climbing

– But hill-climbing search gets stuck in local maxima

– Results depend on the starting point

• GAs are probabilistic algorithms

– But not random algorithms -- directed search

• Parallel search

Page 28: Evolution Programs (insert catchy subtitle here).

One wag’s comment

• “Neural networks are the second best way of doing just about anything. . .”

• “. . . and genetic algorithms are the third.”

• Genetic algorithms/evolution programs can be used on a wide variety of problems

Page 29: Evolution Programs (insert catchy subtitle here).

Cool looking stuff

• (see web pages)