2008 - Multi-Objective Optimization Inspired by Nature (Slides)


Projektbüro KIT | 14. April 08

    Multi-objective Optimization Inspired by Nature

    Jürgen Branke, Institute AIFB

    University of Karlsruhe, Germany / Karlsruhe Institute of Technology

    Menu

    Appetizer

    Evolutionary algorithms

    Main course

    Using evolutionary algorithms instead of exact optimizers for MOPs

Multi-objective evolutionary algorithms

    Including preference information in MOEAs

    Dessert

    Current research

    Summary


    Evolutionary algorithms

Darwin's principle of natural evolution: survival of the fittest

    In populations of individuals (plants, animals), the better an individual is adapted to the environment, the higher its chance of survival and reproduction.

    (Figure: evolving population in its environment)

    Transfer to optimization

Natural evolution → Evolutionary algorithms

    individual → potential solution

    environment → problem

    fitness / how well adapted → cost / quality of solution

    survival of the fittest → good solutions are kept

    mutation → small, random perturbations

    crossover → recombination of partial solutions

    population of individuals in an environment → population of solutions to a problem

    (Figure: a population of permutation-encoded solutions, e.g. 1-3-2-4-5, 1-3-5-4-2, 1-5-3-4-2, 1-3-2-5-4, 3-1-5-4-2)


Basic algorithm

    INITIALIZE population (set of solutions)
    REPEAT
        EVALUATE individuals according to goal ("fitness")
        SELECT parents
        RECOMBINE parents (CROSSOVER)
        MUTATE offspring
        EVALUATE offspring
        FORM next population
    UNTIL termination-condition

    (Figure: population of solutions annotated with fitness values)
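As a rough illustration (not part of the original slides), a minimal evolutionary loop following the steps above might look like this in Python; the bit-string encoding, the OneMax fitness function, and all parameter values are illustrative assumptions.

```python
import random

def evolve(fitness, n_bits=20, pop_size=30, generations=100,
           crossover_prob=0.9, mutation_prob=0.01):
    """Minimal generational EA sketch: size-2 tournament selection,
    one-point crossover, bit-flip mutation (all settings illustrative)."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):                      # UNTIL termination-condition
        scores = [fitness(ind) for ind in pop]        # EVALUATE individuals
        def select():                                 # SELECT parents (tournament)
            a, b = random.randrange(pop_size), random.randrange(pop_size)
            return pop[a] if scores[a] >= scores[b] else pop[b]
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select(), select()
            if random.random() < crossover_prob:      # RECOMBINE parents (crossover)
                cut = random.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            child = [1 - g if random.random() < mutation_prob else g
                     for g in child]                  # MUTATE offspring
            offspring.append(child)
        pop = offspring                               # FORM next population
    return max(pop, key=fitness)

# Example: maximize the number of ones in the bit string ("OneMax")
best = evolve(fitness=sum)
```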


    Advantages/Disadvantages

    +No restriction w.r.t. fitness function (e.g. does not have to be differentiable)

    +Universal applicability

    +Easy to integrate heuristic knowledge if available

    +Easy to parallelize

+Easy to use (usually inexpensive to develop)

    +Anytime algorithms (available time is fully utilized)

    +Can deal with multiple objectives

    +User-interaction possible

    +Allow for continuous adaptation

    +Can work with stochastic fitness functions

    - Computationally expensive

- No guaranteed solution quality

    - Parameter tuning necessary


    Industrial applications

    Warehouse location problem (Locom)

    Process scheduling (Unilever)

Job shop scheduling (Deere & Company, SAP, Volvo)

Turbine design (Rolls-Royce, Honda)

    Portfolio optimization (First Quadrant)

    Cleaning team assignment (Die Bahn)

    Chip design (Texas Instruments)

Robot movement (Honda)

    Nuclear fuel reloading (Siemens)

    Design of telephone networks (US West)

    Games (creatures)

Military pilot training (British Air Force)

    Vehicle routing (Pina Petroli)

Coating of fluorescent lamps (Philips)

    ....


    Major design decisions

    Representation

    Genetic operators

    Selection mechanism

Crossover/Mutation probability

    Population size

    Stopping criterion


    Simple example: Travelling Salesman Problem

    Permutation encoding: 3-1-4-5-7-2-6-8-9

    Mutation: Exchange two cities

    Order crossover (OX)

    select partial sequence from one parent, fill up in order of other parent

Parent 1: 1 2 3 4 5 6 7 8 9
    Parent 2: 9 4 2 8 3 6 1 5 7

    Copy segment 4 5 6 7 from parent 1, fill the remaining positions in the order the cities appear in parent 2 (9 2 8 3 1):

    Offspring: 9 2 8 4 5 6 7 3 1
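For illustration (not from the slides), a minimal sketch of the two operators named above, assuming the OX variant described here (keep a segment from one parent, fill the rest in the order of the other parent); function names and cut points are my own.

```python
import random

def exchange_mutation(tour):
    """Mutation: exchange two randomly chosen cities."""
    t = tour[:]
    i, j = random.sample(range(len(t)), 2)
    t[i], t[j] = t[j], t[i]
    return t

def order_crossover(p1, p2, start, end):
    """OX as on the slide: keep p1[start:end] in place, fill the remaining
    positions in the order the cities appear in p2."""
    child = [None] * len(p1)
    child[start:end] = p1[start:end]
    rest = [c for c in p2 if c not in child[start:end]]
    free = [i for i in range(len(p1)) if child[i] is None]
    for i, c in zip(free, rest):
        child[i] = c
    return child

# The slide's example: segment 4 5 6 7 from parent 1, filled from parent 2
p1 = [1, 2, 3, 4, 5, 6, 7, 8, 9]
p2 = [9, 4, 2, 8, 3, 6, 1, 5, 7]
print(order_crossover(p1, p2, 3, 7))  # -> [9, 2, 8, 4, 5, 6, 7, 3, 1]
```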


    Menu

    Appetizer

    Evolutionary algorithms

    Main course

    Using evolutionary algorithms instead of exact optimizers for MOPs

Multi-objective evolutionary algorithms

    Including preference information in MOEAs

    Dessert

    Current research

    Summary


Multiple objectives

    It is not always clear which solution is better.

    Let fi, i = 1, ..., d, be the different optimization criteria (to be minimized). Then a solution x is said to dominate a solution y (x ≻ y) if and only if the following condition is fulfilled:

        fi(x) ≤ fi(y) for all i ∈ {1, ..., d}, and fj(x) < fj(y) for at least one j.

    Among a set of solutions P, the non-dominated set of solutions P' are those that are not dominated by any member of the set P.

    A solution which is not dominated by any other solution in the search space is called Pareto-optimal.

    User preferences are required.

    (Figure: solutions A, B, C in objective space (f1, f2))
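As a small sketch (not from the slides), the dominance definition above translates directly into code; this assumes all objectives are to be minimized and uses illustrative function names.

```python
def dominates(x, y):
    """x dominates y (minimization): x is no worse in every objective
    and strictly better in at least one."""
    return all(xi <= yi for xi, yi in zip(x, y)) and any(xi < yi for xi, yi in zip(x, y))

def non_dominated(P):
    """Return the non-dominated subset of a set of objective vectors P."""
    return [x for x in P if not any(dominates(y, x) for y in P if y is not x)]

# Example with two objectives, both to be minimized
P = [(1.0, 5.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(non_dominated(P))  # (3.0, 3.0) is dominated by (2.0, 2.0)
```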


    A priori approach

Multi-Objective Problem → specify preferences → Single-Objective Problem → optimize (e.g. with an evolutionary algorithm) → Solution

    If the solution is not satisfactory, adjust the preference information and optimize again.


    Specifying preferences

    Difficult!

    Example: Tell me which travel plan you prefer!

(Figure: two alternative travel plans plotted as days in London vs. days in Paris)


    Specifying preferences

(Figures in objective space (f1, f2): linear weighting with weights w1, w2; constraints; reference point and achievement scalarizing function)
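For illustration (not part of the slides), two of the preference formulations above as a minimal Python sketch: a linear weighted sum, and one common variant of a reference-point based achievement scalarizing function. The exact ASF form, the augmentation term rho, and all names are assumptions.

```python
def weighted_sum(f, weights):
    """Linear weighting: scalarize objective vector f (minimization)."""
    return sum(w * fi for w, fi in zip(weights, f))

def achievement_scalarizing(f, reference, weights, rho=1e-6):
    """One common achievement scalarizing function:
    max_i w_i * (f_i - z_i) plus a small augmentation term."""
    terms = [w * (fi - zi) for w, fi, zi in zip(weights, f, reference)]
    return max(terms) + rho * sum(terms)

# Example: two objectives, reference (aspiration) point (1.0, 2.0)
f = (3.0, 2.5)
print(weighted_sum(f, (0.7, 0.3)))
print(achievement_scalarizing(f, reference=(1.0, 2.0), weights=(1.0, 1.0)))
```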


    Advantages / Disadvantages of EAs

+ Allows solving problems where no exact methods exist

    - Metaheuristics do not guarantee (Pareto-) optimality

    - Solutions generated in subsequent iterations may dominate each other

    - Adjusting preference information may lead to unexpected results

    - Computationally expensive


    A posteriori - The power of populations

a priori: Multi-Objective Problem → specify preferences → Single-Objective Problem → optimize → Solution

    a posteriori (most EMO approaches): Multi-Objective Problem → optimize with a multi-objective evolutionary algorithm → Pareto Front → specify preferences → Selected Solution


    Multiobjective Evolutionary Algorithms (MOEAs)

Since EAs work with a population of solutions, they can search for all (or a representative subset of) Pareto-optimal solutions in one run.

    A single EMO run is usually much more effective than multiple single-objective runs with different objectives.


    Demo


    Basic structure

INITIALIZE population (set of solutions)
    REPEAT
        EVALUATE individuals according to goal ("fitness")
        SELECT parents
        RECOMBINE parents (CROSSOVER)
        MUTATE offspring
        EVALUATE offspring
        FORM next population
    UNTIL termination-condition

    (Figure: population of solutions annotated with fitness values)


    What is a good approximation?

(Figures in objective space (f1, f2): a good approximation set is close to the Pareto front, widely spread, and equally distributed)


    Non-dominated Sorting GA (NSGA-II) [Deb et al. 2002]

    Based on two ideas:

    1. Pareto ranking: based on Pareto-dominance

    2. Crowding distance: mechanism to maintain diversity in the population

    Other popular approach:

    Strength Pareto Evolutionary Algorithm (SPEA) by Zitzler


    Pareto ranking + crowding distance

(Figure: solutions in objective space (Objective 1, Objective 2) labelled with their non-domination ranks)


NSGA-II: Overall algorithm

    The old population and its offspring are merged and split by non-dominated sorting into Front 1, Front 2, Front 3, ..., Front k. The new population is filled front by front; the first front that does not fit completely is truncated by diversity sorting (crowding distance), and all remaining solutions are rejected.
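A minimal sketch of this survival step (not the authors' code): naive non-dominated sorting plus a crowding-distance truncation of the last front. NSGA-II's actual fast non-dominated sorting uses extra bookkeeping; all names and the minimization assumption are mine.

```python
def dominates(x, y):
    # Pareto dominance for minimization, as in the earlier dominance sketch
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

def nondominated_fronts(pop):
    """Split objective vectors into fronts: Front 1 = non-dominated solutions,
    Front 2 = non-dominated once Front 1 is removed, and so on (naive version)."""
    remaining, fronts = list(pop), []
    while remaining:
        front = [x for x in remaining
                 if not any(dominates(y, x) for y in remaining if y is not x)]
        fronts.append(front)
        remaining = [x for x in remaining if x not in front]
    return fronts

def crowding_distance(front):
    """Crowding distance of each solution in one front (larger = less crowded)."""
    dist = {id(x): 0.0 for x in front}
    for m in range(len(front[0])):                     # per objective
        ordered = sorted(front, key=lambda x: x[m])
        dist[id(ordered[0])] = dist[id(ordered[-1])] = float('inf')
        span = (ordered[-1][m] - ordered[0][m]) or 1.0
        for i in range(1, len(ordered) - 1):
            dist[id(ordered[i])] += (ordered[i + 1][m] - ordered[i - 1][m]) / span
    return dist

def nsga2_survival(old_pop, offspring, pop_size):
    """Form the next population: fill front by front, truncate the front that
    does not fit by crowding distance, reject the rest."""
    new_pop = []
    for front in nondominated_fronts(old_pop + offspring):
        if len(new_pop) + len(front) <= pop_size:
            new_pop.extend(front)                      # whole front fits
        else:
            d = crowding_distance(front)               # diversity sorting
            front.sort(key=lambda x: d[id(x)], reverse=True)
            new_pop.extend(front[:pop_size - len(new_pop)])
            break
    return new_pop
```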


    Advantages of finding the complete front

    +Not necessary to specify preferences a priori

    +Allows DM to choose solution after having seen the alternatives

    + Interactive search of Pareto front

    Optimization prior to interaction, thus interaction very fast

    Only non-dominated solutions are presented to the user

    Direct navigation by user is possible

Additional information on the distribution of solutions along the front may be provided to the user (nadir point, ideal point, ...)


    Advantages of finding the complete front

    +Not necessary to specify preferences a priori

    +Allows DM to choose solution after having seen the alternatives

    + Interactive search of Pareto front

    +Offer different alternatives to different customers

    (e.g., mean-variance portfolio optimization)

    +Reveal common properties among Pareto-optimal solutions

    (some variables are always the same)

    +Understand the causes for the trade-off


    Understanding trade-offs


    Advantages of finding the complete front

    +Allows DM to choose solution after having seen the alternatives

    + Interactive search of Pareto front

+Offer different alternatives to different customers (e.g., mean-variance portfolio optimization)

    +Reveal common properties among Pareto-optimal solutions

    (some variables are always the same)

    +Understand the causes for the trade-off

    +Aid in other optimization tasks (constraints, multi-objectivization)


    Do we really need the whole front?

    Computational overhead

    Large set of alternatives, difficult to search by DM

    Identify most interesting regions

    Take into account partial user preferences

    Bias the distribution

    Restrict the distribution


    Identifying knees

Solutions where an improvement in either objective leads to a significant worsening of the other objective are more likely to be preferred -> knee

    (Figure: Pareto front with a marked knee)


    Marginal Utilities [Branke et al. 2004]

Assume the user has a linear utility function

    Evaluate each solution with the expected loss of utility if the solution were not there

    (Figure: marginal utility along the front)


If crowding distance is replaced by marginal utility, the algorithm focuses on knees

    No preference information necessary
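The slides do not give the exact formula, so the following is only one way to make the idea concrete (my assumption, not [Branke et al. 2004]'s implementation): approximate each solution's marginal utility by sampling random linear utility functions and measuring how much the best attainable utility deteriorates if that solution were removed; solutions near knees tend to receive the largest values.

```python
import random

def marginal_utilities(front, n_samples=1000):
    """Approximate marginal utility of each solution on a 2-objective front
    (minimization): expected loss if the solution were not there, under
    randomly sampled linear utility functions (illustrative scheme)."""
    loss = [0.0] * len(front)
    for _ in range(n_samples):
        w = random.random()
        weights = (w, 1.0 - w)                       # random linear utility
        costs = [weights[0] * f1 + weights[1] * f2 for f1, f2 in front]
        best = min(range(len(front)), key=costs.__getitem__)
        second = min(costs[i] for i in range(len(front)) if i != best)
        loss[best] += (second - costs[best]) / n_samples
    return loss
```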


    Menu

    Appetizer

    Evolutionary algorithms

    Main course

    Using evolutionary algorithms instead of exact optimizers for MOPs

Multi-objective evolutionary algorithms

    Including preference information in MOEAs

    Dessert

    Current research

    Summary


    Motivation

Although a user generally cannot specify his/her preferences exactly before alternatives are known, he/she usually has some rough idea of what solutions are desired:

    A solution should have at least x in objective f1.

    f1 of x would be good, f1 of y would be great.

    My target solution would look something like this.

    If a solution is worse by one unit in objective f1, it should be at least x units better in objective f2 to be interesting.

    Objective f1 is somewhat more important than objective f2.

    Hope: Find a larger variety of more interesting solutions more quickly.


    Considering partial preferences

a priori: Multi-Objective Problem → specify preferences → Single-Objective Problem → optimize → Selected Solution

    a posteriori (most EMO approaches): Multi-Objective Problem → optimize → Pareto Front → specify preferences → Selected Solution

    in between: Multi-Objective Problem → partial preferences + EMO → Biased Pareto Front → exact preferences → Selected Solution


EMO doesn't need user preferences. Does it?

    All EMO approaches attempt to find a representative set of the Pareto-optimal front.

    Usually, representative means well distributed.

    But: the distribution depends on the scaling of the objectives.

    (Figure: the same front with one objective rescaled, e.g. from [0, 1] to [0, 10])


    ... at least x in objective f1.

    Constraint: f1>x

    Constraints are easy to integrate into EMO

Lexicographic ordering (feasible solution always dominates infeasible solution; see the sketch after this list)

    Penalty

    Additional objective

    ...
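As a minimal sketch of the lexicographic-ordering option above (not from the slides): feasible solutions always beat infeasible ones, less violated beats more violated, and two feasible solutions are compared by ordinary Pareto dominance. The `violation` helper is hypothetical and would map a solution to its total constraint violation, e.g. max(0, x_target - f1) for a constraint f1 > x_target.

```python
def dominates(x, y):
    # ordinary Pareto dominance, minimization
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

def constrained_dominates(x, y, violation):
    """Feasibility first (violation 0 = feasible), then Pareto dominance."""
    vx, vy = violation(x), violation(y)
    if vx == 0 and vy > 0:
        return True                 # feasible always beats infeasible
    if vx > 0 and vy > 0:
        return vx < vy              # less violation is better
    if vx > 0 and vy == 0:
        return False
    return dominates(x, y)          # both feasible: ordinary dominance
```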


    ... target solution ...

    1. Minimize distance to ideal solution (single objective)

    2. Minimize maximal distance in any objective (single objective)

    3. Goal Attainment [Fonseca & Fleming, 1993]/Goal Programming [Deb, 1999]

Do not reward improvement over the ideal solution: f1 -> max{0, f1 - f1*}

    (Figure: target points T1 and T2 in objective space)
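A small sketch (not from the slides) of the transformation just given, applied to a whole objective vector under minimization; the goal vector is an assumed user input.

```python
def goal_transformed(f, goal):
    """Transform objectives (minimization) so that improvement beyond the
    goal is not rewarded: f_i -> max(0, f_i - f_i*)."""
    return tuple(max(0.0, fi - gi) for fi, gi in zip(f, goal))

# Example: goal (target) of 2.0 in both objectives
print(goal_transformed((1.5, 3.0), goal=(2.0, 2.0)))  # -> (0.0, 1.0)
```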


    ... at least x units better in objective f2 ...

    Maximal and minimal trade-offs

    Guided MOEA [Branke et al. 2001]

    Modify definition of dominance

(Figure: solutions A, B, C and the modified dominance regions)

    Can be achieved by a simple transformation of the objectives (not so easy for more than 2 objectives):

    Ω1(x) = f1(x) + w12 · f2(x)
    Ω2(x) = f2(x) + w21 · f1(x)

    x ≻ y  ⇔  Ωi(x) ≤ Ωi(y) for all i ∈ {1, 2}, and Ωj(x) < Ωj(y) for at least one j
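A small sketch (not from the slides) of this guided dominance test for two objectives under minimization: ordinary Pareto dominance applied to the transformed objectives. The trade-off parameters w12, w21 would be supplied by the user; the example values are illustrative.

```python
def guided_objectives(f1, f2, w12, w21):
    """Transformation from the slide: Omega1 = f1 + w12*f2, Omega2 = f2 + w21*f1."""
    return (f1 + w12 * f2, f2 + w21 * f1)

def guided_dominates(x, y, w12, w21):
    """x dominates y under the guided dominance criterion (minimization)."""
    ox = guided_objectives(*x, w12, w21)
    oy = guided_objectives(*y, w12, w21)
    return all(a <= b for a, b in zip(ox, oy)) and any(a < b for a, b in zip(ox, oy))

# Example: with trade-off weights w12 = w21 = 0.5, (1.0, 4.0) now dominates (2.0, 3.5)
print(guided_dominates((1.0, 4.0), (2.0, 3.5), 0.5, 0.5))  # True
```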


Guided dominance criterion

    (Figure: regions in (f1, f2) space dominated by solutions A and B under the guided dominance criterion)


The effect of guidance

    Faster convergence and better coverage of the interesting area of the Pareto-optimal front.

    (Figure: standard MOEA vs. guided MOEA)


    Marginal utility with preferences

    With non-uniform distribution of utility functions


    Biased sharing [Branke & Deb 2004]

    User specifies weights and spread parameter

    Crowding distance calculation is modified


Light Beam Search [Deb & Kumar 2007]

    Specify aspiration and reservation point

    Determine projection on Pareto front

    Identify interesting local area using outranking approaches


    Interactive MOEA

    Narrow down / refocus search during MOEA run

    Explicitly by

    adjusting constraints

    moving the target

    modifying the max/min trade-offs

    ...

    Implicitly by comparing solutions

    learn user preferences


    Menu

    Appetizer

    Evolutionary algorithms

    Main course

Using evolutionary algorithms instead of exact optimizers for MOPs

    Multi-objective evolutionary algorithms

    Including preference information in MOEAs

    Dessert

    Current research

    Summary


    Current research

    Many-objective problems (difficulty: almost all solutions non-dominated)

    Multiobjectivization (influence diversity and search space structure)

    Noisy objective functions (e.g., stochastic simulation)

    Worst-case multi-objective optimization

    Using metamodels in case of expensive fitness evaluation

    Individual = Set of solutions


    Summary

    Evolutionary algorithms open new possibilities in multi-objective

    optimization because

    they are very general problem solvers

they work on a population of solutions and can thus search for a whole set of solutions simultaneously

    Different ways to use EAs in MOO:

1. As a single-objective optimizer in classical MOO techniques

    2. To generate an approximation to the whole Pareto front

    3. With partial user preferences, resulting in a partial front or a biased distribution

    4. Interactively guided by the user


    EMO resources

    Books:

K. Deb: Multi-objective optimization using evolutionary algorithms. Wiley, 2001

    C. Coello Coello, D. A. Van Veldhuizen and G. B. Lamont: Evolutionary algorithms for solving multi-objective problems. Kluwer, 2002

    J. Branke, K. Deb, K. Miettinen, R. Slowinski: Multi-objective optimization - interactive and evolutionary approaches. Springer, to appear

    Websites:

    http://www.lania.mx/~coello


    References

[Branke et al. 2001] J. Branke, T. Kaußler, H. Schmeck: Guidance in evolutionary multi-objective optimization. Advances in Engineering Software, 32:499-507

    [Branke et al. 2004] J. Branke, K. Deb, H. Dierolf, M. Osswald: Finding knees in multi-objective optimization. Parallel Problem Solving from Nature, Springer, pp. 722-731

    [Branke and Deb 2004] J. Branke and K. Deb: Integrating user preferences into evolutionary multi-objective optimization. In Y. Jin, editor, Knowledge Incorporation in Evolutionary Computation, Springer, pp. 461-477

    [Deb 1999] K. Deb: Solving goal programming problems using multi-objective genetic algorithms. Congress on Evolutionary Computation, IEEE, pp. 77-84

    [Deb et al. 2002] K. Deb, S. Agrawal, A. Pratap, T. Meyarivan: A fast and elitist multi-objective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6(2):182-197

    [Deb and Kumar 2007] K. Deb and A. Kumar: Light beam search based multi-objective optimization using evolutionary algorithms. Congress on Evolutionary Computation, IEEE, pp. 2125-2132

    [Fonseca and Fleming 1993] C. M. Fonseca and P. J. Fleming: Genetic algorithms for multiobjective optimization: Formulation, discussion, and generalization. International Conference on Genetic Algorithms, Morgan Kaufmann, pp. 416-423