Harmony Search for Multi-objective Optimization - SBRN 2012


Slides used in the presentation of the article "Harmony Search for Multi-objective Optimization" in the 2012 Brazilian Symposium on Neural Networks (SBRN). Link to the article: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&arnumber=6374852

Transcript of Harmony Search for Multi-objective Optimization - SBRN 2012

Harmony Search for Multi-objective Optimization

Lucas M. Pavelski, Carolina P. Almeida, Richard A. Goncalves

2012 Brazilian Symposium on Neural Networks — SBRN

October 25th, 2012

Pavelski, Almeida, Goncalves SBRN 2012 1 of 34

Summary

Introduction
Background
  Multi-objective Optimization and MOEAs
  Harmony Search and Variants
Proposed Algorithms
Experimental Results
Conclusions



Introduction

- Multi-objective Optimization
  - Extends mono-objective optimization
  - Lack of extensive study and comparison between existing techniques
  - Computationally expensive methods
- Harmony Search
  - A recent, emerging metaheuristic
  - Its operators are still little explored
  - Simple implementation
- Objectives:
  - Explore Harmony Search in MO, using the well-known NSGA-II framework
  - Test on 10 MO problems from CEC 2009
  - Evaluate results with statistical tests



Background: MO and MOEAs

Multi-objective Optimization Problem

Mathematically [Deb 2011]:

    Min/Max     f_m(x),                     m = 1, ..., M
    subject to  g_j(x) ≥ 0,                 j = 1, ..., J
                h_k(x) = 0,                 k = 1, ..., K
                x_i^(L) ≤ x_i ≤ x_i^(U),    i = 1, ..., n

where f : Ω → Y (⊆ ℝ^M)

Conflicting objectives → multiple optimal solutions


Pareto dominance (minimization):
u ≺ v : ∀i ∈ {1, ..., M}, u_i ≤ v_i and ∃i ∈ {1, ..., M} : u_i < v_i [Coello, Lamont e Veldhuizen 2007]

Figure: Graphical representation of Pareto dominance [Zitzler 1999]
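The dominance relation above can be sketched as a small predicate (a hypothetical helper, written here for the minimization convention):

```python
def dominates(u, v):
    """u Pareto-dominates v (minimization): u is no worse than v in
    every objective and strictly better in at least one."""
    return (all(a <= b for a, b in zip(u, v))
            and any(a < b for a, b in zip(u, v)))
```

For example, `dominates((1, 2), (2, 2))` holds, while two identical objective vectors never dominate each other.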

Multi-Objective Evolutionary Algorithms (MOEAs)

Two main issues in multi-objective optimization [Zitzler 1999]:
- Approximate Pareto-optimal solutions
- Maintain diversity

Evolutionary Algorithms:
- Maintain a population of solutions
- Exploit similarities between solutions
→ Multi-objective Evolutionary Algorithms (MOEAs), like NSGA-II


Non-dominated Sorting Genetic Algorithm II (NSGA-II)

- Proposed in [Deb et al. 2000]
- Successfully applied to many problems
- Non-dominated sorting to obtain solutions close to the Pareto-optimal set
- Crowding distance to maintain diversity
- Genetic Algorithm operators to create new solutions
- Its basic framework is used in the proposed algorithms


Non-dominated Sorting Genetic Algorithm II (NSGA-II) [Deb et al. 2000]

Figure: Non-dominated Sorting [Zitzler 1999]
Figure: Non-dominated Selection [Deb et al. 2000]

Figure: Non-dominated Selection [Deb et al. 2000]
Figure: Crowding distance [Deb et al. 2000]
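The crowding-distance measure illustrated in the figure can be sketched as follows (a minimal reimplementation of the NSGA-II diversity estimate, not the authors' code; `front` is a list of objective tuples):

```python
def crowding_distance(front):
    """NSGA-II crowding distance: for each objective, sort the front and
    accumulate the normalized gap between each solution's two neighbours.
    Boundary solutions get infinite distance so they are always kept."""
    n = len(front)
    dist = [0.0] * n
    if n == 0:
        return dist
    for obj in range(len(front[0])):
        order = sorted(range(n), key=lambda i: front[i][obj])
        fmin, fmax = front[order[0]][obj], front[order[-1]][obj]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if fmax == fmin:
            continue  # degenerate objective: no spread to measure
        for pos in range(1, n - 1):
            i = order[pos]
            dist[i] += (front[order[pos + 1]][obj]
                        - front[order[pos - 1]][obj]) / (fmax - fmin)
    return dist
```

On a symmetric front such as [(1, 5), (2, 4), (3, 3), (4, 2)], the two extreme points get infinite distance and the two interior points get equal distances.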

Background: HS and Variants

Harmony Search (HS) Overview

- New metaheuristic, proposed in [Geem, Kim e Loganathan 2001]
- Simple to implement and customize
- Little explored in MO
- Inspired by jazz musicians: just as performers seek an aesthetically good melody by varying the set of sounds played at each practice, the optimization seeks the global optimum of a function by evolving its component variables at each iteration [Geem, Kim e Loganathan 2001].


Harmony Search (HS)

                  Optimization           Musical performance
Best state        Global Optimum         Fantastic Harmony
Estimated by      Objective Function     Aesthetic Standard
Estimated with    Values of Variables    Pitches of Instruments
Process unit      Each Iteration         Each Practice

Harmony Search Algorithm

function HarmonySearch
    /* 1. Harmony Memory Initialization */
    HM = {x_i ∈ Ω, i ∈ (1, ..., HMS)}
    for t = 0, ..., NI do
        /* 2. Improvisation */
        x_new = improvise(HM)
        /* 3. Memory Update */
        x_worst = argmin_{x_i ∈ HM} f(x_i)
        if f(x_new) > f(x_worst) then
            HM = (HM ∪ {x_new}) \ {x_worst}
        end if
    end for
end function

Harmony Search – Improvise Method

function Improvise(HM) : x_new
    for i = 0, ..., n do
        if r1 < HMCR then
            /* 1. Memory Consideration */
            x_new_i = x_k_i, k ∈ (1, ..., HMS)
            if r2 < PAR then
                /* 2. Pitch Adjustment */
                x_new_i = x_new_i ± r3 × BW
            end if
        else
            /* 3. Random Selection */
            x_new_i = x_i^(L) + r × (x_i^(U) − x_i^(L))
        end if
    end for
end function
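The two pseudocode slides above can be sketched together as a compact Python implementation (a minimal sketch, not the authors' code; maximization as in the memory-update rule on the slide, and function/parameter names chosen for illustration):

```python
import random

def improvise(HM, bounds, HMCR, PAR, BW):
    """One improvisation: memory consideration, pitch adjustment, and
    random selection, applied per decision variable."""
    new = []
    for i, (lo, hi) in enumerate(bounds):
        if random.random() < HMCR:
            xi = random.choice(HM)[i]                  # 1. memory consideration
            if random.random() < PAR:                  # 2. pitch adjustment
                xi = min(max(xi + random.uniform(-1, 1) * BW, lo), hi)
        else:                                          # 3. random selection
            xi = lo + random.random() * (hi - lo)
        new.append(xi)
    return new

def harmony_search(f, bounds, HMS=10, NI=2000, HMCR=0.9, PAR=0.3, BW=0.05):
    """Plain HS main loop (maximization): improvise one harmony per
    iteration and replace the worst memory entry when improved."""
    HM = [[lo + random.random() * (hi - lo) for lo, hi in bounds]
          for _ in range(HMS)]
    for _ in range(NI):
        new = improvise(HM, bounds, HMCR, PAR, BW)
        worst = min(HM, key=f)
        if f(new) > f(worst):                          # memory update
            HM.remove(worst)
            HM.append(new)
    return max(HM, key=f)
```

Maximizing `-(x - 0.5)**2` over [0, 1] quickly concentrates the memory around 0.5.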

Harmony Search Variants

- HS: regular Harmony Search algorithm
- IHS: Improved Harmony Search
- GHS: Global-best Harmony Search
- SGHS: Self-adaptive Global-best Harmony Search
- ...

Improved Harmony Search (IHS)

Fine adjustment of the parameters PAR and BW [Mahdavi, Fesanghary e Damangir 2007]:

    PAR(t) = PARmin + (PARmax − PARmin) / NI × t          (1)

    BW(t) = BWmax × exp( ln(BWmin / BWmax) / NI × t )     (2)
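Equations (1) and (2) are simple schedules and can be sketched directly (illustrative defaults; the paper's own parameter values appear later in the Parameters slide):

```python
import math

def ihs_par(t, NI, par_min=0.01, par_max=0.99):
    """Eq. (1): PAR grows linearly from par_min to par_max over the run."""
    return par_min + (par_max - par_min) / NI * t

def ihs_bw(t, NI, bw_min=0.0001, bw_max=0.05):
    """Eq. (2): BW decays exponentially from bw_max to bw_min."""
    return bw_max * math.exp(math.log(bw_min / bw_max) / NI * t)
```

At t = 0 the schedules return (par_min, bw_max); at t = NI they return (par_max, bw_min), so early iterations explore with large steps and late iterations fine-tune.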

Global-best Harmony Search (GHS)

Inspired by swarm intelligence approaches, involves the best harmony in the improvisation of new ones [Omran e Mahdavi 2008]:

function Improvise(HM) : x_new
    ...
    if r2 < PAR then
        /* Pitch Adjustment */
        x_new_i = x_best_k, k ∈ (1, ..., n)
    end if
    ...
end function
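The GHS twist is only the pitch-adjustment step, which can be sketched in isolation (a hypothetical helper; `best` stands for the best harmony in memory):

```python
import random

def ghs_pitch_adjust(new, i, best):
    """GHS pitch adjustment: instead of perturbing by BW, dimension i of
    the new harmony copies a randomly chosen dimension k of the best
    harmony, as in the operator on the slide."""
    k = random.randrange(len(best))
    new[i] = best[k]
    return new
```

Note that k ranges over dimensions (1, ..., n), so the copied value may come from a different variable of the best harmony.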

Self-adaptive Global-best Harmony Search (SGHS)

Involves the best harmony and provides self-adaptation of the PAR and HMCR parameters [Pan et al. 2010]:

function Improvise(HM) : x_new
    ...
    if r1 < HMCR then
        x_new_i = x_k_i ± r × BW, k ∈ (1, ..., HMS)
        if r2 < PAR then
            x_new_i = x_best_i
        end if
    end if
    ...
end function

    BW(t) = BWmax − (BWmax − BWmin) × 2t / NI    if t < NI/2
            BWmin                                otherwise      (3)
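The piecewise BW schedule of eq. (3) can be sketched as follows (only the bandwidth schedule; the self-adaptation of HMCR and PAR from [Pan et al. 2010] is not shown, and the defaults are illustrative):

```python
def sghs_bw(t, NI, bw_min=0.001, bw_max=0.1):
    """Eq. (3): BW decays linearly from bw_max to bw_min during the
    first half of the run, then stays fixed at bw_min."""
    if t < NI / 2:
        return bw_max - (bw_max - bw_min) * 2 * t / NI
    return bw_min
```

At t = NI/2 the linear branch reaches bw_min exactly, so the schedule is continuous.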


Proposed Algorithms

Non-dominated Sorting Harmony Search – NSHS

function NSHS
    HM = {x_i ∈ Ω, i ∈ (1, ..., HMS)}
    for j = 0, ..., NI/HMS do
        HM_new = ∅
        for k = 0, ..., HMS do
            x_new = improvise(HM)
            HM_new = HM_new ∪ {x_new}
        end for
        HM = HM ∪ HM_new
        HM = NondominatedSorting(HM)
    end for
end function
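The NondominatedSorting selection step can be sketched as repeated extraction of Pareto fronts (a minimal sketch, not the authors' code; crowding-distance tie-breaking inside the last accepted front is omitted for brevity):

```python
def dominates(u, v):
    """u Pareto-dominates v (minimization)."""
    return (all(a <= b for a, b in zip(u, v))
            and any(a < b for a, b in zip(u, v)))

def nondominated_sorting(objs, keep):
    """Rank solutions into Pareto fronts and keep the first `keep`
    indices, front by front. `objs` is a list of objective tuples."""
    remaining = set(range(len(objs)))
    kept = []
    while remaining and len(kept) < keep:
        # current front: solutions not dominated by anything remaining
        front = [i for i in remaining
                 if not any(dominates(objs[j], objs[i])
                            for j in remaining if j != i)]
        remaining -= set(front)
        kept.extend(front)
    return kept[:keep]
```

In NSHS this is applied to the doubled memory (HM ∪ HM_new) to truncate it back to HMS entries.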

Non-dominated Sorting Harmony Search – NSHS

- A different selection scheme: the memory is doubled, then non-dominated sorting + crowding distance are applied
- NSIHS: t is the number of harmonies improvised so far
- NSGHS: x_best_i comes from a random non-dominated solution
- NSSGHS: lp = HMS (one generation), learning from solutions where cd > 0



Experimental Results

Problems

- 10 unconstrained (bound-constrained) problems: UF1, UF2, ..., UF10
- Taken from CEC 2009 [Zhang et al. 2009]
- Difficult to solve, with different characteristics
- n = 30 variables. UF1–UF7: 2 objectives; UF8–UF10: 3 objectives


Parameters

30 executions, 150,000 objective function evaluations, population size (HMS) of 200.

          HMCR    PAR                  BW
NSHS      0.95    0.10                 0.01 × Δx
NSIHS     0.95    PARmin = 0.01        BWmin = 0.0001
                  PARmax = 0.20        BWmax = 0.05 × Δx
NSGHS     0.95    PARmin = 0.01        –
                  PARmax = 0.40
NSSGHS    0.95    0.90                 BWmin = 0.001
                                       BWmax = 0.10 × Δx

NSGA-II: polynomial mutation with probability 1/n and SBX crossover with probability 0.7.

Quality Indicators and Statistical Tests

- Non-parametric tests, PISA framework [Zitzler, Knowles e Thiele 2008]
- Mann-Whitney and dominance ranking
- Quality indicators: hypervolume, additive unary-ε and R2
- Overall performance of each algorithm (macro-evaluation): Mack-Skillings variation of the Friedman test [Mack e Skillings 1980]
- Each algorithm on each instance (micro-evaluation): Kruskal-Wallis test


Kruskal-Wallis test for hypervolume

        NSHS×    NSHS×    NSHS×     NSIHS×   NSIHS×    NSGHS×
        NSIHS    NSGHS    NSSGHS    NSGHS    NSSGHS    NSSGHS
UF1     0.19     0.42     1.0       0.75     1.0       1.0
UF2     0.65     0.21     1.0       0.12     1.0       1.0
UF3     0.09     0.0      0.0       0.0      0.0       0.0
UF4     0.94     0.0      0.96      0.0      0.59      1.0
UF5     0.07     0.0      0.0       0.0      0.0       0.07
UF6     0.5      0.5      0.5       0.5      0.5       0.5
UF7     0.02     0.02     0.8       0.55     1.0       1.0
UF8     0.25     1.0      0.0       1.0      0.0       0.0
UF9     0.97     0.11     0.07      0.0      0.0       0.41
UF10    0.0      0.0      0.0       0.25     0.01      0.04

Quality Indicators and Statistical Tests

- NSHS was among the best algorithms for solving UF3, UF5, UF6, UF7, UF9 and UF10.
- NSIHS, often incomparable to NSHS, performed well on UF3, UF4, UF5, UF6 and UF9.
- NSGHS obtained good results on the 3-objective problems, namely UF8, UF9 and UF10.
- NSSGHS performed well on UF1, UF4 and UF7.


Comparison against NSGA-II (Mann-Whitney)

        Hyp.    Unary-ε   R2
UF1     0.11    0.00      0.08
UF2     1.00    0.03      1.00
UF3     0.00    0.00      0.00
UF4     1.00    1.00      1.00
UF5     0.00    0.00      0.00
UF6     0.00    0.00      0.00
UF7     0.54    0.45      0.57
UF8     0.00    0.00      0.00
UF9     1.00    0.02      0.64
UF10    0.00    0.00      0.00

MS Friedman test: critical difference of 2.795.
- Hypervolume: 28.87 for NSHS and 32.13 for NSGA-II
- Unary-ε: 23.57 for NSHS and 37.43 for NSGA-II
- R2: 27.83 for NSHS and 33.16 for NSGA-II


Conclusions

- Objectives: propose the hybridization of four HS versions with the NSGA-II framework, run the benchmark functions used in CEC 2009, and evaluate the results with quality indicators
- Tests showed that NSHS, the original HS algorithm using non-dominated sorting, was the best among all proposed multi-objective versions
- NSHS compared favorably with the original NSGA-II


Future work

- Effects of other HS variants and parameter values on problems with different characteristics
- Analysis of different aspects: computational effort, comparisons against other MOEAs, etc.
- Adaptation of HS operators to other state-of-the-art frameworks (in progress)


Bibliographic References

COELLO, C. A. C.; LAMONT, G. B.; VELDHUIZEN, D. A. V. Evolutionary Algorithms for Solving Multi-Objective Problems. 2. ed. USA: Springer, 2007.

DEB, K. Multi-Objective Optimization Using Evolutionary Algorithms: An Introduction. [S.l.], 2011.

DEB, K. et al. A Fast Elitist Non-dominated Sorting Genetic Algorithm for Multi-objective Optimisation: NSGA-II. In: Proceedings of the 6th International Conference on Parallel Problem Solving from Nature. London, UK: Springer-Verlag, 2000. (PPSN VI), p. 849–858. ISBN 3-540-41056-2.

GEEM, Z. W.; KIM, J. H.; LOGANATHAN, G. A new heuristic optimization algorithm: Harmony search. SIMULATION, v. 76, n. 2, p. 60–68, 2001.

MACK, G. A.; SKILLINGS, J. H. A Friedman-type rank test for main effects in a two-factor ANOVA. Journal of the American Statistical Association, v. 75, n. 372, p. 947–951, 1980.

MAHDAVI, M.; FESANGHARY, M.; DAMANGIR, E. An improved harmony search algorithm for solving optimization problems. Applied Mathematics and Computation, v. 188, n. 2, p. 1567–1579, 2007.

OMRAN, M.; MAHDAVI, M. Global-best harmony search. Applied Mathematics and Computation, v. 198, n. 2, p. 643–656, 2008.

PAN, Q.-K. et al. A self-adaptive global best harmony search algorithm for continuous optimization problems. Applied Mathematics and Computation, v. 216, n. 3, p. 830–848, 2010.

ZHANG, Q. et al. Multiobjective optimization test instances for the CEC 2009 special session and competition. Mechanical Engineering, CEC2009, n. CES-487, p. 1–30, 2009.

ZITZLER, E. Evolutionary Algorithms for Multiobjective Optimization: Methods and Applications. PhD Thesis — ETH Zurich, Switzerland, 1999.

ZITZLER, E.; KNOWLES, J.; THIELE, L. Quality assessment of Pareto set approximations. Springer-Verlag, Berlin, Heidelberg, p. 373–404, 2008.

Acknowledgments

- Fundacao Araucaria
- UNICENTRO
- Friends and colleagues

Thank you for your attention! Questions?
