Multi-criteria meta-parameter tuning for mono-objective stochastic metaheuristics
"Tha
les
con
f ide
ntia
l. A
ll rig
hts
r ese
rved
"
Research & Technology
Multi-objective meta-parameter tuning for mono-objective stochastic metaheuristics
Johann Dréo, THALES Research & Technology
Introduction
Multi-objective method
Parameter tuning
Stochastic metaheuristics
Performance profiles
(Dréo & Siarry, 2004)
Stochastic metaheuristics
Examples of stochastic metaheuristics
Parameter setting
Meta-parameter tuning
As a mono-objective problem
Parameter setting: improve performance
As a multi-objective problem
Parameter setting: what is performance?
→ a multi-objective problem
Multi-objective problem
What is performance? Precision, speed, robustness (of precision and of speed across runs), and stability (across a benchmark of problems).
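As a rough sketch of how these criteria can be estimated from repeated runs of a stochastic optimizer (the toy random-search method, the 1e-1 error target, and all names below are illustrative assumptions, not the method of these slides):

```python
import random
import statistics

def rosenbrock(x, y):
    # Classic 2-D Rosenbrock test function, global minimum 0 at (1, 1).
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2

def random_search(max_evals, step, rng):
    # Toy stochastic metaheuristic: Gaussian perturbations of the best point.
    best = (rng.uniform(-2, 2), rng.uniform(-2, 2))
    best_f = rosenbrock(*best)
    evals_to_target = None  # "speed": evaluations until error < 1e-1
    for i in range(1, max_evals):
        cand = (best[0] + rng.gauss(0, step), best[1] + rng.gauss(0, step))
        f = rosenbrock(*cand)
        if f < best_f:
            best, best_f = cand, f
        if evals_to_target is None and best_f < 1e-1:
            evals_to_target = i
    return best_f, evals_to_target or max_evals

def performance(step, runs=10, max_evals=2000, seed=42):
    # Aggregate the three criteria over repeated runs.
    rng = random.Random(seed)
    errors, speeds = [], []
    for _ in range(runs):
        err, spd = random_search(max_evals, step, rng)
        errors.append(err)
        speeds.append(spd)
    return {
        "precision": statistics.median(errors),   # median final error
        "speed": statistics.median(speeds),       # median evals to target
        "robustness": statistics.pstdev(errors),  # dispersion across runs
    }

print(performance(step=0.1))
```

Precision and speed become the two objectives of the tuning problem below; robustness (dispersion) is the "include dispersion estimation" perspective of the last slide.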
Meta-parameter tuning
The multi-objective parameter tuning problem: a meta-optimizer tunes the stochastic metaheuristic, which in turn solves the mono-objective problem.

Complexity
The mono-objective problem is difficult; the meta-level tuning problem is easier, and is solved only 1 time.
Methodology
Speed / precision: median estimation over repeated runs.
Meta-optimizer: NSGA-2, tuning the stochastic metaheuristic on the mono-objective problem.
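A minimal sketch of this pipeline (plain random sampling of the tuned parameter stands in for NSGA-2, and the speed/precision trade-off of the metaheuristic is simulated; every name here is an illustrative assumption):

```python
import random
import statistics

def meta_objectives(param, rng, runs=10):
    # One meta-evaluation: "run" the stochastic metaheuristic `runs` times
    # and return medians of (speed, error), both to be minimized.
    # Synthetic trade-off model: larger `param` converges faster (low error
    # of speed) but less precisely -- purely illustrative.
    speeds, errors = [], []
    for _ in range(runs):
        speeds.append(1.0 / param * (1 + 0.1 * rng.random()))
        errors.append(param * (1 + 0.1 * rng.random()))
    return statistics.median(speeds), statistics.median(errors)

def pareto_front(points):
    # Keep the non-dominated points (minimization on both objectives).
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

rng = random.Random(0)
# Random sampling of the tuned parameter stands in for NSGA-2 here.
samples = [rng.uniform(0.05, 2.0) for _ in range(50)]
points = [meta_objectives(p, rng) for p in samples]
print(sorted(pareto_front(points)))
```

The printed set of non-dominated (speed, precision) pairs is exactly the "performance profile / front" plotted in the next slides.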
Results plots
Speed vs. precision: the performance profile / front.
Some results
Example
2 continuous EDAs (CEDA, CHEDA)
Sampling density parameter
Rosenbrock, 2 dimensions
Median estimated with 10 runs
10 000 max. evaluations
NSGA-2: 20 iterations, 50 individuals, 10 runs
3 days of computation
+ Nelder-Mead search
Example
+ simulated annealing
Stable-temperature parameter
Rosenbrock, 2 dimensions
Median estimated with 10 runs
10 000 max. evaluations
NSGA-2: 20 iterations, 50 individuals, 10 runs
1 day of computation
Example
+ genetic algorithm
Population-size parameter
Rosenbrock, 2 dimensions
Median estimated with 10 runs
10 000 max. evaluations
NSGA-2: 20 iterations, 50 individuals, 10 runs
1 day of computation
[Figure: speed vs. precision fronts for SA, JGEN, CEDA and CHEDA]
Behaviour exploration
[Figure: speed vs. precision for the genetic algorithm as the population size varies]
Performance front
Temporal planner ''Divide & Evolve > CPT'', version ''GOAL''
2 mutation parameters
IPC ''rovers'' problem, instance 06
Median estimated with 10 runs
NSGA-2: 10 iterations, 5 individuals, 30 runs
1 week of computation for 1 run
Performance front in parameter space
[Figure: the front in objective space (speed, precision) and in parameter space (M1, M2)]

Previous parameter settings
Conclusion
Drawbacks
Computation cost.
Stochastic multi-objective algorithm → supplementary bias.
Valid only for: the algorithm implementation, the problem instance, and the stopping criterion (error, time, t steps, improvement < ε).
Fronts are often convex → aggregations?
No benchmarking.
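Since the observed fronts are often convex, a weighted-sum aggregation could in principle recover them one point at a time, turning the tuning back into a series of mono-objective problems. A toy sketch (the candidate points are made up for illustration):

```python
def weighted_sum_front(points, n_weights=11):
    # For a convex front, minimizing w*speed + (1-w)*precision over a sweep
    # of weights w in [0, 1] recovers Pareto-optimal points one at a time.
    front = set()
    for i in range(n_weights):
        w = i / (n_weights - 1)
        front.add(min(points, key=lambda p: w * p[0] + (1 - w) * p[1]))
    return sorted(front)

# Hypothetical (speed, precision) pairs on a convex trade-off curve,
# plus one dominated point that the aggregation never selects.
candidates = [(x, 1.0 / x) for x in (1, 2, 4, 8)] + [(5.0, 5.0)]
print(weighted_sum_front(candidates))
# → [(1, 1.0), (2, 0.5), (4, 0.25), (8, 0.125)]
```

The caveat implicit in the slide: on non-convex stretches of a front, no weight vector selects the intermediate points, so aggregation only pays off if the convexity observed here holds in general.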
Advantages
Performance profiles: in objective space and in parameter space; quantification of expert knowledge.
Automatic parameter tuning: one step before use; N parameters → 1 parameter; more degrees of freedom.
Algorithm comparison: statistical tests are more meaningful.
Behaviour understanding.
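One way such a comparison becomes more meaningful: once each algorithm is tuned over its whole front, a distribution-level test compares the algorithms themselves rather than two arbitrary parameter settings. Below, a pure-Python permutation test on the difference of medians (the error samples are fabricated for illustration; this is not the statistical procedure of the slides):

```python
import random
import statistics

def perm_test_median(a, b, n_perm=2000, seed=1):
    # Two-sided permutation test on |median(a) - median(b)|: shuffle the
    # pooled samples and count how often the shuffled difference is at
    # least as large as the observed one.
    rng = random.Random(seed)
    observed = abs(statistics.median(a) - statistics.median(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        d = abs(statistics.median(pooled[:len(a)])
                - statistics.median(pooled[len(a):]))
        if d >= observed:
            hits += 1
    return hits / n_perm  # estimated p-value

# Hypothetical final errors of two tuned algorithms over 10 runs each.
errors_a = [0.11, 0.09, 0.12, 0.10, 0.08, 0.13, 0.11, 0.10, 0.09, 0.12]
errors_b = [0.21, 0.19, 0.22, 0.20, 0.18, 0.23, 0.21, 0.20, 0.19, 0.22]
print(perm_test_median(errors_a, errors_b))
```

A small p-value here indicates the two error distributions genuinely differ, independently of any single hand-picked parameter setting.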
Perspectives
Include robustness
Include dispersion estimation
Include benchmarking
Multi-objective SPO, F-Race
Regressions in parameter space: performances / parameters
Behaviour models?
Links with fitness landscapes, performance profiles, run-time distributions, Taillard's significance plots, ...