
Lecture 1: An Introduction to Optimization – Classification and Case Study

An Introduction to Metaheuristics: Produced by Qiangfu Zhao (Since 2012), All rights reserved (C) Lec01/1


Un-constrained Optimization

β€’ Generally speaking, an optimization problem has an objective function f(x).

β€’ The problem is represented by

min (max) f(x), for all x

β€’ This is called an un-constrained optimization problem (η„‘εˆΆη΄„ζœ€ι©εŒ–ε•ι‘Œ).


Un-constrained Optimization

β€’ Usually, x is a β€œpoint” in an N-dimensional Euclidean space R^N, and f(x) is a point in R^M.

β€’ In this course, we study only the case in which 𝑀 = 1. That is, we have only one objective to optimize.

β€’ Some special considerations are needed to extend the results obtained here to β€œmultiple objective” cases.

β€’ Interested students may also study optimization in β€œnon-Euclidean” spaces (i.e. manifolds).


Constrained Optimization

β€’ The domain can also be a subset D of R^N.

β€’ We then have a constrained optimization problem:

min (max) f(x)
s.t. x ∈ D

β€’ D in turn can be defined by some functions:

– x_i > 0, i = 1, 2, …

– g_j(x) > 0, j = 1, 2, …


Linear programming (η·šεž‹θ¨ˆη”»ζ³•)

β€’ If both f(x) and all g_j(x) are linear functions, we have a linear optimization problem, usually called linear programming (LP).

β€’ For LP, we have very efficient algorithms already, and meta-heuristics are not needed.


Non-linear programming (ιžη·šε½’θ¨ˆη”»ζ³•)

β€’ If f(x) or any g_j(x) is non-linear, we have a non-linear optimization problem, often called non-linear programming (NLP).

β€’ Many methods have been proposed to solve this class of problems.

β€’ However, conventional methods usually find only locally optimal solutions. Meta-heuristic methods are useful for finding global solutions.


Local optima and global optima

β€’ For a minimization problem:

– A solution x* is locally optimal if f(x*) ≀ f(x) for all x in the Ξ΅-neighborhood of x*, where Ξ΅ > 0 is a real number, the radius of the neighborhood.

– A solution x* is globally optimal if f(x*) ≀ f(x) for all x in the search space (the problem domain).

β€’ Meta-heuristics are useful for obtaining global optimal solutions efficiently.
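The distinction can be seen on a simple one-dimensional function (a made-up illustration, not from the lecture): f(x) = x^4 βˆ’ 3x^2 + x has a global minimum near x β‰ˆ βˆ’1.30 and a shallower local minimum near x β‰ˆ 1.13, so a gradient method started on the wrong side gets trapped.

```python
# Illustration (assumed example): local vs. global minima of
#   f(x) = x**4 - 3*x**2 + x
def f(x):
    return x**4 - 3*x**2 + x

def fprime(x):
    return 4*x**3 - 6*x + 1

# Crude grid search over [-2, 2] approximates the *global* minimum.
xs = [i / 1000.0 for i in range(-2000, 2001)]
x_global = min(xs, key=f)

# Gradient descent started at x = 2 slides into the *local* minimum
# on the right, illustrating how a conventional method can get trapped.
x = 2.0
for _ in range(10000):
    x -= 0.01 * fprime(x)
x_local = x

print(f"global minimum near {x_global:.2f}, gradient descent found {x_local:.2f}")
```

Since f(x_global) < f(x_local), the gradient method's answer is only locally optimal; this is exactly the situation where meta-heuristics help.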


Example 1: Linear Programming

β€’ Two materials are used for making two products.

β€’ The prices of the products are 25 and 31 (in million yen), and those of the materials are 0.5 and 0.8 (in million yen).

β€’ Suppose that we produce x1 units of product 1, and x2 units of product 2.

β€’ We can get 25*x1 + 31*x2 million yen by selling the products.

β€’ On the other hand, we must pay (7*x1 + 5*x2)*0.5 + (4*x1 + 8*x2)*0.8 million yen to buy the materials.

             Used in Product 1   Used in Product 2
Material 1          7                   5
Material 2          4                   8


Example 1: Linear Programming

β€’ The problem can be formulated as follows:

max f(x1, x2) = 18.3*x1 + 22.1*x2
s.t. x1 > 0; x2 > 0;
     6.7*x1 + 8.9*x2 < B

β€’ The first set of constraints means that both products should be produced to satisfy social needs; and the second constraint is the budget limitation.

β€’ This is a typical linear programming problem, and can be solved efficiently using the well-known simplex algorithm.
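A minimal sketch of solving this LP with SciPy (not part of the lecture). The budget B is not specified in the slides, so B = 100 is an assumed value purely for illustration; also, scipy.optimize.linprog minimizes, so the objective is negated, and the slide's strict inequalities are relaxed to the closed constraints LP solvers require.

```python
# Sketch only: solve the Example 1 LP, assuming a budget of B = 100.
from scipy.optimize import linprog

B = 100.0  # assumed budget (million yen), not given in the lecture

res = linprog(
    c=[-18.3, -22.1],               # negate to maximize 18.3*x1 + 22.1*x2
    A_ub=[[6.7, 8.9]],              # material cost: 6.7*x1 + 8.9*x2 <= B
    b_ub=[B],
    bounds=[(0, None), (0, None)],  # x1 >= 0, x2 >= 0
    method="highs",
)

x1, x2 = res.x
print(f"x1 = {x1:.3f}, x2 = {x2:.3f}, profit = {-res.fun:.2f}")
```

Since product 1 yields more profit per yen of material cost (18.3/6.7 > 22.1/8.9), the optimum spends the whole budget on x1.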


Example 2: Non-linear programming

β€’ Given N observations (x1, p(x1)), (x2, p(x2)), …, (xN, p(xN)) of an unknown function p(x).

β€’ Find a polynomial q(x) = a0 + a1*x + a2*x^2 such that

min f(a0, a1, a2) = Ξ£_{i=1..N} (p(x_i) βˆ’ q(x_i))Β² + Ξ» β€–q(x)β€–Β²

β€’ Note that in this problem q(x) is also a function of a0, a1, and a2.

β€’ The first term is the approximation error, and the second term is a regularization factor that can make the solution better (e.g. smoother).
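A minimal sketch of this fitting problem (not the lecture's code). One common choice of the regularization norm is the L2 norm of the coefficient vector, which gives the closed-form ridge solution a = (X^T X + Ξ»I)^(-1) X^T y; the sample p(x) and noise level below are assumptions for illustration.

```python
# Sketch only: fit q(x) = a0 + a1*x + a2*x^2 to noisy samples of an
# "unknown" p(x), minimizing squared error + lam * ||a||^2 (ridge regression).
import numpy as np

rng = np.random.default_rng(0)

def p(x):
    # Stand-in for the unknown function (assumed for this illustration).
    return 1.0 + 2.0 * x - 0.5 * x**2

xs = np.linspace(-2, 2, 25)
ys = p(xs) + rng.normal(scale=0.05, size=xs.size)  # noisy observations

X = np.vander(xs, 3, increasing=True)  # design matrix, columns [1, x, x^2]
lam = 1e-3                             # regularization weight (lambda)

# Closed-form ridge solution: a = (X^T X + lam*I)^(-1) X^T y
a = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ ys)
print("fitted coefficients:", np.round(a, 2))
```

With small noise the fitted coefficients land close to the true (1.0, 2.0, βˆ’0.5); larger Ξ» shrinks them toward zero, trading approximation error for smoothness.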


Combinatorial optimization problems

β€’ If f(x) or g_j(x) cannot be given analytically (in closed form), we have a combinatorial problem.

β€’ For example, if each variable takes k discrete values (e.g. integers), and there are K variables, the number of all possible solutions is k^K.

β€’ It is difficult to check all possible solutions in order to find the best one(s).

β€’ In such cases, meta-heuristics can provide efficient ways for obtaining good solutions using limited resources (e.g. time and memory space).
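The k^K blow-up can be made concrete with a short sketch (illustration only; the objective below is an assumed stand-in for a black-box function):

```python
# Brute-force minimization over all k**K candidate solutions.
from itertools import product

def objective(x):
    # Assumed black-box objective (not from the lecture): best when all xi == 1.
    return sum((xi - 1) ** 2 for xi in x)

k, K = 3, 8                            # 3 values per variable, 8 variables
candidates = product(range(k), repeat=K)  # k**K = 6561 candidates

best = min(candidates, key=objective)
print("best:", best, "checked:", k**K, "solutions")
```

Already at k = 10 and K = 20 there are 10^20 candidates, which is why exhaustive search is abandoned in favor of heuristics.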


Example 3: Traveling salesman problem (TSP)

β€’ Given N users located in N different places (cities).

β€’ The problem is to find a route such that the salesman visits every user once (and only once), starting from and returning to his own place (i.e., finding a Hamiltonian cycle).


[Figure: TSP illustration (from Wikipedia)]


Example 3: Traveling salesman problem (TSP)

β€’ In TSP, we have a route map which can be represented by a graph.

β€’ Each node is a user, and the edge between each pair of nodes has a cost (distance or time).

β€’ The evaluation function to be minimized is the total cost of the route.


[Figure: TSP route graph (from Wikipedia)]


For TSP, the number of all possible solutions is 𝑁!, and this is a well-known NP-hard combinatorial problem.
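A brute-force sketch makes the N! growth concrete (illustration only; the 5-city cost matrix is made up). Fixing the start city and enumerating permutations of the rest is only feasible for very small N:

```python
# Brute-force TSP over all permutations of the remaining cities.
from itertools import permutations

# Assumed symmetric cost matrix for 5 cities (made-up numbers).
cost = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]
N = len(cost)

def tour_cost(tour):
    # Total cost of visiting cities in order and returning to the start.
    return sum(cost[tour[i]][tour[(i + 1) % N]] for i in range(N))

# Fix city 0 as the salesman's own place to avoid counting rotations.
best_tour = min(
    ((0,) + rest for rest in permutations(range(1, N))),
    key=tour_cost,
)
print("best tour:", best_tour, "cost:", tour_cost(best_tour))
```

Even with the start city fixed this checks (N βˆ’ 1)! routes, so at N = 20 there are already about 10^17 tours, which is where meta-heuristics take over.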


NP-hard and NP-complete

β€’ Problems that can be solved by a deterministic algorithm in polynomial time are said to belong to class P.

β€’ NP is a class of decision problems that can be solved by a non-deterministic algorithm in polynomial time.

β€’ A problem H is NP-hard if it is at least as hard as every problem in NP.

β€’ NP-hard problems that are themselves in NP (as decision problems) are NP-complete.

β€’ For NP-complete and NP-hard problems, meta-heuristics can often find good solutions more efficiently than exhaustive search.


[Diagram: Venn-style relationship among P, NP, NP-complete, and NP-hard]


Example 4: The Knapsack problem

β€’ The knapsack problem is another NP-hard problem, defined by:

– There are N objects;

– Each object has a weight and a value;

– The knapsack has a capacity;

– The user has a quota (minimum desired value);


The problem is to find a subset of the objects that fits into the knapsack and maximizes the total value.


Example 4: The Knapsack problem

KNAPSACK (in  OS : set of objects; QUOTA : number; CAPACITY : number;
          out S : set of objects; FOUND : boolean)
begin
  S := empty;
  total_value := 0; total_weight := 0; FOUND := false;
  pick an order L over the objects;
  loop
    choose an object O in L;
    add O to S;
    total_value := total_value + O.value;
    total_weight := total_weight + O.weight;
    if total_weight > CAPACITY then
      fail
    else if total_value >= QUOTA then
      FOUND := true; succeed;
    end
    delete all objects up to O from L;
  end
end


This is a non-deterministic algorithm. Each time we run the program, we get a different answer. By chance, we may get the best answer.
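One way to read the pseudocode is as a randomized procedure, with the non-deterministic "choose" simulated by a random pick. The sketch below is an interpretation, not the lecture's exact code; the item data and the quota/capacity values are assumed, and the seed is set only so the runs are reproducible here.

```python
# Randomized simulation of the non-deterministic KNAPSACK procedure.
import random

random.seed(0)  # for reproducibility only; normally each run differs

def knapsack(objects, quota, capacity, rng=random):
    """objects: list of (value, weight) pairs. Returns (S, FOUND)."""
    order = list(objects)
    rng.shuffle(order)                  # "pick an order L over the objects"
    s, total_value, total_weight = [], 0, 0
    while order:
        i = rng.randrange(len(order))   # "choose an object O in L"
        obj = order[i]
        s.append(obj)                   # "add O to S"
        total_value += obj[0]
        total_weight += obj[1]
        if total_weight > capacity:
            return s, False             # "fail"
        if total_value >= quota:
            return s, True              # "succeed"
        del order[: i + 1]              # "delete all objects up to O from L"
    return s, False

items = [(10, 5), (8, 4), (6, 3), (4, 2)]   # assumed (value, weight) data
trials = [knapsack(items, quota=14, capacity=9) for _ in range(200)]
successes = [s for s, ok in trials if ok]
print("successes:", len(successes), "out of", len(trials))
```

Repeating the procedure many times and keeping any successful run is already a crude meta-heuristic: each trial is cheap, and by chance some trials satisfy the quota within the capacity.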


Example 5: Learning problems

β€’ Many optimization problems related to machine learning (learning from a given set of training data) are NP-hard/complete.

β€’ Examples include:

– Finding the smallest feature subset;

– Finding the most informative training data set;

– Finding the smallest decision tree;

– Finding the best clusters;

– Finding the best neural network;

– Interpreting a learned neural network.


Homework

β€’ Try to find some other examples of optimization problems (at least two) from the Internet.

β€’ State whether each problem is NP-hard, NP-complete, in NP, or in P.

β€’ Provide a solution (not necessarily the best one) for each of the problems.

β€’ Summarize your answers in a PDF file, and submit a printed copy before next week's class.
