Transcript of Optimization with Neural Networks

Page 1:

Optimization with Neural Networks

Presented by: Mahmood Khademi, Babak Bashiri

Instructor: Dr. Bagheri

Sharif University of Technology, April 2007

Page 2:

Introduction

An optimization problem consists of two parts: a cost function and constraints.

Constrained: the constraints are built into the cost function, so minimizing the cost function also satisfies the constraints.

Unconstrained: there are no constraints on the problem!

Combinatorial: we separate the constraints and the cost function, minimize each of them, and then add them together.

Page 3:

Applications

Applications in many fields, such as:

Routing in computer networks
VLSI circuit design
Planning in operational and logistic systems
Power distribution systems
Wireless and satellite communication systems

Page 4:

Basic idea

Let X1, X2, ..., Xn be the decision variables and F(X1, X2, ..., Xn) our objective function. Constraints can be expressed as nonnegative penalty functions Ci(X1, X2, ..., Xn) that equal 0 only when X1, X2, ..., Xn represent a feasible solution.

By combining the penalty functions with F, the original constrained problem may be reformulated as an unconstrained problem in which the goal is to minimize the quantity:

F' = F(X1, X2, ..., Xn) + Σ (k = 1, ..., m) Ck(X1, X2, ..., Xn)
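The penalty-function reformulation above can be sketched in a few lines. The objective F, the constraint, and the weight `mu` below are all made-up examples for illustration, not taken from the slides:

```python
# A minimal sketch of the penalty-function reformulation: fold the
# constraints into the cost so that minimizing F' handles both.

def F(x):
    """Example objective: a simple quadratic cost (hypothetical)."""
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

def penalties(x):
    """Nonnegative penalty terms that are 0 only for feasible solutions.
    The assumed example constraint is x[0] + x[1] <= 1."""
    violation = max(0.0, x[0] + x[1] - 1.0)
    return [violation ** 2]

def F_prime(x, mu=10.0):
    """Unconstrained reformulation: F'(x) = F(x) + mu * sum_k C_k(x)."""
    return F(x) + mu * sum(penalties(x))

# A feasible point incurs no penalty; an infeasible point is penalized.
print(F_prime([0.0, 0.0]) == F([0.0, 0.0]))  # feasible: penalty is 0
print(F_prime([2.0, 2.0]) > F([2.0, 2.0]))   # infeasible: penalized
```

Minimizing F' with any unconstrained method then trades off cost against constraint violation, which is exactly the mapping neural approaches exploit.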

Page 5:

TSP

The TSP is simple to state but very difficult to solve. The problem is to find the shortest possible tour through a set of N vertices so that each vertex is visited exactly once.

This problem is known to be NP-complete.
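The difficulty is easy to see by brute force: checking every tour works only for tiny N, which is why heuristics such as neural networks are of interest. The four city coordinates below are a made-up example (the corners of a unit square):

```python
# Brute-force TSP: try every tour and keep the shortest.
from itertools import permutations
from math import dist

cities = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]

def tour_length(order):
    """Total length of the closed tour visiting cities in `order`."""
    return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Fix city 0 as the start to avoid counting rotations of the same tour.
best = min(permutations(range(1, len(cities))),
           key=lambda rest: tour_length((0,) + rest))
print(tour_length((0,) + best))  # the square's perimeter tour: 4.0
```

The number of distinct tours grows as (N-1)!/2, so this approach is hopeless beyond a few dozen cities.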

Page 6:

Why neural networks?

Drawbacks of conventional computing systems:
Perform poorly on complex problems
Lack the computational power
Don't utilize the inherent parallelism of problems

Advantages of artificial neural networks:
Perform well even on complex problems
Very fast computational cycles if implemented in hardware
Can take advantage of the inherent parallelism of problems

Page 7:

Some efforts to solve optimization problems

Many ANN algorithms with different architectures have been used to solve different optimization problems.

We've selected:
Hopfield NN
Elastic Net
Self-Organizing Map NN

Page 8:

Hopfield-Tank model

The TSP must be mapped, in some way, onto the neural network structure. Each row corresponds to a particular city and each column to a particular position in the tour.

Page 9:

Mapping the TSP to a Hopfield neural net

There is a connection between each pair of units. The signal sent along the connection from unit i to unit j is equal to the weight Tij if unit i is activated, and equal to 0 otherwise.

A negative weight defines an inhibitory connection between the two units: it is unlikely that two units connected by a negative weight will be active, or "on", at the same time.

Page 10:

Discrete Hopfield model

The connection weights are not learned. The Hopfield network evolves by updating the activation of each unit in turn; in the final state, all units are stable according to the update rule. The units are updated at random, one unit at a time.

Notation: {Vi}, i = 1, ..., L
L: number of units
Vi: activation level of unit i
Tij: connection weight between units i and j
theta_i: threshold of unit i
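The asynchronous update just described can be sketched directly in the slide's notation. The tiny three-unit network below (weights, thresholds, and initial state) is an arbitrary example, not from the slides:

```python
# Asynchronous discrete Hopfield update: pick a unit at random and set
# V_i = 1 if its net input reaches the threshold theta_i, else 0.
import random

T = [[0.0, -1.0, 2.0],      # symmetric weights, zero diagonal (example)
     [-1.0, 0.0, 1.0],
     [2.0, 1.0, 0.0]]
theta = [0.5, 0.0, -0.5]    # unit thresholds (example)
V = [0, 1, 0]               # initial activation levels (example)

random.seed(0)
for _ in range(50):                      # update units at random, one at a time
    i = random.randrange(len(V))
    net = sum(T[i][j] * V[j] for j in range(len(V)))
    V[i] = 1 if net >= theta[i] else 0   # threshold update rule

print(V)  # a stable state: no unit wants to flip
```

Because the weights are symmetric with a zero diagonal, these updates cannot cycle, which is the convergence property stated on the next slide.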

Page 11:

Discrete Hopfield model (cont.)

Energy function

A unit changes its activation level if and only if the energy of the network decreases by doing so.

Since the energy can only decrease over time and the number of configurations is finite, the network must converge (but not necessarily to the minimum-energy state).
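The energy function itself is not reproduced in the transcript; in the standard discrete Hopfield formulation, using the notation above (Vi, Tij, theta_i), it reads:

```latex
E = -\frac{1}{2} \sum_{i=1}^{L} \sum_{j=1}^{L} T_{ij} V_i V_j \;+\; \sum_{i=1}^{L} \theta_i V_i
```

Flipping a single unit changes E by a term with the opposite sign of its net input minus the threshold, which is why the update rule above decreases E at every change.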

Page 12:

Continuous Hopfield-Tank

The neuron activation function is continuous (a sigmoid function). The evolution of the units over time is now characterized by the following differential equation, where Ui, Ii and Vi are the input, input bias, and activation level of unit i, respectively.
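The differential equation is missing from the transcript; in Hopfield and Tank's standard formulation, with a time constant tau and a sigmoid gain parameter u0 (both assumed here), it is:

```latex
\frac{dU_i}{dt} = -\frac{U_i}{\tau} + \sum_{j} T_{ij} V_j + I_i,
\qquad
V_i = g(U_i) = \frac{1}{2}\left(1 + \tanh\frac{U_i}{u_0}\right)
```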

Page 13:

Continuous Hopfield-Tank

Energy function

A discrete-time approximation is applied to the equations of motion.
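A minimal sketch of such a discrete-time approximation (forward Euler with step dt), assuming the standard Hopfield-Tank equations of motion; the two-unit network and all parameter values below are made-up examples:

```python
# Forward-Euler integration of dU_i/dt = -U_i/tau + sum_j T_ij V_j + I_i,
# with the sigmoid activation V_i = (1 + tanh(U_i / u0)) / 2.
import math

T = [[0.0, 1.0],         # example symmetric weights
     [1.0, 0.0]]
I = [0.1, -0.1]          # example input biases
tau, u0, dt = 1.0, 0.02, 0.01

U = [0.0, 0.0]           # internal inputs

def g(u):                # sigmoid activation
    return 0.5 * (1.0 + math.tanh(u / u0))

for _ in range(2000):    # integrate the equations of motion
    V = [g(u) for u in U]
    dU = [-U[i] / tau + sum(T[i][j] * V[j] for j in range(2)) + I[i]
          for i in range(2)]
    U = [U[i] + dt * dU[i] for i in range(2)]

V = [g(u) for u in U]
print(V)  # activations settle at a stable fixed point of the dynamics
```

The step size dt must be small relative to tau, otherwise the discrete approximation can overshoot and fail to track the continuous descent.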

Page 14:

Application of the Hopfield-Tank Model to the TSP

Page 15:

Application of the Hopfield-Tank model to the TSP

(1) The TSP is represented as an N×N matrix.
(2) Energy function.
(3) Bias and connection weights are derived.
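The energy function for this mapping is not reproduced in the transcript; in Hopfield and Tank's original formulation (with V_Xi = 1 meaning city X occupies tour position i, d_XY the distance between cities X and Y, and A, B, C, D penalty weights) it reads:

```latex
E = \frac{A}{2}\sum_{X}\sum_{i}\sum_{j\neq i} V_{Xi}V_{Xj}
  + \frac{B}{2}\sum_{i}\sum_{X}\sum_{Y\neq X} V_{Xi}V_{Yi}
  + \frac{C}{2}\Bigl(\sum_{X}\sum_{i} V_{Xi} - N\Bigr)^{2}
  + \frac{D}{2}\sum_{X}\sum_{Y\neq X}\sum_{i} d_{XY}\,V_{Xi}\bigl(V_{Y,i+1} + V_{Y,i-1}\bigr)
```

The A and B terms penalize more than one active unit per row or column, the C term forces exactly N active units, and the D term measures tour length; the bias and connection weights of step (3) are read off by matching this expression against the generic Hopfield energy.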

Page 16:

Application of the Hopfield-Tank model to the TSP

Page 17:

Results of Hopfield-Tank

Hopfield and Tank were able to solve a randomly generated 10-city problem, with parameter values A = B = 500, C = 200, N = 15.

They reported that over 20 trials, the network converged 16 times to feasible tours. Half of those tours were one of the two optimal tours.

Page 18:

The size of each black square indicates the value of the output of the corresponding neuron

Page 19:

The main weaknesses of the original Hopfield-Tank model

Page 20:

The main weaknesses of the original Hopfield-Tank model

(d) The model is plagued by the limitations of "hill-climbing" approaches.
(e) The model does not guarantee feasibility.

Page 21:

The main weaknesses of the original Hopfield-Tank model

The positive points:
Can be easily implemented in hardware
Can be applied to non-Euclidean TSPs

Page 22:

Elastic net (Willshaw-Von der Malsburg)

Page 23:

Elastic net

Page 24:

Energy function for Elastic net
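The energy function itself is missing from the transcript; in the Durbin-Willshaw elastic net formulation (with cities x_i, ring points y_j, a scale parameter K that is annealed toward zero, and weights alpha and beta) it is:

```latex
E = -\alpha K \sum_{i} \ln \sum_{j} \exp\!\Bigl(-\frac{\lVert x_i - y_j \rVert^2}{2K^2}\Bigr)
  \;+\; \beta K \sum_{j} \lVert y_{j+1} - y_j \rVert^2
```

The first term pulls the ring points toward the cities; the second keeps the ring short, so as K shrinks the ring settles into a short tour that passes through every city.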

Page 25:

The self-organizing map

SOMs are instances of "competitive NNs", used by unsupervised learning systems to classify data.

Adjusting the weights
Related to the elastic net
Differs from the elastic net

Page 26:

Competitive network

Groups a set of I-dimensional input patterns into K clusters (K <= M).
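Competitive ("winner-take-all") learning can be sketched in a few lines: K weight vectors compete for each input pattern, and only the winner's weights move toward the input. The two-cluster data and all parameters below are made-up examples:

```python
# Winner-take-all competitive learning: the unit whose weight vector is
# closest to the input wins, and only its weights are adjusted.
import random

random.seed(1)
# Two well-separated 2-D clusters of input patterns (I = 2 dimensions).
patterns = ([(random.gauss(0, 0.1), random.gauss(0, 0.1)) for _ in range(50)] +
            [(random.gauss(5, 0.1), random.gauss(5, 0.1)) for _ in range(50)])

K = 2                                    # number of cluster units
w = [[1.0, 1.0], [4.0, 4.0]]             # initial weight vectors (example)
lr = 0.1                                 # learning rate

for _ in range(20):                      # training epochs
    for x in patterns:
        # The winner is the unit whose weights are closest to the input.
        win = min(range(K), key=lambda k: (w[k][0] - x[0]) ** 2
                                          + (w[k][1] - x[1]) ** 2)
        # Only the winner moves toward the pattern.
        w[win] = [w[win][d] + lr * (x[d] - w[win][d]) for d in range(2)]

print(w)  # each weight vector has settled near one cluster centre
```

The SOM extends this rule by also moving the winner's neighbours, which is the property the TSP mapping on the next slides relies on.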

Page 27:

SOM in the TSP context

A set of 2-dimensional coordinates must be mapped onto a set of 1-dimensional positions in the tour.
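This 2-D-to-1-D mapping can be sketched as a ring of units in the plane that is pulled toward the cities; because the winner's ring neighbours also move, the 1-D order of the units along the ring becomes the tour order. Everything below (cities, ring size, learning schedule) is a made-up example, not the slides' own algorithm:

```python
# SOM sketch for the TSP: units live on a 1-D ring but have 2-D weights.
import math
import random

random.seed(0)
cities = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
M = 8                                   # units on the ring
# Initialise the ring on a small circle around the cities' centre.
units = [(0.5 + 0.3 * math.cos(2 * math.pi * k / M),
          0.5 + 0.3 * math.sin(2 * math.pi * k / M)) for k in range(M)]

for epoch in range(200):
    lr = 0.8 * (1 - epoch / 200)        # learning rate decays over time
    for x in random.sample(cities, len(cities)):
        win = min(range(M), key=lambda k: math.dist(units[k], x))
        for k in range(M):
            ring = min(abs(k - win), M - abs(k - win))  # distance on the ring
            h = math.exp(-ring ** 2 / 2.0)              # neighbourhood function
            units[k] = tuple(units[k][d] + lr * h * (x[d] - units[k][d])
                             for d in range(2))

# Read the tour off the ring: order cities by the ring index of their winner.
tour = sorted(range(len(cities)),
              key=lambda c: min(range(M), key=lambda k: math.dist(units[k],
                                                                  cities[c])))
print(tour)
```

Decaying the learning rate over time is the kind of refinement attributed to Fort later in the transcript; practical versions also shrink the neighbourhood width.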

Page 28:

SOM in the TSP context

Page 29:

Different SOMs based on that form: Fort increased the speed of convergence by reducing the neighborhood and reducing the modification to the weights of neighboring units over time. See also the work of Angeniol.

Page 30:

Questions?