Self-Organizing Map

An introduction to Self-Organizing Map (SOM). By: Zahra Sadeghi

Transcript of Self-Organizing Map

Page 1: Self-Organizing Map

An introduction to Self-Organizing Map (SOM)

By: Zahra Sadeghi

Page 2: Self-Organizing Map

Cerebral cortex

• Different sensory inputs are mapped onto corresponding areas of the cerebral cortex in an orderly fashion.

• Neurons dealing with closely related pieces of information are close together, so that they can interact via short synaptic connections.

Page 3: Self-Organizing Map

Cerebral Cortex

Page 4: Self-Organizing Map

Kohonen model

• Invented by Teuvo Kohonen.
• Humans simply cannot visualize high-dimensional data.
• A data visualization technique: it represents multidimensional data in much lower-dimensional spaces, typically one or two dimensions.
• Groups similar data items together.
  – Any topological relationships within the training set are maintained.
• SOMs both reduce dimensions and display similarities.

Page 5: Self-Organizing Map

SOM

• Is an unsupervised learning algorithm.
• Principal goals:
  – to transform an input into a one- or two-dimensional discrete map;
  – to perform this transformation adaptively in a topologically ordered fashion.

Page 6: Self-Organizing Map

Network Architecture

• Feedforward structure.
• Two layers.
• The output layer consists of neurons arranged in:
  1) rows and columns (two-dimensional), or
  2) a single column or row (one-dimensional).

Page 7: Self-Organizing Map

Network Architecture

• Each node has a specific topological position (an x, y coordinate in the lattice) and contains a vector of weights of the same dimension as the input vectors.
  – If the training data consists of vectors V of n dimensions (V1, V2, V3, ..., Vn),
  – then each node will contain a corresponding weight vector W of n dimensions (W1, W2, W3, ..., Wn).
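As a concrete sketch of this layout, the lattice and its weight vectors can be held in a single NumPy array. The 10×10 lattice size and 3-dimensional inputs below are illustrative values, not taken from the slides:

```python
import numpy as np

# Illustrative lattice: rows x cols nodes, each holding a weight vector
# of the same dimension n as the input vectors (here n = 3).
rows, cols, n = 10, 10, 3
rng = np.random.default_rng(0)

# Random initialization: every node's weights drawn uniformly from [0, 1).
weights = rng.random((rows, cols, n))

# Each node is addressed by its (x, y) lattice coordinate.
print(weights[2, 5])   # weight vector of the node at row 2, column 5
print(weights.shape)   # (10, 10, 3)
```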

Page 8: Self-Organizing Map

• Each neuron in the lattice is fully connected to all the source nodes in the input layer.
• There are no lateral connections between nodes within the lattice.

Page 9: Self-Organizing Map

Page 10: Self-Organizing Map

SOM algorithm

1. Each node's weights are initialized:
   – random initialization
   – sample initialization (weights drawn from the training data)
2. A vector is chosen at random from the set of training data and presented to the lattice.
3. Every node is examined to find the one whose weights are most like the input vector; this node is the Best Matching Unit (BMU).
4. The radius of the neighbourhood of the BMU is calculated. Any nodes found within this radius are deemed to be inside the BMU's neighbourhood.
5. Each neighbouring node's weights are adjusted to make them more like the input vector. The closer a node is to the BMU, the more its weights are altered.
6. Steps 2–5 are repeated for N iterations.
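The steps above can be sketched as a single training loop in Python with NumPy. All parameter values (lattice size, iteration count, initial learning rate) are illustrative assumptions, and the exponential decay schedules are one common choice rather than the only possibility:

```python
import numpy as np

def train_som(data, rows=8, cols=8, n_iters=500, lr0=0.5, seed=0):
    """Minimal SOM training loop following the six steps above; a sketch
    with assumed hyperparameters, not a reference implementation."""
    rng = np.random.default_rng(seed)
    n = data.shape[1]
    # Step 1: random weight initialization.
    weights = rng.random((rows, cols, n))
    # Lattice (x, y) coordinate of every node, for neighbourhood distances.
    coords = np.indices((rows, cols)).transpose(1, 2, 0).astype(float)
    radius0 = max(rows, cols) / 2.0            # initial neighbourhood radius
    time_const = n_iters / np.log(radius0)
    for t in range(n_iters):
        # Step 2: pick a random training vector.
        x = data[rng.integers(len(data))]
        # Step 3: find the BMU, the node closest to x in Euclidean distance.
        d = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(d), d.shape)
        # Step 4: shrink the neighbourhood radius over time.
        radius = radius0 * np.exp(-t / time_const)
        lr = lr0 * np.exp(-t / n_iters)        # learning rate also decays
        # Step 5: pull nodes inside the radius towards x; the influence
        # fades with lattice distance from the BMU (Gaussian shape).
        lattice_d2 = np.sum((coords - bmu) ** 2, axis=2)
        influence = np.exp(-lattice_d2 / (2 * radius ** 2))
        influence[lattice_d2 > radius ** 2] = 0.0
        weights += lr * influence[..., None] * (x - weights)
    # Step 6: the loop repeats for n_iters iterations.
    return weights
```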

Page 11: Self-Organizing Map

Best Matching Unit (BMU)

• For a given training instance, only a single neuron is activated, called the BMU.
  – The BMU is the neuron c whose weight vector has the highest similarity to the training instance x, i.e. the smallest Euclidean distance d: c = argmin_i ||x - w_i||
• For every training instance, the BMU and additional neurons in the neighbourhood of the BMU are adjusted by the Kohonen learning rule.
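A minimal sketch of the BMU search, assuming the weights are stored as a (rows, cols, n) NumPy array as in the earlier pages:

```python
import numpy as np

def best_matching_unit(weights, x):
    """weights: (rows, cols, n) lattice; x: (n,) input vector.
    Returns the (row, col) coordinate of the node with the smallest
    Euclidean distance to x."""
    d = np.linalg.norm(weights - x, axis=2)   # distance of every node to x
    return tuple(int(i) for i in np.unravel_index(np.argmin(d), d.shape))

weights = np.zeros((3, 3, 2))
weights[1, 2] = [0.9, 0.9]                    # one node placed near the input
print(best_matching_unit(weights, np.array([1.0, 1.0])))  # → (1, 2)
```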

Page 12: Self-Organizing Map

Neighbourhood

• The neighbourhood is centred around the BMU.
• The area of the neighbourhood shrinks over time. This is accomplished by making the radius of the neighbourhood shrink over time.
• The initial radius of the neighbourhood may be set to half the diameter of the SOM.
• Over time the neighbourhood shrinks to the size of just one node: the BMU.
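A sketch of the shrinking radius, assuming the commonly used exponential decay schedule; `map_size` and `n_iters` are illustrative values, not taken from the slides:

```python
import math

# Radius starts at half the SOM's diameter and decays exponentially,
# reaching 1 (just the BMU) at the final iteration.
map_size, n_iters = 20, 1000
radius0 = map_size / 2.0                    # half the lattice diameter
time_const = n_iters / math.log(radius0)

def radius(t):
    return radius0 * math.exp(-t / time_const)

print(round(radius(0), 2))        # 10.0  (half the diameter)
print(round(radius(n_iters), 2))  # 1.0   (just the BMU)
```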

Page 13: Self-Organizing Map

Adjusting the Weights

• Every node within the BMU's neighbourhood (including the BMU itself) has its weight vector adjusted towards the input vector: W(t+1) = W(t) + L(t) (V(t) - W(t))
• L: the learning rate, which decreases with time.

Page 14: Self-Organizing Map

Adjusting the Weights

• The effect of learning should depend on the distance a node is from the BMU, decreasing as that distance grows.
• At the edges of the BMU's neighbourhood, the learning process should have barely any effect at all.
• The amount of learning should fade over distance, similar to a Gaussian decay.
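A sketch of a single weight update combining the decaying learning rate with a Gaussian distance decay; the function name and all numeric values are illustrative:

```python
import numpy as np

def update_node(w, x, lattice_dist, lr, radius):
    """One step of W(t+1) = W(t) + Theta * L * (V - W), where the
    influence Theta fades with lattice distance from the BMU
    following a Gaussian shape."""
    theta = np.exp(-lattice_dist ** 2 / (2 * radius ** 2))
    return w + theta * lr * (x - w)

w = np.array([0.0, 0.0])
x = np.array([1.0, 1.0])
# The BMU itself (distance 0, theta = 1) moves the most ...
print(update_node(w, x, lattice_dist=0, lr=0.5, radius=3.0))  # [0.5 0.5]
# ... while a node far from the BMU barely moves at all.
print(update_node(w, x, lattice_dist=6, lr=0.5, radius=3.0))
```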