Unsupervised Learning 1
Transcript of Unsupervised Learning 1
8/3/2019, Klinkhachorn: CpE520
Unsupervised Learning
Unsupervised Learning
Introduction
Classification of ANN Paradigms: Unsupervised
Supervised Learning Paradigms
Training data consists of both the input and the desired output
Weights are adjusted according to the difference between desired output and actual output
The output for every input set is known before training starts
Guarantees that if you successfully train, you know what the network does!
Typical networks: Back Propagation, BAM
Unsupervised Learning Paradigms
Training data consists only of inputs to the network
Outputs for training sets are unknown until training is complete
Typical networks:
Kohonen
Counter Propagation
Adaptive Resonance Theory
Unsupervised Learning: Example
Simple case data:

x    y
.1   .2
.8   .9
.7   .7
.2   .1
.8   .8
.3   .1

[Scatter plot of the (x, y) points on axes from 0 to 1]
Unsupervised Learning: Example
Complex case
10-dimensional data set!
0,.1,.3,.6,.3,.7,.6,.4,.9,.2
1,.3,.9,.5,.7,.1,.3,.2,.8,.1
...
What structure is in here??
Might need Unsupervised Learning!
Unsupervised Learning Mechanisms
Competitive Learning
Neurons compete with each other to form output
Uses include pattern classification
Lateral Inhibition
Neurons inhibit the output of nearby neurons
Uses include edge enhancement
Neighborhood Adaptations
Neurons and those nearby are adapted
Uses include self-organizing maps
Semi-Supervised Paradigms
Descriptive term for networks that are not cleanly supervised or unsupervised (a catch-all term)
Many possible forms:
Pre-processing of data to form clusters
Input/output specified, but internally some unsupervised learning takes place (Counter-Propagation)
Post-processing of data
Unsupervised Neural Networks
Competitive Learning Network
[Diagram: an input layer receives the input pattern (1 1 0 1); a competitive layer produces the output classification, one unit per class A, B, C]
Competitive Learning Network
For output unit j with weights Wj1, Wj2, ..., Wji:

Output of unit j = 1 if Sj > Sk for all other units k
                 = 0, otherwise

where Sj = Σi Xi * Wji
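The winner-take-all output rule above can be sketched in a few lines of Python; the example weights and input below are made-up values for illustration:

```python
# Winner-take-all output: S_j = sum_i X_i * W_ji, and only the unit
# with the largest weighted sum outputs 1; all others output 0.
# (Ties are broken here in favor of the lowest-index unit.)
def competitive_output(x, W):
    """x: input vector; W: one weight row per output unit."""
    s = [sum(xi * wji for xi, wji in zip(x, w_row)) for w_row in W]
    winner = max(range(len(s)), key=lambda j: s[j])
    return [1 if j == winner else 0 for j in range(len(s))]

# Hypothetical network: two output units, two inputs
W = [[0.3, 0.7],   # unit A
     [0.6, 0.4]]   # unit B
print(competitive_output([0, 1], W))  # unit A wins: [1, 0]
```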
Competitive Learning Algorithm
Assign random weights to all neurons (normalized, Σi Wji = 1)
Calculate output of each neuron
Determine winner
Adjust weights of winner
Repeat for all pairs until convergence
Competitive Learning Algorithm: Adjust weights of Winner

ΔWji(t+1) = η * (Xi/m - Wji)

where
Wji(t+1) = new weight of neuron j from input i
η = learning coefficient (typically 0.1-0.3)
Xi = value of input i
m = magnitude of input squared (sum of all 1s)
Wji = old weight of neuron j from input i
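A minimal Python sketch of the whole procedure under the update rule above; the data set, epoch count, and random seed are illustrative choices, not from the slides:

```python
import random

def train_competitive(data, n_units, eta=0.3, epochs=50, seed=0):
    """Competitive learning sketch: the winning unit's weights move
    toward X_i / m; all other units are left unchanged."""
    rng = random.Random(seed)
    n_in = len(data[0])
    # Random initial weights, normalized so each unit's weights sum to 1
    W = []
    for _ in range(n_units):
        raw = [rng.random() for _ in range(n_in)]
        total = sum(raw)
        W.append([r / total for r in raw])
    for _ in range(epochs):
        for x in data:
            m = sum(x)  # for binary inputs: the number of 1s
            if m == 0:
                continue
            # Winner = unit with the largest weighted sum S_j
            s = [sum(xi * wji for xi, wji in zip(x, w)) for w in W]
            j = max(range(n_units), key=lambda k: s[k])
            # Delta W_ji = eta * (X_i/m - W_ji), applied to the winner only
            W[j] = [wji + eta * (xi / m - wji) for xi, wji in zip(x, W[j])]
    return W

# Illustrative run on the four binary patterns used later in the slides
data = [(1, 0, 1), (1, 0, 0), (0, 1, 0), (0, 1, 1)]
W = train_competitive(data, n_units=2)
```

Because each update is a convex step toward Xi/m, the normalization Σi Wji = 1 is preserved throughout training.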
Competitive Learning: Example
Two classes, A & B
Two inputs, X1 & X2
[Diagram: inputs X1 and X2 feed output units A and B through weights WA1, WA2, WB1, WB2]
Competitive Learning: Example (Cont)
Let WA1 = 0.3, WA2 = 0.7,
WB1 = 0.6, WB2 = 0.4, and
η = 0.3
If X1 = 0, and X2 = 1 then
A = 0.3*0 + 0.7*1 = 0.7
and
B = 0.6*0 + 0.4*1 = 0.4
Competitive Learning: Example (Cont)
A is the winner, so
WA1 = 0.3 + 0.3*(0/1 - 0.3) = 0.21
WA2 = 0.7 + 0.3*(1/1 - 0.7) = 0.79
Repeat for all input pairs until convergence
** Σi ΔWji = η * (Σi Xi/m - Σi Wji) = η * (1 - 1) = 0, since Σi Xi = m and Σi Wji = 1, so the weights stay normalized
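The hand computation above can be reproduced directly (same numbers as on the slide):

```python
eta = 0.3
x = [0, 1]          # input pattern
wA = [0.3, 0.7]     # winner A's weights before the update
m = sum(x)          # = 1

# Winner's update: W_ji += eta * (X_i/m - W_ji)
wA = [wji + eta * (xi / m - wji) for xi, wji in zip(x, wA)]
print([round(w, 2) for w in wA])  # [0.21, 0.79]
```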
Competitive Learning: Note
This is a simple network and is not guaranteed to converge if the input set does not separate into well-defined clusters, or if a sufficient number of output neurons is not provided!
The fewer the output neurons, the more general the classification; more output neurons yield more specialized classes.
Competitive Learning: A Simplified Pattern Classification Example
Training Set:
P1 = (101)
P2 = (100)
P3 = (010)
P4 = (011)
[Diagram: network configuration with the three inputs feeding two output units, A and B]
Competitive Learning: A Simplified Pattern Classification Example
P1 = (101)
P2 = (100)
P3 = (010)
P4 = (011)
[The two output units separate the patterns into two classes: {P1, P2} and {P3, P4}]

Hamming Distances:
     P1  P2  P3  P4
P1    0   1   3   2
P2    1   0   2   3
P3    3   2   0   1
P4    2   3   1   0
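The pairwise Hamming distances between the four patterns can be checked with a short script:

```python
# Pairwise Hamming distances between the four training patterns
patterns = {"P1": (1, 0, 1), "P2": (1, 0, 0),
            "P3": (0, 1, 0), "P4": (0, 1, 1)}

def hamming(a, b):
    """Number of positions where the two bit vectors differ."""
    return sum(ai != bi for ai, bi in zip(a, b))

for name, p in patterns.items():
    print(name, [hamming(p, q) for q in patterns.values()])
```

Note that P1 and P2 are only distance 1 apart, as are P3 and P4, which is why those pairs cluster together.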
A Simple Competitive Learning Example
Training Set
Training set consisted of all possible pairs of adjacent points
Resulting Classification by trained network
Classification of an irregular grid
Competitive Learning Notes
Useful as a simple pattern classifier
Can be used in more complex networks
Input pattern must be well organized
Unorganized input patterns can lead to unstable behavior
Competition and Inhibition Network
Max Network (as in the Hamming network)
Only one unit ends with an output of 1; the rest are 0
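One common way to realize such a max network is iterated mutual inhibition (a MAXNET-style sketch; the inhibition strength eps = 0.1 and the initial activations are illustrative values, not from the slides):

```python
def maxnet(a, eps=0.1, max_iters=1000):
    """Each unit inhibits all others; iterate until at most one
    unit remains active (or max_iters is reached)."""
    a = list(a)
    for _ in range(max_iters):
        total = sum(a)
        # a_j <- max(0, a_j - eps * (sum of the other activations))
        a = [max(0.0, aj - eps * (total - aj)) for aj in a]
        if sum(v > 0 for v in a) <= 1:
            break
    return a

out = maxnet([0.4, 0.7, 0.4])
print(out)  # only the unit with the largest initial activation stays nonzero
```

The winner's surviving activation is smaller than its starting value, but all competitors have been driven to exactly zero, giving the "one output of 1, rest 0" behavior after thresholding.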
Lateral Inhibition Network
Lateral Inhibition Notes
Inhibits nearby units only
Enhances peaks and edges of patterns
Requires tuning of inhibitory and excitatory parameters for desired results
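A one-dimensional sketch shows the edge-enhancement effect; the neighborhood (immediate neighbors only) and the inhibition strength are made-up tuning parameters, per the note above:

```python
def lateral_inhibit(signal, inhibition=0.4):
    """Each unit's output is its own input minus a fraction of its
    immediate neighbors' inputs (zero-padded at the ends)."""
    n = len(signal)
    out = []
    for i in range(n):
        left = signal[i - 1] if i > 0 else 0.0
        right = signal[i + 1] if i < n - 1 else 0.0
        out.append(signal[i] - inhibition * (left + right))
    return out

# A step edge: flat low region followed by flat high region
step = [1, 1, 1, 1, 5, 5, 5, 5]
print(lateral_inhibit(step))
# The interior of each flat region stays uniform, while the units
# flanking the edge under- and over-shoot, sharpening the edge.
```

Too much inhibition flattens or inverts the signal; too little leaves the edge unchanged, which is the parameter-tuning trade-off noted above.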