Chapter 5. Adaptive Resonance Theory (ART)
• ART1: for binary patterns; ART2: for continuous patterns
• Motivations: previous methods have the following problems:
1. Training is non-incremental:
– they train on a fixed set of samples,
– adding new samples often requires retraining the network on the enlarged training set until a new stable state is reached.
2. The number of class nodes is pre-determined and fixed.
– Under- and over-classification may result from training.
– There is no way to add a new class node (unless a free class node happens to be close to the new input).
– Any new input x has to be classified into one of the existing classes (causing one to win), no matter how far x is from the winner: there is no control over the degree of similarity.
• Ideas of the ART model:
– suppose the input samples have been appropriately classified into k clusters (say by competitive learning).
– each weight vector W_j is a representative (average) of all samples in cluster j.
– when a new input vector x arrives:
1. Find the winner j among all k cluster nodes.
2. Compare W_j with x:
if they are sufficiently similar (x resonates with class j), then update W_j based on x;
else, find/create a free class node and make x its first member.
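The idea above can be sketched as a minimal clustering loop. This is an illustrative sketch only: the winner metric, the similarity test, and the update rule here are simple placeholders, not the exact ART1 definitions given later in the chapter.

```python
import numpy as np

def art_step(x, weights, rho):
    """Present one binary input x to a list of cluster prototypes.

    x       : binary input vector (NumPy array of 0/1)
    weights : list of prototype vectors W_j, one per existing cluster
    rho     : similarity threshold (plays the role of the vigilance)
    Returns the index of the cluster x was assigned to.
    """
    # Rank clusters by how strongly they respond to x (winner first),
    # so a failed similarity test falls through to the next candidate.
    order = sorted(range(len(weights)),
                   key=lambda j: -float(weights[j] @ x))
    for j in order:
        # Fraction of x's 1-bits that the prototype also has.
        match = np.logical_and(x, weights[j]).sum() / max(x.sum(), 1)
        if match >= rho:
            # x resonates with class j: shrink W_j toward the overlap.
            weights[j] = np.logical_and(x, weights[j]).astype(int)
            return j
    # No existing class is similar enough: create a new one for x.
    weights.append(x.copy())
    return len(weights) - 1
```

Note how, unlike a plain winner-take-all scheme, a far-away winner is rejected and x founds its own cluster instead.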
• To achieve this, we need:
– a mechanism for testing and determining similarity;
– a control for finding/creating new class nodes;
– all operations implemented by units performing local computation.
ART1 Architecture
– F1(a): input units, which receive the external input s = (s_1, ..., s_n)
– F1(b): interface units with activation x = (x_1, ..., x_n); pair-wise connections between F1(a) and F1(b)
– F2: cluster units y = (y_1, ..., y_m); full connections between F1(b) and F2
– bottom-up weights b_ij (real-valued): from x_i in F1(b) to y_j in F2
– top-down weights t_ji (binary/bipolar, representing class j): from y_j in F2 to x_i in F1(b)
– control units: R (reset), G1 and G2 (gain controls)
• cluster units (F2): competitive; receive the input vector x through the weights b to determine the winner J.
• input units (F1(a)): hold the external input s.
• interface units (F1(b)):
– pass s on to x as the input vector for classification by F2
– compare x and t_J (the top-down projection from the winner y_J)
– controlled by gain control unit G1
• The three phases are sequenced by the control units G1, G2, and R:
– Input to F1(b): s_i, G1, and t_ji; input to F2: b_ij, G2, and R.
– Nodes in both F1(b) and F2 obey the 2/3 rule: a node outputs 1 if two of its three inputs are 1.
– G1 = 1 if s ≠ 0 and y = 0 (F1(b) is open to receive s); G1 = 0 otherwise (F1(b) is open to receive t_J).
– G2 = 1 if s ≠ 0, and 0 otherwise; G2 = 1 signals the start of a new classification for a new input.
– R = 0 if ||x||/||s|| ≥ ρ (ρ: the vigilance parameter), and R = 1 otherwise.
– R = 0: resonance occurs; update b_J and t_J.
– R = 1: x fails the similarity test; J is inhibited from further computation.
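Under these definitions, the three control signals reduce to simple predicates on the current vectors. A sketch, assuming binary NumPy arrays for s, x, and y, with ρ the vigilance parameter:

```python
import numpy as np

def G1(s, y):
    # Gain 1: opens F1(b) to the input s only while no F2 unit is active.
    return 1 if s.any() and not y.any() else 0

def G2(s):
    # Gain 2: a nonzero input signals the start of a new classification.
    return 1 if s.any() else 0

def R(x, s, rho):
    # Reset: fires (R = 1) when the match ratio ||x|| / ||s|| is below rho.
    return 0 if x.sum() / s.sum() >= rho else 1
```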
Working of ART1
• Initial state: nodes on F1(b) and F2 are set to zeros.
• Recognition phase: determine the winner cluster for input s.
– s ≠ 0 is applied to F1(a) and stays there (clamped).
– G1 = 1 (since s ≠ 0 and y = 0); G2 = 1 (since s ≠ 0).
– F1(b) is open to receive s, so x = s; R = 0 (since ||x||/||s|| = 1).
– F2 is open to receive x·b_j and determine the winner J.
– x is tentatively classified to cluster J: y_J = 1, y_k = 0 for all k ≠ J.
• Comparison phase:
– t_J is sent down from F2.
– Since y ≠ 0 now, G1 = 0; a new x with x_i = s_i · t_Ji appears on F1(b) (2/3 rule).
– If ||x||/||s|| ≥ ρ, a resonance occurs and the classification is accepted.
– If ||x||/||s|| < ρ, the classification is rejected: a reset signal is sent from R to F2, y_J is permanently disabled, all other y_k are set to zero, and the process goes back to the recognition phase to search for another possible match.
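The recognition and comparison phases can be sketched as two small functions. This is an illustrative NumPy version: `b` is the n×m bottom-up matrix, `t` the m×n top-down matrix, and `disabled` the set of cluster indices already reset for the current input.

```python
import numpy as np

def recognise(s, b, disabled):
    """Recognition phase: x = s; pick the strongest enabled F2 unit."""
    x = s.copy()
    scores = b.T @ x                      # net input x . b_j to each y_j
    for j in disabled:
        scores[j] = -np.inf               # reset units stay out of the race
    return x, int(np.argmax(scores))

def compare(s, t, J, rho):
    """Comparison phase: 2/3 rule gives x_i = s_i * t_Ji; test vigilance."""
    x = s * t[J]                          # new activation on F1(b)
    resonance = x.sum() / s.sum() >= rho  # R = 0 iff this holds
    return x, resonance
```

A caller loops: recognise, compare, and on failure add the winner to `disabled` and try again.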
• Weight update/adaptive phase
– Initial weights (no bias):
bottom-up: 0 < b_ij(0) < L/(L − 1 + n) (usually L = 2)
top-down: t_ji(0) = 1
– When a resonance occurs with y_J, update b_J and t_J:
t_Ji(new) = x_i = s_i · t_Ji (x: the comparison result on F1(b) between t_J and the current input s on F1(a))
b_iJ(new) = L · x_i / (L − 1 + ||x||)
– b_iJ(new) = 0 iff t_Ji(new) = 0 iff x_i = 0; b_J is a normalized x.
– If k sample patterns are clustered to node y_J, then t_J = s(1) · s(2) · ... · s(k) = the pattern whose 1's are common to all these k samples.
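The update rule above is mechanical enough to write down directly. A sketch, assuming x is the F1(b) activation s · t_J left after resonance, with b an n×m float matrix and t an m×n binary matrix:

```python
import numpy as np

def update_weights(b, t, J, x, L=2):
    # Top-down: t_J(new) = x, i.e. the AND of the input with the old t_J
    # (x already equals s * t_J from the comparison phase).
    t[J] = x
    # Bottom-up: b_J becomes a normalised x: b_iJ = L * x_i / (L - 1 + ||x||).
    b[:, J] = L * x / (L - 1 + x.sum())
    return b, t
```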
– The winner may shift during training. Example with L = 2, b_ij(0) = 0.2, n = 4: presenting s(1) = (0 1 1 1), s(2) = (0 0 1 1), s(3) = (0 0 0 1) to the same cluster shrinks its t from (0 1 1 1) to (0 0 1 1) to (0 0 0 1), so an input that this node originally won may later resonate with a different node.
– What to do when x fails to be classified into any existing cluster?
– report a failure / treat x as an outlier, or
– add a new cluster node y_{m+1} with initial weights b_{i,m+1} = L/(L − 1 + n) and t_{m+1,i} = 1.
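Adding a fresh node amounts to appending one bottom-up column and one top-down row at the initial values above. A sketch under the same matrix conventions (b is n×m, t is m×n):

```python
import numpy as np

def add_cluster(b, t, L=2):
    n = b.shape[0]
    # New bottom-up column at the uncommitted value L / (L - 1 + n).
    b = np.hstack([b, np.full((n, 1), L / (L - 1 + n))])
    # New top-down row of all ones, so the new node can match any input.
    t = np.vstack([t, np.ones((1, n), dtype=int)])
    return b, t
```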
Notes
1. Classification is a search process.
2. No two classes have the same b and t.
3. Different orderings of the sample input presentations may result in different classifications.
4. Increasing the vigilance parameter ρ increases the number of classes learned and decreases the average class size.
5. Classification may shift during search, but will reach stability eventually.
6. ART2 is the same in spirit but different in details.