
IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 8, NO. 2, MARCH 1997 267

Stability and Statistical Properties of Second-Order Bidirectional Associative Memory

Chi-Sing Leung, Lai-Wan Chan, Member, IEEE, and Edmund Man Kit Lai

Abstract— In this paper, a bidirectional associative memory (BAM) model with second-order connections, namely the second-order bidirectional associative memory (SOBAM), is first reviewed. The stability and statistical properties of the SOBAM are then examined. We use an example to illustrate that the stability of the SOBAM is not guaranteed. Because of this result, we cannot use the conventional energy approach to estimate its memory capacity. Thus, we develop the statistical dynamics of the SOBAM. Given that a small number of errors appear in the initial input, the dynamics shows how the number of errors varies during recall. We use the dynamics to estimate the memory capacity, the attraction basin, and the number of errors in the retrieved items. Extension of the results to higher-order bidirectional associative memories is also discussed.

Index Terms—Associative memory, BAM, neural network, stability.

I. INTRODUCTION

ASSOCIATIVE memories [1], [2] have been intensively studied in the past decade. An important feature of associative memories is the ability to recall the stored items from partial or noisy inputs. One form of associative memories is the bivalent additive bidirectional associative memory (BAM) [3]. It is a two-layer heteroassociator that stores a prescribed set of vector pairs. We will refer to these pairs as pattern pairs. A BAM network is very similar to a Hopfield network but has two layers of neurons, in which layer $F_X$ has $n$ neurons and layer $F_Y$ has $m$ neurons. The recall process of the BAM is an iterative one starting with a stimulus pair in $F_X$ and $F_Y$. After a number of iterations, the patterns in $F_X$ and $F_Y$ converge to a fixed point, which is desired to be one of the pattern pairs.

The BAM has three important features [3]. First, it performs both heteroassociative and autoassociative data recalls: the final state in $F_X$ represents the autoassociative recall, while the final state in $F_Y$ represents the heteroassociative recall. Second, the initial input can be presented in either one of the two layers. Last, the BAM is stable during recall.

To encode the pattern pairs, Kosko used the outer-product rule [3]. However, with the outer-product rule the memory capacity is very small if the pattern pairs are not orthogonal. Several modifications have been proposed to improve the memory capacity. These modifications fall into two categories:

Manuscript received September 19, 1994; revised May 31, 1995, May 19, 1996, and October 18, 1996.

C.-S. Leung and L.-W. Chan are with the Department of Computer Science and Engineering, The Chinese University of Hong Kong, Shatin, N.T., Hong Kong.

E. M. K. Lai is with the Department of Computer and Communication Engineering, Edith Cowan University, Joondalup Campus, Perth, Western Australia.

Publisher Item Identifier S 1045-9227(97)01748-7.

1) modifying the encoding methods [4]–[6] and 2) introducing second-order connections to form the second-order bidirectional associative memory (SOBAM) [7]–[9]. The memory capacity of the SOBAM has been empirically studied [7], but the theoretical memory capacity has not yet been derived. The SOBAM has also been proven to be stable during recall [7].

This paper describes the stability and statistical properties of the SOBAM. Contrary to Simpson's works [7], we demonstrate that the stability of the SOBAM is not guaranteed during recall. We also point out a mistake in [7]. This mistake has led to the wrong conclusion that the stability of the SOBAM is guaranteed. Hence, we cannot use the energy approach [10] to estimate the statistical properties of the SOBAM, especially the memory capacity and the attraction basin.

In this paper, we are interested in knowing whether each pattern pair can attract all the initial inputs within a certain distance from it. If so, we can obtain the attraction basin. Another important performance index is the memory capacity, i.e., the maximum number of pattern pairs that can be stored in the SOBAM as attractors. Also of interest is the number of errors in the retrieved pairs. The question now is: given errors in the initial input (an arbitrary error pattern with a given number of errors in the initial input), how does the number of errors vary during recall? To answer this question, we develop the statistical dynamics of the SOBAM. From this dynamics, the number of errors in the retrieved pairs, the attraction basin, and the memory capacity can be estimated.

Section II reviews the SOBAM and discusses its stability. The statistical dynamics of the SOBAM is developed in Section III, using the theory of large deviations [10]. Section IV discusses the way to estimate the memory capacity, the attraction basin, and the number of errors in the retrieved items. Numerical examples are given in Section V. Section VI shows how the results can be generalized to higher-order bidirectional associative memories (HOBAM's).

II. SOBAM AND STABILITY

There are $p$ pattern pairs $\{(X_h, Y_h) : h = 1, \ldots, p\}$, where $X_h = (x_{h,1}, \ldots, x_{h,n})^T$ and $Y_h = (y_{h,1}, \ldots, y_{h,m})^T$. The components of $X_h$ and $Y_h$ are bipolar ($+1$ or $-1$). The SOBAM encodes the pattern pairs in two matrices. The first matrix, $W$, is an $m \times n \times n$ lattice that holds the second-order connections from $F_X$ to $F_Y$. The second matrix, $V$, is an $n \times m \times m$ lattice that holds the second-order connections from $F_Y$ to $F_X$. The matrix $W$ is given by

$$w_{ijk} = \sum_{h=1}^{p} y_{h,i}\, x_{h,j}\, x_{h,k}, \quad i = 1, \ldots, m \text{ and } j, k = 1, \ldots, n. \tag{1}$$

1045–9227/97$10.00 © 1997 IEEE


The matrix $V$ is given by

$$v_{ijk} = \sum_{h=1}^{p} x_{h,i}\, y_{h,j}\, y_{h,k}, \quad i = 1, \ldots, n \text{ and } j, k = 1, \ldots, m. \tag{2}$$

The state of $F_X$ at time $t$ is denoted as $X(t)$. The state of $F_Y$ at time $t$ is denoted as $Y(t)$. The recall process is

$$y_i(t+1) = \operatorname{sgn}\left( \sum_{j=1}^{n} \sum_{k=1}^{n} w_{ijk}\, x_j(t)\, x_k(t) \right) \tag{3}$$

for $i = 1, \ldots, m$, where

$$\operatorname{sgn}(s) = \begin{cases} +1, & s > 0 \\ \text{state unchanged}, & s = 0 \\ -1, & s < 0. \end{cases}$$

Similarly,

$$x_i(t+2) = \operatorname{sgn}\left( \sum_{j=1}^{m} \sum_{k=1}^{m} v_{ijk}\, y_j(t+1)\, y_k(t+1) \right) \tag{4}$$

for $i = 1, \ldots, n$. Equations (3) and (4) imply that the initial input $X(0)$ recalls $Y(1)$; $Y(1)$ recalls $X(2)$; and so on.
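The encoding and recall rules above can be sketched in code. This is a minimal illustration, assuming the outer-product form of the second-order connections commonly used for the SOBAM; the function names and the tiny dimensions are illustrative, not the authors' notation.

```python
import numpy as np

def encode(X, Y):
    """Outer-product second-order encoding (assumed form of (1) and (2)).
    X: p x n bipolar patterns, Y: p x m bipolar patterns."""
    W = np.einsum('hi,hj,hk->ijk', Y, X, X)   # m x n x n lattice, F_X -> F_Y
    V = np.einsum('hi,hj,hk->ijk', X, Y, Y)   # n x m x m lattice, F_Y -> F_X
    return W, V

def sgn(s, prev):
    """Sign rule of (3)-(4): the state is unchanged when the field is zero."""
    return np.where(s > 0, 1, np.where(s < 0, -1, prev))

def recall(W, V, x0, y0, steps=5):
    """Iterate the two half-steps of the recall process."""
    x, y = x0.copy(), y0.copy()
    for _ in range(steps):
        y = sgn(np.einsum('ijk,j,k->i', W, x, x), y)   # eq. (3)
        x = sgn(np.einsum('ijk,j,k->i', V, y, y), x)   # eq. (4)
    return x, y
```

With a single stored pair, for example, an input one Hamming-unit away from $X_1$ still recalls the pair exactly, since the field on each $F_Y$ neuron reduces to $y_{1,i}(X_1 \cdot x)^2$.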

The SOBAM is a finite-state autonomous system whose state converges to either a stable state or a limit cycle. Unlike the original BAM, the stability of the SOBAM is not guaranteed during recall. To illustrate this, we use a SOBAM network to store the following pattern pairs:

With the initial state $X(0)$ and $Y(0)$, the following state sequence can be obtained. Clearly, the network converges to a limit cycle. Thus, the stability of the SOBAM is not guaranteed.

Simpson [7] used an energy function to explain the stability of the SOBAM. The energy function is expressed as

(5)

where $X$ and $Y$ are the current states, the first term represents the energy due to the connections from $F_X$ to $F_Y$, and the second term represents the energy due to the connections from $F_Y$ to $F_X$. According to the recall process, either $F_X$ or $F_Y$ is updated first. If $F_Y$ is updated first, the change in energy is

(6)

where $Y'$ is the new state in $F_Y$. Conversely, if $F_X$ is updated first, the change in energy is

(7)

where $X'$ is the new state in $F_X$.

Simpson showed that the values of the changes in (6) and (7) are either negative or zero. He then claimed that the SOBAM is always stable [7]. However, from our previous counterexample, it can be seen that the stability is not guaranteed. This discrepancy is due to the omission of some terms on the right-hand side of (6) and (7). Actually, if $F_Y$ is updated first, the total change in energy is

(8)



In this equation, the first term on the right-hand side is the change in energy due to a change of the $F_Y$ state. The other two terms, which also arise from the change of the $F_Y$ state, can be either negative or positive. Hence, we cannot draw any conclusion regarding the stability based on the energy function proposed by Simpson [7].

The above discussion is valid for the layer-synchronous recall process, in which all the neurons in a layer are updated simultaneously. Since the layer-synchronous recall process is a special case of asynchronous recall processes, whereby the neurons in a layer are updated sequentially, the stability of the SOBAM is not guaranteed under both layer-synchronous and asynchronous recall processes.
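Since the SOBAM is a finite-state deterministic system, every recall orbit must terminate in either a fixed point or a limit cycle, and this dichotomy can be checked mechanically. A small sketch (the transition function below is a toy stand-in for one full recall round, not the counterexample network):

```python
def classify_orbit(step, state0, max_iters=1000):
    """Iterate a deterministic map and report whether the orbit reaches a
    fixed point or enters a limit cycle (one of the two must happen when
    the state space is finite)."""
    seen = {}
    state = state0
    for t in range(max_iters):
        if state in seen:
            period = t - seen[state]
            return "fixed point" if period == 1 else f"limit cycle of length {period}"
        seen[state] = t
        state = step(state)
    return "undetermined"

# A sign-flipping map oscillates, mimicking the two-state limit cycle
# exhibited by the counterexample network.
result = classify_orbit(lambda s: -s, 1)
```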

III. STATISTICAL DYNAMICS

A. Notations and Outline

This section outlines how the statistical dynamics is derived. We first define some terminology and state the assumptions used in the rest of the paper.

• $m = rn$, where $r$ is a positive constant.
• $p = \alpha n^2$, where $\alpha$ is a positive constant.
• The dimensions, $n$ and $m$, are large. This assumption is often used [10]–[15].
• For analytical purposes, we assume that each component of the pattern pairs is a $\pm 1$ equiprobable independent random variable. Though this assumption is seldom satisfied by real-life data, it is difficult to analyze associative memories without making such an assumption. In fact, this assumption has been widely used [10]–[15].

• The Hamming distance between two bipolar vectors, $U$ and $U'$, is denoted as $H(U, U')$.

• Attraction Basin: It is required that each pattern pair is stored as a stable state (or at least that there is a stable state at a small Hamming distance from it). Otherwise, the pattern pairs cannot be recalled. Besides, we expect a SOBAM network to have the following error correction property. If the network is started at a state $X(0)$ where $H(X(0), X_h) \le \rho n$, the state should reach a stable state within a small distance of the stored pattern $X_h$ after a sequence of state transitions (the $F_Y$ state should also reach a stable state within a small distance of the stored pattern $Y_h$). We are interested in knowing whether each pattern pair is able to attract all the initial inputs within a distance of $\rho n$ for some positive constant $\rho$. The maximum value of such $\rho$ denotes the attraction basin of each pattern pair. Also of interest is the number of errors in the retrieved items. This number measures the quality of the retrieved items. Since we are considering "all possible initial inputs within a certain distance," the above definition of the attraction basin is for worst case errors. In the rest of the paper, the term "attraction basin" refers to the attraction basin for worst case errors. Instead of estimating the attraction basin directly, we will estimate the number of errors after each state transition.

• Given the number of errors in $F_X$ in the present state, we consider the probability that, for each pattern pair and for any error pattern with that number of errors in $F_X$, the number of errors in $F_Y$ in the next state is less than a given bound. It should be noticed that the phrase "for any error pattern with the given number of errors in $F_X$" in this definition reflects the concept of worst case errors.

• Similarly, given the number of errors in $F_Y$ in the present state, we consider the probability that, for each pattern pair and for any error pattern with that number of errors in $F_Y$, the number of errors in $F_X$ in the next state is less than a given bound.

The error rate in $F_X$ = (the number of errors in $F_X$) / $n$.
The error rate in $F_Y$ = (the number of errors in $F_Y$) / $m$.

• To estimate the value of the probability defined above, we first introduce the event $E_{h,g}$. It is the event that the number of errors in the next state is within the stated bound for a given pattern pair $h$ and for a given present state, where the present state is such that (9)

The index $g$ refers to a particular error pattern. For a given number of errors $\rho n$, the number of error patterns is $\binom{n}{\rho n}$. Thus, the range of $g$ is from one to $\binom{n}{\rho n}$. Also, $\bar{E}_{h,g}$ is the complement event of $E_{h,g}$. It is the event that the number of errors in the next state exceeds the bound for a given pattern pair and for a given present state.

• In the above, each event $E_{h,g}$ only refers to one error pattern and one pattern pair. To consider each pattern pair and all the possible error patterns, we need to introduce the event $E$, which is the intersection of all possible $E_{h,g}$'s:

$$E = \bigcap_{h} \bigcap_{g} E_{h,g}.$$

It is the event that the bound on the number of errors holds for each pattern pair and for any error pattern. Also, $\bar{E}$ is defined as the complement event of $E$.


From the definitions of $E$ and $E_{h,g}$,

$$\operatorname{Prob}(E) = 1 - \operatorname{Prob}(\bar{E}) \ge 1 - \sum_{h=1}^{p} \sum_{g=1}^{\binom{n}{\rho n}} \operatorname{Prob}(\bar{E}_{h,g}). \tag{10}$$

In Part B, we will first estimate the values of the two probabilities (Lemmas 4 and 5). From the two lemmas, an upper bound on the error rate in the next state is obtained (Corollaries 2 and 4). Based on Corollary 2, given that $\epsilon_x(t)$ is the upper bound on the error rate in $F_X$ at time $t$, we can derive an upper bound $\epsilon_y(t+1)$ on the error rate in $F_Y$ at time $t+1$. Similarly, from Corollary 4, we can estimate $\epsilon_x(t+2)$ from $\epsilon_y(t+1)$. As a result, two sequences $\{\epsilon_x(t)\}$ and $\{\epsilon_y(t)\}$ are constructed to represent the statistical dynamics of the SOBAM. In Section IV, we will discuss how to use the features of these two sequences to estimate the memory capacity, the number of errors in the retrieved pairs, and the attraction basin.
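The interleaved construction of the two bound sequences can be sketched as a simple fixed-point iteration. The update maps below stand in for the bounds obtained in Corollaries 2 and 4; the halving map in the example is purely illustrative.

```python
def iterate_dynamics(eps_x0, f_y, f_x, max_steps=50, tol=1e-9):
    """Build the sequences of error-rate upper bounds.
    f_y maps the F_X error-rate bound to the next F_Y bound;
    f_x maps the F_Y bound back to the next F_X bound."""
    xs, ys = [eps_x0], []
    eps_x = eps_x0
    for _ in range(max_steps):
        eps_y = f_y(eps_x)        # one F_X -> F_Y half-step
        eps_x_new = f_x(eps_y)    # the F_Y -> F_X half-step
        ys.append(eps_y)
        xs.append(eps_x_new)
        if abs(eps_x_new - eps_x) < tol:
            break                 # the sequences have settled
        eps_x = eps_x_new
    return xs, ys

# Illustrative contraction: each half-step halves the error rate,
# so both sequences converge toward zero and recall succeeds.
halve = lambda e: 0.5 * e
xs, ys = iterate_dynamics(0.005, halve, halve)
```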

B. Construction of the Dynamics

To estimate these probabilities, we make use of Stirling's formula and the theory of large deviations [10]. Here, we restate them as the following two lemmas.

Lemma 1—Stirling's Asymptotic Formula for Factorials: Let $n$ be a large integer and $0 < \beta < 1$. Then

$$\binom{n}{\beta n} \approx \frac{e^{n \mathcal{H}(\beta)}}{\sqrt{2 \pi n \beta (1 - \beta)}}$$

where

$$\mathcal{H}(\beta) = -\beta \ln \beta - (1 - \beta) \ln (1 - \beta).$$
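Stirling-type approximations of binomial coefficients, of the form $\binom{n}{\beta n} \approx e^{n\mathcal{H}(\beta)}/\sqrt{2\pi n \beta(1-\beta)}$ with $\mathcal{H}$ the natural-log entropy, are easy to sanity-check numerically; the dimension and rate below are arbitrary choices.

```python
import math

def entropy(b):
    # Natural-log binary entropy: H(b) = -b ln b - (1 - b) ln(1 - b)
    return -b * math.log(b) - (1 - b) * math.log(1 - b)

def stirling_binom(n, b):
    # Stirling-type approximation of C(n, b*n)
    return math.exp(n * entropy(b)) / math.sqrt(2 * math.pi * n * b * (1 - b))

n, b = 1000, 0.1
exact = math.comb(n, int(b * n))
rel_err = abs(stirling_binom(n, b) - exact) / exact
# For n = 1000 the relative error is a small fraction of a percent.
```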

Lemma 2—Newman's Lemma: Suppose that, for each $n$, the random variables concerned are independent, identically distributed, and symmetric, satisfying the following conditions.

1)

Var (11)

2) For some real and ,

(12)

where $\mathrm{E}[\cdot]$ is the expectation operator.

For any and

(13)

a sufficient condition for

Prob

(14)

as $n \to \infty$, is

(15)

From Lemmas 1 and 2, we first estimate a bound on $\operatorname{Prob}(\bar{E}_{h,g})$.

Lemma 3:

Prob

provided that

for and .

Proof of Lemma 3: Without loss of generality, we assume that all the components of the pattern pair are positive: $X_h = (1, \ldots, 1)^T$ and $Y_h = (1, \ldots, 1)^T$. Let $A$ be the set of indexes at which $X(t)$ and $X_h$ differ. For a given $g$, there is only one such $A$, and $|A| = \rho n$. Let $B$ be a set of indexes of $F_Y$ at which the next state is in error. Note that there are $\binom{m}{\epsilon m}$ such sets. Event $\bar{E}_{h,g}$ implies that there is at least one such $B$ such that

Hence

$\operatorname{Prob}(\bar{E}_{h,g})$
$= \operatorname{Prob}(\text{there is at least one } B \text{, where } |B| = \epsilon m \text{, such that every neuron in } B \text{ is in error})$
$\le \binom{m}{\epsilon m} \operatorname{Prob}(\text{every neuron in a given } B \text{ is in error}).$ (16)

Let

$q = \operatorname{Prob}(\text{every neuron in a given } B \text{ is in error}).$ (17)

From (1), we have

Prob

(18)


One can easily find that

and

Hence

Prob (19)

where

(20)

and

(21)

As and

and

By substituting Lemma 2 into (19), we obtain

(22)

provided that

for some real $\lambda$. From the Appendix, $\lambda$ must be less than a certain value. To obtain a valid value of $\lambda$, we have

The procedures for checking whether the random variable satisfies the two conditions (11) and (12) are provided in the Appendix. By substituting (22) into (16), we obtain

Prob (23)

From Lemma 1, we have

Prob

Recall that

Prob (24)

From Lemma 3

(25)

With Lemma 1, we can immediately obtain the bound as Lemma 4 below.

Lemma 4:

(26)

provided that

(27)

Let $\epsilon_0$ be the minimum value of $\epsilon$ such that the right-hand side of (26) tends to one (as $n \to \infty$). From Lemma 4, $\epsilon_0$ is the minimum value of $\epsilon$ such that

(28)

Let $y_0$ be the intersection of the line

(29)

and the curve

(30)

Then

$$\epsilon_0 = y_0 + \delta$$

where $\delta$ is an arbitrarily small positive constant. Apparently, for a given $\rho$, $y_0$ can be solved numerically (see Fig. 1). Hence, for each pattern pair and for any error pattern with $\rho n$ errors in $F_X$, the probability that the number of errors in $F_Y$ is less than $(y_0 + \delta) m$ tends to one (as $n \to \infty$). As $\delta$ can be any small positive constant, one can restate the above statement as: the probability that the number of errors in $F_Y$ is less than or equal to $y_0 m$ tends to one (as $n \to \infty$). Furthermore, according to the features of the line and the curve, the following corollary can be obtained.


Fig. 1. Graphical implication of solving for $y_0$.

Corollary 1: If $\rho' < \rho$, then $y_0' < y_0$.

Proof of Corollary 1: From (29) and (30) (see Fig. 1), if the value of $\rho$ is reduced, the line shifts up and its slope increases. Hence, the intersection shifts toward the left and a smaller $y_0$ is obtained.

The above corollary implies that a smaller $y_0$ is obtained if the value of $\rho$ is reduced. Thus, given $\rho$, for each pattern pair ($h = 1, \ldots, p$) and for any $X(t)$ such that $H(X(t), X_h) \le \rho n$ (the error rate in $F_X$ in the present state is less than or equal to $\rho$), the probability that the error rate in $F_Y$ in the next state is less than or equal to $y_0$ tends to one (as $n \to \infty$). We capture the above statement as Corollary 2.

Corollary 2: Given that $y_0$ is the intersection of the line and the curve [see (29) and (30)], for each pattern pair ($h = 1, \ldots, p$) and for any $X(t)$ such that $H(X(t), X_h) \le \rho n$, the probability that $H(Y(t+1), Y_h) \le y_0 m$ tends to one (as $n \to \infty$).

It follows from Corollary 2 that, if the error rate in $F_X$ in the present state is less than or equal to $\rho$, the error rate in $F_Y$ in the next state is less than or equal to $y_0$. Thus, $y_0$ defines an upper bound on the error rate in $F_Y$ in the next state. We denote this upper bound as $\epsilon_y(t+1)$.

Similarly, we can easily obtain the corresponding lower bound as Lemma 5.

Lemma 5:

(31)

provided that

(32)

From Lemma 5, we can estimate the error rate in $F_X$ in the next state (given the error rate in $F_Y$ in the present state) by considering the intersection of the line

(33)

and the curve

(34)

Hence, Corollaries 3 and 4 are obtained.

Corollary 3: If $\rho' < \rho$, then $x_0' < x_0$.

Corollary 4: Given that $x_0$ is the intersection of the line and the curve [see (33) and (34)], for each pattern pair ($h = 1, \ldots, p$) and for any $Y(t+1)$ such that $H(Y(t+1), Y_h) \le \rho m$, the probability that $H(X(t+2), X_h) \le x_0 n$ tends to one (as $n \to \infty$).

Corollary 4 shows that if the error rate in $F_Y$ in the present state is less than or equal to $\rho$, the error rate in $F_X$ in the next state is less than or equal to $x_0$. Thus, $x_0$ defines an upper bound on the error rate in $F_X$ in the next state. We denote this upper bound as $\epsilon_x(t+2)$.

By solving the intersection $y_0$ (i.e., the bound $\epsilon_y$) and the intersection $x_0$ (i.e., the bound $\epsilon_x$) iteratively, we can construct two sequences $\{\epsilon_x(t)\}$ and $\{\epsilon_y(t)\}$. Fig. 2 shows an example of them. These sequences form the statistical dynamics of the upper bounds on the error rates. Given arbitrary errors in the initial input, if the sequences converge to $\epsilon_x^*$ and $\epsilon_y^*$, respectively (where both values are small), a noisy version of the desired pattern pair can be recalled. Besides, in the retrieved item, the number of errors in $F_X$ and the number of errors in $F_Y$ are less than $\epsilon_x^* n$ and $\epsilon_y^* m$, respectively. If a few errors are allowed in the retrieved pairs, we can use the above dynamics to estimate the memory capacity, the attraction basin, and the number of errors in the retrieved pairs.

IV. ESTIMATION OF STATISTICAL PROPERTIES

A. Memory Capacity

Clearly, if the sequences $\{\epsilon_x(t)\}$ and $\{\epsilon_y(t)\}$, with some initial error rate, converge to two small numbers individually, each pattern pair can attract all initial inputs within a certain distance. In other words, each pattern pair or its noisy versions can be stored as a stable state. Therefore, we can use the following method to estimate the memory capacity.

For a given $r$, let $p^*$ be the maximum value of $p$ such that the sequences $\{\epsilon_x(t)\}$ and $\{\epsilon_y(t)\}$, with some initial error rate, converge to $\epsilon_x^*$ and $\epsilon_y^*$, respectively (where both values are small). Thus, $p^*$ can be considered as a lower bound on the memory capacity.


Fig. 2. The statistical dynamics of the SOBAM where $\alpha = 0.01$ and $r = 1$. The initial conditions are $\epsilon_x(0) = 0$ and $0.005$. (a) The dynamics of $\epsilon_x(t)$ and (b) the dynamics of $\epsilon_y(t)$. Since all sequences converge, the attraction basin at least equals $0.005n$.

B. Number of Errors in the Retrieved Items

As the sequences $\{\epsilon_x(t)\}$ and $\{\epsilon_y(t)\}$ reflect the upper bounds on the number of errors during recall, the final values of the sequences reflect the upper bounds on the number of errors in the retrieved pairs.

C. Attraction Basin

Given arbitrary errors in the initial input, if the sequences converge to $\epsilon_x^*$ and $\epsilon_y^*$, where both values are small, the desired pattern pair can be recalled with no more than $\epsilon_x^* n$ and $\epsilon_y^* m$ errors in $F_X$ and $F_Y$, respectively. Hence, the maximum initial error rate with which the sequences converge corresponds to the lower bound on the attraction basin. We denote this maximum value as $\rho_{\max}$. The phrases "for any $X(t)$ such that $H(X(t), X_h) \le \rho n$" in Corollary 2 and "for any $Y(t+1)$ such that $H(Y(t+1), Y_h) \le \rho m$" in Corollary 4 lead directly to the fact that the estimated attraction basin refers to worst case errors. In this case, we need to find out if the stored item can attract "every" initial input within a certain distance.

Another error correction index is the attraction basin for random errors. In this case, we need to find out if a stored item can attract "an" initial input within a certain distance. The terms "an" and "every" mark the difference between worst case errors and random errors. It should be noticed that using simulations to study the attraction basin for worst case errors is impractical. The reason is that the number of error patterns, $\binom{n}{\rho n}$, is very large. Simulations can only reflect the properties of models in the presence of random errors.
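The impracticality is easy to quantify: even a tiny worst case error rate yields an astronomical number of error patterns. The dimension and error count below are illustrative choices, not values from the paper.

```python
import math

n = 1000       # an illustrative F_X dimension
errors = 5     # an error rate of only 0.5 percent
patterns = math.comb(n, errors)
# patterns exceeds 8 * 10**12, so exhaustively simulating every
# worst case error pattern is hopeless; simulations can only
# sample random errors.
```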

In the case of random errors, if the number of pattern pairs is less than

(35)

an initial input (in $F_X$) within a certain distance from the stored pattern will be attracted to the desired pattern pair in two recall steps with high probability. The proof of the above behavior is based on the theory of large deviations (see [19, Lemma A.7]). As this paper is mainly concerned with worst case errors, the behavior of the models in the presence of random errors will not be discussed further.

D. Approximations of $\epsilon_x^*$ and $\epsilon_y^*$

It is not difficult to see the following relationship between $\epsilon_x^*$ and $\epsilon_y^*$.

Corollary 5: If the sequences $\{\epsilon_x(t)\}$ and $\{\epsilon_y(t)\}$ converge to small $\epsilon_x^*$ and small $\epsilon_y^*$, respectively, then

(36)

Proof of Corollary 5: From (29), (30), (33), and (34), we have

Thus

As the values of $\epsilon_x^*$ and $\epsilon_y^*$ are small

Similarly, we can easily obtain Corollary 6, which can be used for the direct estimation of $\epsilon_x^*$ and $\epsilon_y^*$.

Corollary 6: If the values of $\epsilon_x^*$ and $\epsilon_y^*$ are small, we have

and (37)

(38)


TABLE I
THE LOWER BOUND ON THE MEMORY CAPACITY OF THE SOBAM

Proof of Corollary 6: From (28), we have

Using the first-order approximation for small positive values, the corollary follows.

Remark: Since we are concerned with the conditions under which the lower bounds on the probabilities tend to one (as $n \to \infty$), we only obtain bounds on the three statistical properties: the memory capacity, the number of errors in the retrieved pairs, and the attraction basin. The actual values of these three statistical properties are better than the estimated bounds. It should also be noticed that during the construction of the sequences $\{\epsilon_x(t)\}$ and $\{\epsilon_y(t)\}$, we should check whether both conditions (27) and (32) are satisfied.

V. NUMERICAL RESULTS

Numerical Example a: Based on the theoretical work presented in the previous section, we estimate the lower bound on the memory capacity. The estimated results are summarized in Table I. The lower bound increases with $r$ up to a certain value of $r$, after which it starts to decrease. Also, the estimated bound is unchanged when $r$ is replaced by $1/r$.

This symmetrical property means that inverting the ratio of the dimensions (interchanging $n$ and $m$) does not affect the overall estimated lower bound.

One should be aware that when the values of $n$ and $m$ are small, the bound may become meaningless. For small dimensions, for example, the lower bound can be about 1.28, which is less than the dimension itself. However, for large $n$ and $m$ the result is different: the lower bound is then much greater than the dimension. For image processing problems [17], the dimensions are usually large.

Fig. 3. The lower bound on the attraction basin where $r = 1, 2, 5, 10$.

Numerical Example b: Fig. 3 summarizes the lower bounds on the attraction basin at $r = 1, 2, 5, 10$. When the value of $\alpha$ is small, the lower bound first increases as $\alpha$ decreases. However, when the value of $\alpha$ is too small, the lower bound starts to decrease as $\alpha$ further decreases. This unnatural trend is due to the constraints (27) and (32), which limit the searching range of the intersections. However, it is rational to accept the claim that for a smaller $\alpha$, a larger attraction basin is obtained. We take the maximum point in the figure as the lower bound on the attraction basin for small values of $\alpha$. Table II summarizes the above claim. From Fig. 3 and Table II, a meaningful attraction basin requires a large dimension. The estimated attraction basin is quite small. This is not surprising because the estimated lower bounds refer to worst case errors.

Figs. 4 and 5 show the behavior of $\epsilon_x^*$ and $\epsilon_y^*$. From the figures, the upper bound on the number of errors in the retrieved pairs ($\epsilon_x^* n$ or $\epsilon_y^* m$) decreases exponentially as $\alpha$ decreases (this feature matches Corollary 6). Also, the value of $\epsilon_x^*$ is approximately equal to that of $\epsilon_y^*$ (as was indicated in Corollary 5). Since the number of retrieved errors should be as small as possible, these rapidly decreasing upper bounds are attractive.

VI. HOBAM’S

Although we are mainly concerned with the properties of the SOBAM, we can apply a similar method to analyze HOBAM's. The only required change in the assumptions is

(39)


Fig. 4. The upper bound on the error rate in $F_X$ in the retrieved pairs where $r = 1, 2, 5, 10$.

where $k$ is a positive integer. For the $k$th-order BAM, the connections from $F_X$ to $F_Y$ are

$$w_{i j_1 \cdots j_k} = \sum_{h=1}^{p} y_{h,i}\, x_{h,j_1} \cdots x_{h,j_k} \tag{40}$$

where $i = 1, \ldots, m$; $j_1, \ldots, j_k = 1, \ldots, n$; and $h = 1, \ldots, p$. The connections from $F_Y$ to $F_X$ are

$$v_{i j_1 \cdots j_k} = \sum_{h=1}^{p} x_{h,i}\, y_{h,j_1} \cdots y_{h,j_k} \tag{41}$$

where $i = 1, \ldots, n$; $j_1, \ldots, j_k = 1, \ldots, m$; and $h = 1, \ldots, p$. The corresponding recall equations are

$$y_i(t+1) = \operatorname{sgn}\left( \sum_{j_1} \cdots \sum_{j_k} w_{i j_1 \cdots j_k}\, x_{j_1}(t) \cdots x_{j_k}(t) \right) \tag{42}$$

and

$$x_i(t+2) = \operatorname{sgn}\left( \sum_{j_1} \cdots \sum_{j_k} v_{i j_1 \cdots j_k}\, y_{j_1}(t+1) \cdots y_{j_k}(t+1) \right). \tag{43}$$
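With an outer-product encoding of the form (40), a recall half-step never needs the $k$-dimensional weight lattice explicitly: the multi-index sum collapses to powers of inner products with the stored patterns. A sketch of this identity (the function name is ours, and the outer-product encoding form is an assumption):

```python
import numpy as np

def hobam_half_step(X, Y, x, k, prev_y):
    """One F_X -> F_Y half-step of a kth-order BAM.
    Uses sum_{j1..jk} w_{i j1..jk} x_{j1}...x_{jk} = sum_h y_{h,i} (X_h . x)^k,
    so only the p stored pairs are kept, never the m * n**k lattice."""
    s = Y.T @ (X @ x) ** k                    # field on each F_Y neuron
    return np.where(s > 0, 1, np.where(s < 0, -1, prev_y))
```

For $k = 2$ this reproduces the SOBAM half-step while reducing the storage from $m n^2$ weights to the $p$ pattern pairs themselves.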

We can obtain similar results for the $k$th-order BAM based on the following lemma.

Lemma 6: Let $Z_1, \ldots, Z_n$ be $\pm 1$ equiprobable independent random variables and let $k$ be a positive integer. As $n \to \infty$

$$\mathrm{E}\left[\left(\frac{1}{\sqrt{n}} \sum_{i=1}^{n} Z_i\right)^{2k}\right] \to (2k - 1)!!. \tag{44}$$

Fig. 5. The upper bound on the error rate in $F_Y$ in the retrieved pairs where $r = 1, 2, 5, 10$.

TABLE II
THE LOWER BOUND ON THE ATTRACTION BASIN OF THE SOBAM FOR SMALL VALUES OF $\alpha$

Proof of Lemma 6: As $n \to \infty$, the distribution of the normalized sum tends to be normal. Since the $2k$th moment of a standard normal random variable [18] is $(2k - 1)!!$, the lemma follows.
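The limiting moments invoked here can be sanity-checked by simulation; the sample sizes below are arbitrary. The fourth moment of a normalized sum of $\pm 1$ variables should approach 3, the fourth moment of a standard normal (exactly, it equals $3 - 2/n$ for this sum).

```python
import random

random.seed(0)
n, trials = 100, 20000
total = 0.0
for _ in range(trials):
    # normalized sum of n equiprobable +/-1 random variables
    s = sum(random.choice((-1, 1)) for _ in range(n)) / n ** 0.5
    total += s ** 4
m4 = total / trials
# m4 lands near 3, the fourth moment of a standard normal.
```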

From Lemma 6, we can obtain the following four corollaries for the $k$th-order BAM.

Corollary 7: For every pattern pair ($h = 1, \ldots, p$) and for any $X(t)$ such that $H(X(t), X_h) \le \rho n$, the probability that $H(Y(t+1), Y_h) \le y_0 m$ tends to one (as $n \to \infty$), provided that



where $y_0$ is the intersection of the $k$th-order analogs of the line (29) and the curve (30).

Corollary 8: For every pattern pair ($h = 1, \ldots, p$) and for any $Y(t+1)$ such that $H(Y(t+1), Y_h) \le \rho m$, the probability that $H(X(t+2), X_h) \le x_0 n$ tends to one (as $n \to \infty$), provided that the corresponding condition holds, where $x_0$ is the intersection of the $k$th-order analogs of the line (33) and the curve (34).

Corollary 9: If the sequences $\{\epsilon_x(t)\}$ and $\{\epsilon_y(t)\}$ converge to small $\epsilon_x^*$ and small $\epsilon_y^*$, respectively, then

(45)

Corollary 10: If the values of $\epsilon_x^*$ and $\epsilon_y^*$ are small, we have

(46)

and

(47)

VII. CONCLUDING REMARKS

In this paper, we have studied several properties of the SOBAM. An example has been given which shows that the SOBAM may not be stable during recall. We have also derived the statistical dynamics of the SOBAM. Based on this dynamics, we have estimated the memory capacity, the attraction basin, and the number of errors in the retrieved pairs. Numerical examples have also been presented. Last, we have briefly discussed how our results can be extended to HOBAM's. One significant advantage of the methodology presented is that we can analyze some associative memories whose stability is not guaranteed.

APPENDIX

VERIFICATION OF THE CONDITIONS OF NEWMAN’S LEMMA

Here, we show the conditions under which the random variable

satisfies (11) and (12), where the 's and the 's are equiprobable independent random variables. Clearly, is symmetric and

(48)

Also, from Lemma 6

Var (49)

Hence, (11) is satisfied. To check whether satisfies (12), we use an existing

result about the sum of equiprobable independent random variables [19].

Lemma 7: Let $z_1, \cdots, z_n$ be equiprobable independent random variables. For and large $n$

(50)

where $\Gamma(\cdot)$ is the gamma function

$$\Gamma(x) = \int_{0}^{\infty} t^{x-1} e^{-t}\, dt. \qquad (51)$$

The above lemma is part of [19, Lemma A.6]. From Lemma 7, we have

(52)

Let , where is a positive integer and . Then

(53)

where is a positive constant. For large

and

Hence we have (54), shown at the top of the page. For large , the th term of the sum

(55)


decreases exponentially provided that and

. As converges to

it follows that

For the SOBAM, and .

REFERENCES

[1] T. Kohonen, "Correlation matrix memories," IEEE Trans. Comput., vol. 21, pp. 353–359, 1972.

[2] G. Palm, "On associative memory," Biol. Cybern., vol. 36, pp. 19–31, 1980.

[3] B. Kosko, "Bidirectional associative memories," IEEE Trans. Syst., Man, Cybern., vol. 18, pp. 49–60, 1988.

[4] C. S. Leung, "Encoding method for bidirectional associative memory using projection on convex sets," IEEE Trans. Neural Networks, vol. 4, pp. 879–881, Sept. 1993.

[5] ——, "Optimum learning for bidirectional associative memory in the sense of capacity," IEEE Trans. Syst., Man, Cybern., vol. 24, pp. 791–796, May 1994.

[6] Y. F. Wang, J. B. Cruz, Jr., and J. H. Mulligan, Jr., "Two coding strategies for bidirectional associative memory," IEEE Trans. Neural Networks, vol. 1, pp. 81–92, 1990.

[7] P. K. Simpson, "Higher-ordered and intraconnected bidirectional associative memories," IEEE Trans. Syst., Man, Cybern., vol. 20, pp. 637–653, 1990.

[8] D. Psaltis, C. H. Park, and J. Hong, "Higher order associative memories and their optical implementations," Neural Networks, vol. 1, pp. 149–163, 1988.

[9] J. M. Kinser, H. J. Caulfield, and J. Shamir, "Design for a massive all-optical bidirectional associative memory: The big BAM," Appl. Opt., vol. 27, no. 16, pp. 3442–3444, 1988.

[10] C. M. Newman, "Memory capacity in neural models: Rigorous lower bounds," Neural Networks, vol. 1, pp. 223–238, 1988.

[11] J. Komlos and R. Paturi, "Convergence results in an associative memory model," Neural Networks, vol. 1, pp. 229–250, 1988.

[12] R. J. McEliece, E. C. Posner, E. R. Rodemich, and S. S. Venkatesh, "The capacity of the Hopfield associative memory," IEEE Trans. Inform. Theory, vol. 33, pp. 461–482, 1987.

[13] S. Amari, "Mathematical foundations of neurocomputing," Proc. IEEE, vol. 78, pp. 1443–1463, 1990.

[14] S. Amari and K. Maginu, "Statistical neurodynamics of associative memory," Neural Networks, vol. 1, pp. 63–73, 1988.

[15] C. S. Leung, "The performance of dummy augmentation encoding," in Proc. IJCNN'93, Nagoya, vol. 3, 1993, pp. 2674–2677.

[16] K. Haines and R. Hecht-Nielsen, "A BAM with increased information storage capacity," in Proc. 1988 IEEE Int. Conf. Neural Networks, 1988, pp. 181–190.

[17] G. A. Kohring, "On the Q-state neuron problem in attractor neural networks," Neural Networks, vol. 6, pp. 573–581, 1993.

[18] A. Papoulis, Probability, Random Variables, and Stochastic Processes. New York: McGraw-Hill, 1985.

[19] S. S. Venkatesh and P. Baldi, "Programmed interactions in higher-order neural networks: The outer-product algorithm," J. Complexity, vol. 7, pp. 443–479, 1991.

Chi-Sing Leung received the B.Sc. degree in electronics in 1989, the M.Phil. degree in information engineering in 1991, and the Ph.D. degree in computer science and engineering in 1995, all from the Chinese University of Hong Kong, Shatin, Hong Kong.

In 1995, he was a Research Fellow at the Chinese University of Hong Kong. He was a Lecturer with the School of Science and Technology at the Open Learning Institute of Hong Kong. Currently, he is a Visiting Fellow at the Department of Computer Science, University of Wollongong, Australia. His research interests include learning and modeling of artificial neural networks, and digital communications.

Lai-Wan Chan (S'87–M'88) received the B.A. and M.A. degrees in electrical science and the Ph.D. degree in engineering from Cambridge University, Cambridge, U.K.

She joined the Computer Science and Engineering Department of the Chinese University of Hong Kong, Shatin, Hong Kong, in 1989, and is currently an Associate Professor. Her research interests include the learning and modeling of artificial neural networks, in particular recurrent neural networks and applications of neural networks in time series prediction, image recognition, and Cantonese speech recognition.

Dr. Chan was Chairperson of the IEEE Computer Chapter in the Hong Kong Section, is a Governing Board Member of the Asian-Pacific Neural Network Assembly, and was the Organizing Cochair of the International Conference on Neural Information Processing in 1996.

Edmund Man Kit Lai, photograph and biography not available at the time of publication.