
Hebbian Based Learning With Winner-Take-All for Spiking Neural Networks

Lyle Long, Distinguished Professor of Aerospace Engineering, Bioengineering, and Mathematics
Ankur Gupta, Ph.D. Candidate, Computer Science

The Pennsylvania State University, University Park

Presented at the American Physical Society Meeting, Pittsburgh, PA, March 2009

Spiking Neural Networks

• Real cortical neurons communicate using spikes, or action potentials (roughly 10^11 neurons and 10^15 synapses in the human brain).
• Computing with precisely timed spikes is more powerful than "rates" (W. Maass and M. Schmitt, 1999).
• Added advantage of the ability to continuously process information (can deal with time-dependent data).

Integrate & Fire Neuron Model

[Figure: integrate-and-fire neuron driven by a constant input current through synaptic weights w_ij]

Koch, Biophysics of Computation, 1999
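A minimal leaky integrate-and-fire sketch in Python may help make the slide concrete; the time constants, threshold, and constant-current value below are illustrative assumptions, not the parameters used in the talk:

    import numpy as np

    def lif_neuron(input_current, dt=1e-3, tau_m=20e-3, v_rest=-70e-3,
                   v_reset=-70e-3, v_thresh=-54e-3, r_m=10e6):
        # Membrane equation: tau_m * dV/dt = -(V - v_rest) + R_m * I(t).
        # The neuron fires and resets whenever V crosses the threshold.
        v = v_rest
        spike_times = []
        for step, i_in in enumerate(input_current):
            v += dt * (-(v - v_rest) + r_m * i_in) / tau_m
            if v >= v_thresh:
                spike_times.append(step * dt)  # record the spike time
                v = v_reset                    # reset the membrane potential
        return spike_times

    # A constant input current, as on the slide, produces regular spiking.
    spikes = lif_neuron(np.full(1000, 2.0e-9))  # 1 s of 2 nA drive at 1 ms steps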

Hebbian Learning

• Learning: "When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."

• Forgetting: "The old, long-established memory would then last, not reversible except with pathological processes in the brain; less strongly established memories would gradually disappear unless reinforced."

• D. O. Hebb, The Organization of Behavior, 1949
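A simple rate-based sketch of Hebb's two ideas, correlation-driven growth plus slow decay of unreinforced weights, is shown below; this is only an illustration, since the learning rule in the talk is spike-time dependent:

    import numpy as np

    def hebbian_update(w, pre, post, lr=0.01, decay=1e-4):
        # Learning: strengthen a synapse when pre- and postsynaptic
        # activity coincide (outer product of the activity vectors).
        w = w + lr * np.outer(post, pre)
        # Forgetting: weights fade slowly unless reinforced.
        return (1.0 - decay) * w

    w = np.zeros((4, 784))  # e.g. 4 output neurons, 784 inputs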

Time-Dependent Plasticity

• Experimentally observed
• Verified by both in-vitro and in-vivo studies
• Less delay => more correlation

STDP Algorithm

• Not efficient
• Not stable

Ref: Bi & Poo, 1998
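For reference, the classic pair-based STDP window reported by Bi & Poo can be sketched as follows; the amplitudes and time constants are illustrative assumptions:

    import numpy as np

    def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20e-3, tau_minus=20e-3):
        # dt = t_post - t_pre. Pre-before-post (dt > 0) potentiates,
        # post-before-pre (dt < 0) depresses, and shorter delays give
        # larger changes ("less delay => more correlation").
        if dt >= 0:
            return a_plus * np.exp(-dt / tau_plus)
        return -a_minus * np.exp(dt / tau_minus)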

Learning

• We have developed and implemented a Hebbian-based learning method
• The approach is spike-time dependent, but is not STDP
• Our approach is scalable; we have run a billion synapses on a laptop
• Winner-take-all is also implemented for competitive learning
• Homeostasis ensures stability

Refs: Long, 2008 and Long and Gupta, 2008
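One way winner-take-all and homeostasis can be layered on a spike-time-driven Hebbian update is sketched below; the traces, constants, and update order here are assumptions for illustration, not the exact algorithm of the talk:

    import numpy as np

    def wta_hebbian_step(w, pre_trace, post_spikes, rate_est,
                         target_rate=0.02, lr=0.005, homeo=1e-3):
        # Winner-take-all: only the most active output neuron learns.
        winner = int(np.argmax(post_spikes))
        if post_spikes[winner] > 0:
            w[winner] += lr * pre_trace          # strengthen recently active inputs
            np.clip(w[winner], 0.0, 1.0, out=w[winner])
        # Homeostasis: scale each neuron's weights so its firing rate
        # drifts toward a target, which keeps the learning stable.
        rate_est = 0.99 * rate_est + 0.01 * post_spikes
        w *= (1.0 + homeo * (target_rate - rate_est))[:, None]
        return w, rate_est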

Hubel & Wiesel

• Studies with cats and primates
• Measured signals in the striate cortex
• Neurons in the cortex responded to bars oriented in key directions

Ref: Hubel & Wiesel, 1959 (Nobel Prize 1981)

Test Case

• Input image: 28x28
• Network architecture: spiking neural network with 784 inputs & 4 outputs for WTA
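One way to set this test case up is sketched below; the Poisson encoding of pixels into spikes and the rates are assumptions, since the slide only specifies the 784-input, 4-output architecture:

    import numpy as np

    rng = np.random.default_rng(0)
    w = rng.uniform(0.0, 0.1, size=(4, 784))     # 4 output neurons, 784 plastic synapses

    def image_to_spikes(image, dt=1e-3, max_rate=100.0):
        # One timestep of Poisson input spikes: brighter pixels fire more often.
        rates = max_rate * image.reshape(784) / (np.abs(image).max() + 1e-12)
        return rng.random(784) < np.clip(rates, 0.0, None) * dt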

NN Results

• First 0.4 sec of simulation shown
• Trained on 4 Gabor filters presented cyclically for 20 sec
• A different Gabor filter every 50 ms

[Figure: spike rasters for Output Neurons 1-4]
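A hypothetical 28x28 Gabor patch generator and the cyclic presentation schedule from the slide (4 orientations, a new patch every 50 ms for 20 s) might look like this; the specific orientations, envelope width, and wavelength are assumptions:

    import numpy as np

    def gabor(theta_deg, size=28, sigma=4.0, wavelength=8.0):
        # Oriented Gabor patch: a sinusoidal grating under a Gaussian envelope.
        theta = np.deg2rad(theta_deg)
        y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
        xr = x * np.cos(theta) + y * np.sin(theta)
        return np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) * np.cos(2.0 * np.pi * xr / wavelength)

    train_angles = (0, 45, 90, 135)              # 4 training orientations (assumed)
    # 20 s of training, a different filter every 50 ms, cycling through the set.
    schedule = [train_angles[i % 4] for i in range(int(20.0 / 0.05))]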

Results

Tuning curves:
• Each color a different neuron
• Trained on 4 Gabor filters
• Tested on 36 Gabor filters in steps of 5 deg
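A sketch of how tuning curves like these could be measured: present each of the 36 test orientations and count each output neuron's spikes. Here run_network is a placeholder for the trained spiking network and gabor() is the patch generator sketched earlier:

    import numpy as np

    test_angles = np.arange(0, 180, 5)           # 36 orientations in 5-degree steps

    def tuning_curves(run_network):
        counts = np.zeros((4, len(test_angles))) # one row per output neuron
        for j, angle in enumerate(test_angles):
            counts[:, j] = run_network(gabor(angle))  # spike counts per presentation
        return counts                            # plot each row against test_angles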

Conclusions

• Efficient, scalable learning algorithm

• Spiking neural network learned to detect lines of different orientation

• Similarities to Hubel & Wiesel experiments

Image From Our NN Simulations on APS Brochure and Website

[Figure: spike raster from the spiking neural network (neurons vs. time)]

References

• C. Koch, Biophysics of Computation: Information Processing in Single Neurons, Oxford Press, 1999.
• D. O. Hebb, The Organization of Behavior: A Neuropsychological Theory, Erlbaum Pub., 1949.
• G. Bi and M. Poo, "Synaptic Modifications in Cultured Hippocampal Neurons: Dependence on Spike Timing, Synaptic Strength, and Postsynaptic Cell Type," Journal of Neuroscience, vol. 18, pp. 10464-10472, 1998.
• D. H. Hubel and T. N. Wiesel, "Receptive fields of single neurones in the cat's striate cortex," J. Physiol., vol. 148, pp. 574-591, 1959.
• L. N. Long and A. Gupta, "Scalable massively parallel artificial neural networks," Journal of Aerospace Computing, Information, and Communication, vol. 5, Jan. 2008.
• Long, Lyle N., "Scalable Biologically Inspired Neural Networks with Spike Time Based Learning," Invited Paper, IEEE Symposium on Learning and Adaptive Behavior in Robotic Systems, Edinburgh, Scotland, Aug. 6-8, 2008.
• Long, Lyle N. and Gupta, Ankur, "Biologically-Inspired Spiking Neural Networks with Hebbian Learning for Vision Processing," AIAA Paper No. 2008-0885, presented at the 46th AIAA Aerospace Sciences Meeting, Reno, NV, Jan. 7-10, 2008.

Questions?

Lyle Long

LNL @ PSU.EDU

http://www.personal.psu.edu/lnl