
3.2. Neurons and their networks

3.2.1 Biological neurons

Tasks such as navigation, but also cognition, memory, etc., happen in the nervous system (more specifically, the brain).

The nervous system is made up of several different types of cells:

- Neurons

- Astrocytes

- Microglia

- Schwann cells

Neurons do the computing; the rest is infrastructure

Astrocytes

Star-shaped, abundant, and versatile

Guide the migration of developing neurons

Act as K+ and neurotransmitter buffers

Involved in the formation of the blood-brain barrier

Function in nutrient transfer

Microglia

Specialized immune cells that act as the macrophages of the central nervous system

Schwann cells and Oligodendrocytes

Produce the myelin sheath which provides the electrical insulation for neurons and nerve fibers

Important in neuronal regeneration

Myelination – electrically insulates the axon, which increases the transport speed of the action potential

Types of neurons:

- Sensory neurons

- Motor neurons

- Interneurons (the brain contains lots of these)

What they look like

...or schematically

In fact, things are a bit more crowded

Neurons communicate with each other; we will see later how this works. This will be the "neural network".

Thus, neurons need to be able to conduct information in 2 ways:

1. From one end of a neuron to the other end. This is accomplished electrically via action potentials.

2. Across the minute space separating one neuron from another. This is accomplished chemically via neurotransmitters.

Cell Membrane at rest

[Figure: distribution of Na+, Cl-, K+, and A- ions outside and inside the cell]

Potassium (K+) can pass through to equalize its concentration.

Sodium and chloride cannot pass through.

Result: the inside is negative relative to the outside, at about -70 mV. This is the resting potential of neurons.
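For context (this equation is not on the slides): the equilibrium potential for a single permeant ion follows from the standard Nernst equation,

$$E_{\mathrm{ion}} = \frac{RT}{zF} \ln \frac{[\mathrm{ion}]_{\mathrm{out}}}{[\mathrm{ion}]_{\mathrm{in}}}$$

For K+ at body temperature, with roughly 5 mM outside and 140 mM inside, this gives about -90 mV; the measured -70 mV reflects the membrane's small residual permeability to other ions.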

Now let's open a Na+ channel in the membrane...

If the initial amplitude of the graded potential (GP) is sufficient, it will spread all the way to the axon hillock, where voltage-gated channels reside. At this point an action potential can be excited if the voltage is high enough.

N.B. The gating properties of ion channels were determined from electrical measurements (the conductivity of squid axons to Na+ and K+) long before the channels themselves were known to exist.

Similarly for the transport of K+: the different coefficients imply the number of opening and gating units...

With modern crystallography, these effects have been observed...

Transport of the action potential is like a row of dominoes falling...

This goes a lot faster with myelinated axons – saltatory conduction...

Once at the synapse, the signal is transmitted chemically via neurotransmitters (e.g. acetylcholine). These then excite a new graded potential in the next neuron.

This graded potential can be either positive or negative, depending on the environment.

The intensity of the signal is given by the firing frequency

These properties are caricatured in the McCulloch-Pitts neuron
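A minimal sketch of such a neuron in Python (the AND wiring in the example is an illustrative choice, not from the slides):

```python
import numpy as np

def mcculloch_pitts(x, w, theta):
    """McCulloch-Pitts neuron: weighted sum of inputs, hard threshold.

    x: input signals, w: synaptic weights w_ij, theta: firing threshold.
    Returns 1 (the neuron "fires") if the weighted sum reaches theta, else 0.
    """
    return 1 if np.dot(w, x) >= theta else 0

# With weights (1, 1) and threshold 2 the neuron computes a logical AND.
print(mcculloch_pitts([1, 1], [1, 1], theta=2))  # -> 1
print(mcculloch_pitts([1, 0], [1, 1], theta=2))  # -> 0
```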

Learning happens when the weights wij are changed in response to the environment – this needs an updating rule

Common in informatics is iterative learning, which needs a teacher, i.e. the weights are adjusted so that in every learning step the distance to the correct answer is reduced.

This is known as the perceptron
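A sketch of the classic perceptron learning rule in Python; the learning rate value and the OR task are illustrative assumptions:

```python
import numpy as np

def train_perceptron(X, targets, alpha=0.1, epochs=20):
    """Supervised ("with a teacher") perceptron learning.

    Each step nudges the weights toward the correct answer:
    w <- w + alpha * (target - output) * x.
    """
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for x, t in zip(X, targets):
            y = 1 if np.dot(w, x) + b >= 0 else 0  # thresholded output
            w += alpha * (t - y) * x               # move toward the teacher's answer
            b += alpha * (t - y)
    return w, b

# Example: learn logical OR, a linearly separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
w, b = train_perceptron(X, np.array([0, 1, 1, 1]))
```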

[Figure: multilayer network with an input layer, first hidden layer, second hidden layer, and output layer]

With the use of hidden layers, variables that are not linearly separable can be learnt...
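To make this concrete, here is a hand-wired two-layer network in Python that computes XOR, the standard example of a function that is not linearly separable; the weights are chosen by hand for illustration:

```python
import numpy as np

def step(v):
    """Hard threshold: 1 where v >= 0, else 0."""
    return (np.asarray(v) >= 0).astype(int)

def xor_net(x):
    """Hand-wired two-layer network computing XOR.

    Hidden unit 1 is an OR gate (fires if x1 + x2 >= 1);
    hidden unit 2 is an AND gate (fires if x1 + x2 >= 2);
    the output fires for OR-but-not-AND, which is XOR.
    """
    h = step(np.array([[1, 1], [1, 1]]) @ x - np.array([1, 2]))
    return step(np.array([1, -2]) @ h - 1)

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, xor_net(np.array(x)))  # -> 0, 1, 1, 0
```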

An example: letter recognition

The problems that can be solved depend on the structure of the network

3.2.2 Hebbian learning

This means that a synapse gets stronger as neighbouring cells are more correlated.

Hebb's Law can be represented in the form of two rules:

1. If two neurons on either side of a connection are activated synchronously, then the weight of that connection is increased.

2. If two neurons on either side of a connection are activated asynchronously, then the weight of that connection is decreased.

Hebb's Law provides the basis for learning without a teacher. Learning here is a local phenomenon occurring without feedback from the environment.

[Figure: input signals feeding a layer of neurons, with a connection from neuron i to neuron j, and output signals]

Hebbian learning in a neural network

$$\Delta w_{ij}(p) = \alpha \, y_j(p) \, x_i(p)$$

where α is the learning rate.
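A minimal sketch of this update in Python (the value of α and the weight-matrix layout are assumptions for illustration):

```python
import numpy as np

def hebbian_step(w, x, y, alpha=0.1):
    """One step of plain Hebbian learning: Delta w_ij = alpha * y_j * x_i.

    w: weight matrix with shape (inputs i, outputs j);
    x: input activities, y: output activities.
    Correlated pre- and postsynaptic activity strengthens the connection.
    """
    return w + alpha * np.outer(x, y)
```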

A Hebbian Cell Assembly

By means of the Hebbian Learning Rule, a circuit of continuously firing neurons could be learned by the network.

The continuing activation in this cell assembly does not require external input.

The activation of the neurons in this circuit would correspond to the perception of a concept.

A Cell Assembly

Input from the environment activates the assembly, and the activation then circulates around the loop.

Note that even when the input from the environment is gone, the assembly keeps firing.
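A tiny sketch of such self-sustaining activity in Python; the loop wiring and the threshold are illustrative assumptions:

```python
import numpy as np

# Three neurons wired in a loop; W[j, i] is the learned weight from neuron i to j.
W = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]], dtype=float)

def step_net(S, ext=0.0):
    """A neuron fires if its recurrent plus external input reaches 1."""
    return ((W @ S + ext) >= 1).astype(float)

S = np.zeros(3)
S = step_net(S, ext=np.array([1.0, 0.0, 0.0]))  # a kick from the environment
for t in range(6):          # the external input is now gone...
    print(S)                # ...but the activity keeps circulating in the loop
    S = step_net(S)
```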

Hebbian learning implies that weights can only increase. To resolve this problem, we might impose a limit on the growth of synaptic weights. It can be done by introducing a non-linear forgetting factor into Hebb's Law:

$$\Delta w_{ij}(p) = \alpha \, y_j(p) \, x_i(p) - \varphi \, y_j(p) \, w_{ij}(p)$$

where φ is the forgetting factor.

The forgetting factor usually falls in the interval between 0 and 1, typically between 0.01 and 0.1, to allow only a little "forgetting" while limiting the weight growth.
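The same sketch extended with the forgetting term (again, the values of α and φ are illustrative):

```python
import numpy as np

def hebbian_with_forgetting(w, x, y, alpha=0.1, phi=0.05):
    """Hebb's law with a forgetting factor:
    Delta w_ij = alpha * y_j * x_i - phi * y_j * w_ij.

    The decay term is proportional to the current weight and the output
    activity, so active weights saturate instead of growing without bound.
    """
    return w + alpha * np.outer(x, y) - phi * w * y  # y broadcasts over columns j
```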

First simulation of Hebbian learning

• Rochester et al. attempted to simulate the emergence of cell assemblies in a small network of 69 neurons. They found that everything became active in their network.

• They decided that they needed to include inhibitory synapses. This worked and cell assemblies did, indeed, form.

• This was later confirmed in real brain circuitry.

In fact, these inhibitory connections are distance dependent and as such give rise to structure: excitation happens within columns and inhibition further away.

[Figure: connection strength as a function of distance – an excitatory effect at short range, an inhibitory effect at larger distances]

Long-range inhibition and short-range activation give rise to patterns.

See also the excursion into pattern formation in Sec 3.6

[Figure: feature-mapping Kohonen model – (a) an input layer fully connected to a Kohonen layer; (b) the same network, where different inputs make different Kohonen neurons win (outputs 1 0 vs. 0 1)]

Competitive learning

Set initial synaptic weights to small random values, say in an interval [0, 1], and assign a small positive value to the learning rate parameter α.

Find the winning (best-matching) neuron j_X at iteration p, i.e. the neuron whose weight vector is closest to the input vector X in the Euclidean sense:

$$j_{\mathbf{X}}(p) = \min_{j} \left\| \mathbf{X} - \mathbf{W}_j(p) \right\| = \min_{j} \left[ \sum_{i=1}^{n} \bigl( x_i - w_{ij}(p) \bigr)^2 \right]^{1/2}$$

Compute the weight corrections:

$$\Delta w_{ij} = \begin{cases} \alpha \, (x_i - w_{ij}), & \text{if neuron } j \text{ wins the competition} \\ 0, & \text{if neuron } j \text{ loses the competition} \end{cases}$$

Update weights:

$$w_{ij}(p+1) = w_{ij}(p) + \Delta w_{ij}(p)$$

where the correction is applied within Λ_j(p), the neighbourhood function centred around the winning neuron j_X.

Iterate...

To illustrate competitive learning, consider the Kohonen network with 100 neurons arranged in the form of a two-dimensional lattice with 10 rows and 10 columns. The network is required to classify two-dimensional input vectors: each neuron in the network should respond only to the input vectors occurring in its region.

The network is trained with 1000 two-dimensional input vectors generated randomly in a square region in the interval between –1 and +1. The learning rate parameter α is equal to 0.1.
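A compact sketch of this experiment in Python; the Gaussian form of the neighbourhood function and its shrinking schedule are assumptions, as the slides do not specify them:

```python
import numpy as np

rng = np.random.default_rng(0)

# 10 x 10 Kohonen lattice, 2-D inputs in [-1, 1].
grid = np.array([(r, c) for r in range(10) for c in range(10)], dtype=float)
W = rng.uniform(0.0, 1.0, size=(100, 2))    # small random initial weights
X = rng.uniform(-1.0, 1.0, size=(1000, 2))  # training inputs
alpha = 0.1

for p, x in enumerate(X):
    # Winner: the neuron whose weight vector is closest to x (Euclidean criterion).
    j_win = np.argmin(np.linalg.norm(W - x, axis=1))
    # Gaussian neighbourhood on the lattice, shrinking over time (assumed schedule).
    sigma = 5.0 * np.exp(-p / 300.0) + 0.5
    dist2 = np.sum((grid - grid[j_win])**2, axis=1)
    h = np.exp(-dist2 / (2.0 * sigma**2))
    # Competitive update: move the neighbourhood's weights toward the input.
    W += alpha * h[:, None] * (x - W)
```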

[Plot: synaptic weights W(2,j) vs. W(1,j), both axes from –1 to 1 – the initial random network]

[Plot: the weight vectors after 100 steps]

[Plot: the weight vectors after 1000 steps]

[Plot: the weight vectors after 10000 steps]

Or for letter recognition

In the cortex, this gives rise to the homunculus: the spatial distribution of nerve cells responsible for the senses.

Similar for other features in the cortex

3.2.3 Associative networks

[Figure: Hopfield network – input signals x1, x2, ..., xi, ..., xn enter neurons 1, 2, ..., i, ..., n, which produce output signals y1, y2, ..., yi, ..., yn]

In a Hopfield network, every neuron is connected to every other neuron.

[Figure: topological state analysis for a three-neuron Hopfield network – the eight possible states (±1, ±1, ±1) form the vertices of a cube in (y1, y2, y3) space]

The weight matrix is constructed from the M memorised states:

$$\mathbf{W} = \sum_{m=1}^{M} \mathbf{Y}_m \mathbf{Y}_m^{T} - M \, \mathbf{I}$$

where Y_m is the n-dimensional binary vector of the m-th memorised state and I is the n x n identity matrix.

The stable state-vertex is determined by the weight matrix W, the current input vector X, and the threshold matrix θ. If the input vector is partially incorrect or incomplete, the initial state will converge into the stable state-vertex after a few iterations.

$$S_j(t+1) = \mathrm{sgn}\left( \sum_{i} w_{ij} S_i(t) - \theta_j \right)$$

Energy function of Hopfield net: multidimensional landscape

$$H = -\frac{1}{2} \sum_{i,j} w_{ij} S_i S_j$$

with weights that store the memorised patterns v^p:

$$w_{ij} = \sum_{p} v_i^{p} v_j^{p}$$

Example: Restoring corrupted memory patterns

[Figure: the original letter 'T'; a version with half the pattern corrupted; a version with 20% of the pattern corrupted]
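A minimal sketch of such a restoration in Python; the 5x5 'T' bitmap, zero thresholds, and synchronous updates are simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# A 5x5 letter 'T' as a +/-1 pattern (an illustrative bitmap).
T = np.array([[1, 1, 1, 1, 1],
              [0, 0, 1, 0, 0],
              [0, 0, 1, 0, 0],
              [0, 0, 1, 0, 0],
              [0, 0, 1, 0, 0]]).ravel() * 2 - 1
n = T.size

# Hebbian storage, w_ij = sum_p v_i^p v_j^p, with no self-connections.
W = np.outer(T, T).astype(float)
np.fill_diagonal(W, 0.0)

# Corrupt 20% of the pixels, then iterate S_j(t+1) = sgn(sum_i w_ij S_i(t)).
S = T.copy()
flip = rng.choice(n, size=n // 5, replace=False)
S[flip] *= -1
for _ in range(10):
    S_new = np.sign(W @ S)   # synchronous update; thresholds taken as zero
    S_new[S_new == 0] = 1    # break ties toward +1
    if np.array_equal(S_new, S):
        break
    S = S_new

print(np.array_equal(S, T))  # True – the corrupted 'T' has been restored
```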

Recap Sec. 3.2

The brain is a network of neurons, whose properties are important in how we learn

Within neurons, signals are transported electrically; between neurons, chemically.

This can be abstracted in a McCulloch-Pitts neuron.

Hebbian learning makes strong connections stronger (leads to pattern formation)

This is taken further in Kohonen networks and competitive learning