
Neural modeling and simulation

Romain Brette

Ecole Normale Supérieure

with http://briansimulator.org

romain.brette@ens.fr

Spiking neuron models

Input = N spike trains

Output = 1 spike train

A neuron model is defined by:
• what happens when a spike is received (discrete event)
• the condition for spiking
• what happens when a spike is produced (discrete event)
• what happens between spikes (continuous dynamics)

A neural network: an excitatory population Pe and an inhibitory population Pi (subgroups of a group P), connected to P by Ce and Ci.

from brian import *

eqs = '''
dv/dt = (ge+gi-(v+49*mV))/(20*ms) : volt
dge/dt = -ge/(5*ms) : volt
dgi/dt = -gi/(10*ms) : volt
'''

P = NeuronGroup(4000, model=eqs, threshold=-50*mV, reset=-60*mV)
P.v = -60*mV + 10*mV*rand(len(P))
Pe = P.subgroup(3200)
Pi = P.subgroup(800)

Ce = Connection(Pe, P, 'ge', weight=1.62*mV, sparseness=0.02)
Ci = Connection(Pi, P, 'gi', weight=-9*mV, sparseness=0.02)

M = SpikeMonitor(P)

run(1*second)

raster_plot(M)
show()

Part I – Neurons

Equivalent electrical circuit

Linear approximation of the leak current: I = gL(Vm - EL)

• EL = leak (resting) potential ≈ -70 mV: the membrane is « polarized » (Vin < Vout)
• gL = leak conductance = 1/R, where R is the membrane resistance
• C = membrane capacitance

The membrane equation

A current Iinj injected from outside to inside charges the membrane:

C dVm/dt = gL(EL - Vm) + Iinj

or equivalently, with gL = 1/R,

tau dVm/dt = EL - Vm + R*Iinj

where tau = RC is the membrane time constant (typically 3-100 ms).

tau = 10*ms
R = 50*Mohm
EL = -70*mV
Iinj = 0.5*nA

eqs = '''
dvm/dt = (EL-vm+R*Iinj)/tau : volt
'''
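The snippet above only defines the model; a minimal runnable sketch completing it (monitoring the membrane potential of a single neuron, with no spiking threshold) could look like this:

from brian import *

tau = 10 * ms
R = 50 * Mohm
EL = -70 * mV
Iinj = 0.5 * nA

eqs = '''
dvm/dt = (EL-vm+R*Iinj)/tau : volt
'''

# a single passive neuron, no threshold: just the membrane equation
neuron = NeuronGroup(1, model=eqs)
neuron.vm = EL                      # start at rest
M = StateMonitor(neuron, 'vm', record=True)

run(100 * ms)

# vm relaxes exponentially from EL towards EL + R*Iinj = -45 mV, with time constant tau
plot(M.times / ms, M[0] / mV)
show()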

The integrate-and-fire model

(figure: membrane potential trace with PSPs, the spike threshold and an action potential)

« Integrate-and-fire »:

tau dVm/dt = EL - Vm + RI

If Vm reaches Vt (threshold), the neuron spikes and Vm → Vr (reset).

(phenomenological description of action potentials; PSP = « postsynaptic potential »)

Current-frequency relationship

For a constant injected current I, the firing rate is

F = 1 / ( tau * log( (EL + RI - Vr) / (EL + RI - Vt) ) )

(and F = 0 when EL + RI < Vt).

from brian import *

N = 1000
tau = 10 * ms
eqs = '''
dv/dt=(v0-v)/tau : volt
v0 : volt
'''
group = NeuronGroup(N, model=eqs, threshold=10 * mV, reset=0 * mV)
group.v = 0 * mV
group.v0 = linspace(0 * mV, 20 * mV, N)

counter = SpikeCounter(group)

duration = 5 * second
run(duration)
plot(group.v0 / mV, counter.count / duration)
show()
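The simulated curve can be compared with the formula above. A standalone sketch of the theoretical curve (here EL = 0, v0 plays the role of EL + RI, and Vt = 10 mV, Vr = 0 mV as in the script):

from numpy import linspace, log, zeros
from pylab import plot, xlabel, ylabel, show

N = 1000
tau = 0.010                  # 10 ms, in seconds
v0 = linspace(0., 20., N)    # drive in mV
Vt, Vr = 10., 0.             # threshold and reset in mV

F = zeros(N)
supra = v0 > Vt              # below threshold the rate is zero
F[supra] = 1. / (tau * log((v0[supra] - Vr) / (v0[supra] - Vt)))

plot(v0, F)
xlabel('v0 (mV)')
ylabel('theoretical rate (Hz)')
show()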

Refractory period

After a spike, the neuron stays at the reset potential for a refractory period Δ before integrating again (integration restarts from the reset). The firing rate becomes

F = 1 / ( Δ + tau * log( (EL + RI - Vr) / (EL + RI - Vt) ) )

which saturates at the maximum rate 1/Δ for large currents.

from brian import *

N = 1000
tau = 10 * ms
eqs = '''
dv/dt=(v0-v)/tau : volt
v0 : volt
'''
group = NeuronGroup(N, model=eqs, threshold=10 * mV, reset=0 * mV, refractory=5 * ms)
group.v = 0 * mV
group.v0 = linspace(0 * mV, 20 * mV, N)

counter = SpikeCounter(group)

duration = 5 * second
run(duration)
plot(group.v0 / mV, counter.count / duration)
show()

Synaptic currents

A synapse injects a synaptic current Is(t) into the postsynaptic neuron:

tau dVm/dt = EL - Vm + R*Is(t)

Idealized synapse: the synapse opens for a very short duration and injects a total charge Q:

Is(t) = Q*δ(t)    (Dirac function, Q = total charge)

tau dVm/dt = EL - Vm + R*Q*δ(t)

Starting from rest, the response to this single spike is an exponential PSP:

Vm(t) = EL + (RQ/tau)*exp(-t/tau)    for t > 0

Spike-based notation: between spikes, tau dVm/dt = EL - Vm; at t = 0 (spike arrival), Vm → Vm + w, where w = RQ/tau is the « synaptic weight ».
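A sketch of this single exponential PSP in Brian (the spike source and the weight w = 2 mV are illustrative choices, not values from the deck):

from brian import *

tau = 10 * ms
EL = -70 * mV
w = 2 * mV                                        # synaptic weight = RQ/tau

spikes = SpikeGeneratorGroup(1, [(0, 10 * ms)])   # one presynaptic spike at t = 10 ms
neuron = NeuronGroup(1, model='dv/dt = (EL-v)/tau : volt')
neuron.v = EL
C = Connection(spikes, neuron, 'v', weight=w)

M = StateMonitor(neuron, 'v', record=True)
run(60 * ms)

# v jumps by w when the spike arrives, then decays back to EL with time constant tau
plot(M.times / ms, M[0] / mV)
show()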

Example: fully connected network

from brian import *

tau = 10 * ms
v0 = 11 * mV
N = 20
w = .1 * mV

group = NeuronGroup(N, model='dv/dt=(v0-v)/tau : volt', threshold=10 * mV, reset=0 * mV)

W = Connection(group, group, 'v', weight=w)

group.v = rand(N) * 10 * mV

S = SpikeMonitor(group)

run(300 * ms)

raster_plot(S)
show()

A more realistic synapse model

Electrodiffusion through the ionic channel: Is = gs*(Es - Vm)

• gs = channel conductance: gs(t) rises when a presynaptic spike opens the channels, then decays as they close
• Es = synaptic reversal potential

tau dVm/dt = EL - Vm + R*gs(t)*(Es - Vm)

« conductance-based integrate-and-fire model »

Example of kinetic model

Stochastic transitions between the open and closed states:

C ⇄ O, with opening rate α[L] (proportional to the neurotransmitter concentration [L]) and constant closing rate β.

Macroscopic equation (many channels), with x = proportion of open channels:

dx/dt = α[L](1 - x) - βx

Assuming the neurotransmitter is present only for a very short duration:

dgs/dt = -gs/taus    with taus = 1/β and gs(t) = x(t)*gmax

and an incoming spike increments the conductance: gs → gs + w.

Example of kinetic model

Post-synaptic effect:

C dVm/dt = gL(EL - Vm) + gs(Es - Vm)
dgs/dt = -gs/taus    with taus = 1/β

Incoming spike: gs → gs + w.

Example: random network

taum = 20 * ms
taue = 5 * ms
taui = 10 * ms
Ee = 0 * mV
Ei = -80 * mV
El = -60 * mV

eqs = '''
dv/dt = (El-v+ge*(Ee-v)+gi*(Ei-v))/taum : volt
dge/dt = -ge/taue : 1
dgi/dt = -gi/taui : 1
'''

P = NeuronGroup(4000, model=eqs, threshold=10 * mvolt,
                reset=-60 * mvolt, refractory=5 * msecond)
Pe = P.subgroup(3200)
Pi = P.subgroup(800)
we = 6. / 10.   # excitatory synaptic weight
wi = 67. / 10.  # inhibitory synaptic weight
Ce = Connection(Pe, P, 'ge', weight=we, sparseness=0.02)
Ci = Connection(Pi, P, 'gi', weight=wi, sparseness=0.02)

P.v = (randn(len(P)) * 5 - 5) * mvolt
P.ge = randn(len(P)) * 1.5 + 4
P.gi = randn(len(P)) * 12 + 20

run(1 * second)

(network diagram: excitatory population Pe and inhibitory population Pi, subgroups of P, with connections Ce and Ci; conductances are expressed in units of the leak conductance)

Linearization

C dVm/dt = gL(EL - Vm) + Σi gi(Ei - Vm)

The synaptic conductances multiply Vm, so the equation is non-linear in the inputs.

Linear approximation: replace the driving force Ei - Vm by a constant, e.g.

Ei - Vm ≈ Ei - EL    or    Ei - Vm ≈ Ei - (EL + VT)/2

With EL ≈ -70 mV and VT ≈ -50 mV (so Vm stays between the two), the approximation is:
• good for AMPA/NMDA (Ei = 0 mV)
• ok for GABA-B (Ei = -100 mV)
• bad for GABA-A (Ei = -70 mV, close to Vm itself)
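A quick numeric check of that ranking (a sketch; the reversal potentials are those listed above and Vm is assumed to stay between EL and VT):

from numpy import linspace

EL, VT = -70.0, -50.0          # mV
Vm = linspace(EL, VT, 100)     # subthreshold range of the membrane potential
Vref = 0.5 * (EL + VT)         # linearization point: (EL + VT)/2 = -60 mV

for name, Ei in [('AMPA/NMDA', 0.0), ('GABA-B', -100.0), ('GABA-A', -70.0)]:
    exact = Ei - Vm            # true driving force
    approx = Ei - Vref         # linearized driving force
    rel_err = abs(exact - approx).max() / abs(approx)
    print name, 'max relative error:', round(rel_err, 2)

# roughly 0.17 for AMPA/NMDA, 0.25 for GABA-B and 1.0 for GABA-A:
# the approximation degrades as Ei approaches the membrane potential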

Example: random network

(network diagram: Pe, Pi, Ce, Ci as before)

from brian import *

eqs = '''
dv/dt = (ge+gi-(v+49*mV))/(20*ms) : volt
dge/dt = -ge/(5*ms) : volt
dgi/dt = -gi/(10*ms) : volt
'''

P = NeuronGroup(4000, model=eqs, threshold=-50*mV, reset=-60*mV)
P.v = -60*mV + 10*mV*rand(len(P))
Pe = P.subgroup(3200)
Pi = P.subgroup(800)

Ce = Connection(Pe, P, 'ge', weight=1.62*mV, sparseness=0.02)
Ci = Connection(Pi, P, 'gi', weight=-9*mV, sparseness=0.02)

M = SpikeMonitor(P)

run(1*second)

raster_plot(M)
show()

(here ge and gi enter the membrane equation directly as currents: this is the linearized, current-based version of the network)

The postsynaptic potential

Postsynaptic potential (PSP) = response of Vm(t) to a single presynaptic spike.

In the linearized model (Vm measured relative to the resting potential EL):

C dVm/dt = -gL*Vm + Σi gi*(Ei - EL)

Each synapse i is described by its conductance gi and auxiliary synaptic variables xi1, ..., xin:

dgi/dt = fi(gi, xi1, ..., xin)
dxij/dt = fij(gi, xi1, ..., xin)

A spike at time t = 0 increments the last synaptic variable: xin → xin + wi.

The resulting deflection Vm(t) = PSPi(t).

Temporal and spatial integration

What is the response to a set of spikes {tij} (i = synapse, j = spike number)?

If the differential system is linear, the superposition principle applies:

Vm(t) = Σi,j PSPi(t - tij)

(example: the « spike response model »)

From integral to differential representation

Experimental recordings give an integral representation of the PSP, typically a biexponential fit (parametric estimation):

Vm(t) = a*(exp(-t/τ1) - exp(-t/τ2))

How do we get a differential model from it? (tool: the Laplace transform)

τ1 dVm/dt = x - Vm
τ2 dx/dt = -x

with x incremented at the spike time.
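A sketch checking that the two representations agree (plain NumPy, Euler integration; the initial value x(0) = 1 and the time constants are illustrative, not taken from the deck):

from numpy import arange, exp, zeros

tau1, tau2 = 10e-3, 2e-3          # illustrative time constants, in seconds
dt = 1e-5
t = arange(0.0, 0.1, dt)

# differential representation: tau1 dV/dt = x - V, tau2 dx/dt = -x, x(0) = 1, V(0) = 0
V = zeros(len(t))
x = 1.0
for i in range(1, len(t)):
    V[i] = V[i-1] + dt * (x - V[i-1]) / tau1
    x += dt * (-x) / tau2

# integral (biexponential) representation with the matching amplitude
a = tau2 / (tau1 - tau2)
V_biexp = a * (exp(-t/tau1) - exp(-t/tau2))

print 'max difference:', abs(V - V_biexp).max()   # small (Euler discretization error)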

Voltage-gated channels: biophysics of spike initiation

At rest, Na+ channels are closed. A depolarization (Vm ↑) opens them: Na+ enters, depolarizing the membrane further. The channels then inactivate: no more current flows and the membrane repolarizes (Vm ↓).

The sodium channels

The heterogeneous distribution of charges in the channel protein means that its conformation can change with the membrane potential. There are two stable conformations, open and closed; sodium enters when the « gate » is open.

State transitions

closed → open: the transition requires an energy proportional to V, so the transition rate is proportional to exp(aV/T) (the transition probability in [t, t+dt] is proportional to exp(aV/T)·dt).

Similarly for open → closed: the rate is proportional to exp(bV/T), with a·b < 0 (the two rates vary in opposite directions with voltage).

State transitions

C ⇄ O, with voltage-dependent opening rate α(V) and closing rate β(V).

Macroscopic equation (many channels), with m = proportion of open channels:

dm/dt = α(V)(1 - m) - β(V)m

Kinetic equation

dm/dt = α(V)(1 - m) - β(V)m

can be rewritten as

taum(V) dm/dt = m∞(V) - m

with time constant taum(V) = 1/(α(V) + β(V)) and equilibrium value m∞(V) = α(V)/(α(V) + β(V)).

The equilibrium value is typically sigmoidal:

m∞(V) = 1 / (1 + exp((V1/2 - V)/k))
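A sketch of the sigmoidal activation curve (V1/2 = -40 mV and k = 5 mV are illustrative values, not taken from the deck):

from numpy import linspace, exp
from pylab import plot, xlabel, ylabel, show

V = linspace(-80.0, 0.0, 200)    # membrane potential (mV)
V_half, k = -40.0, 5.0           # half-activation voltage and slope factor (mV)
m_inf = 1.0 / (1.0 + exp((V_half - V) / k))

plot(V, m_inf)
xlabel('V (mV)')
ylabel('m_inf(V)')
show()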

The sodium current

I = g*m*(ENa - V)

• g = maximal conductance (all channels open)
• ENa = reversal potential (≈ +50 mV)

Coupled to the membrane equation:

C dV/dt = gl(El - V) + g*m*(ENa - V)
taum(V) dm/dt = m∞(V) - m

The Hodgkin-Huxley model

Model of the squid giant axon (Nobel Prize 1963).

C dV/dt = gl(El - V) + gNa*m^3*h*(ENa - V) + gK*n^4*(EK - V)
taum(V) dm/dt = m∞(V) - m
tauh(V) dh/dt = h∞(V) - h
taun(V) dn/dt = n∞(V) - n

The sodium channel has 3 independent activation « gates » (hence m^3) plus an inactivation gate h; the potassium channel has 4 gates (hence n^4).

The Hodgkin-Huxley model

Other voltage-dependent channels

Other channels also open depending on the potential; each follows the same template:

I = g*m*(E - V)
taum(V) dm/dt = m∞(V) - m

with g = maximal conductance, m = proportion of open channels, taum(V) = time constant, m∞(V) = equilibrium value.

• Na+ (sodium)
• K+ (potassium) – many different types
• Ca2+ (calcium)
• many other types of channels

Example: random network

eqs = '''
dv/dt = (gl*(El-v)+ge*(Ee-v)+gi*(Ei-v)-\
         g_na*(m*m*m)*h*(v-ENa)-\
         g_kd*(n*n*n*n)*(v-EK))/Cm : volt
dm/dt = alpham*(1-m)-betam*m : 1
dn/dt = alphan*(1-n)-betan*n : 1
dh/dt = alphah*(1-h)-betah*h : 1
dge/dt = -ge/taue : siemens
dgi/dt = -gi/taui : siemens
alpham = 0.32*(mV**-1)*(13*mV-v+VT)/\
         (exp((13*mV-v+VT)/(4*mV))-1.)/ms : Hz
betam = 0.28*(mV**-1)*(v-VT-40*mV)/\
        (exp((v-VT-40*mV)/(5*mV))-1)/ms : Hz
alphah = 0.128*exp((17*mV-v+VT)/(18*mV))/ms : Hz
betah = 4./(1+exp((40*mV-v+VT)/(5*mV)))/ms : Hz
alphan = 0.032*(mV**-1)*(15*mV-v+VT)/\
         (exp((15*mV-v+VT)/(5*mV))-1.)/ms : Hz
betan = .5*exp((10*mV-v+VT)/(40*mV))/ms : Hz
'''

P = NeuronGroup(4000, model=eqs,
                threshold=EmpiricalThreshold(threshold=-20 * mV, refractory=3 * ms),
                implicit=True)
trace = StateMonitor(P, 'v', record=[1, 10, 100])

Adaptation

from brian import *

PG = PoissonGroup(1, 500 * Hz)
eqs = '''
dv/dt = (-w-v)/(10*ms) : volt
dw/dt = -w/(30*ms) : volt  # the adaptation current
'''
# The adaptation variable increases with each spike
IF = NeuronGroup(1, model=eqs, threshold=20 * mV,
                 reset='''v = 0*mV
                          w += 3*mV''')

C = Connection(PG, IF, 'v', weight=3 * mV)

MS = SpikeMonitor(PG, True)
Mv = StateMonitor(IF, 'v', record=True)
Mw = StateMonitor(IF, 'w', record=True)

run(100 * ms)

plot(Mv.times / ms, Mv[0] / mV)
plot(Mw.times / ms, Mw[0] / mV)

show()

(v = membrane potential, w = adaptation current, e.g. a linearized K+ current)

Threshold adaptation

from brian import *

eqs = '''
dv/dt = -v/(10*ms) : volt
dvt/dt = (10*mV-vt)/(15*ms) : volt
'''

reset = '''
v = 0*mV
vt += 3*mV
'''

IF = NeuronGroup(1, model=eqs, reset=reset, threshold='v>vt')
IF.rest()
PG = PoissonGroup(1, 500 * Hz)

C = Connection(PG, IF, 'v', weight=3 * mV)

Mv = StateMonitor(IF, 'v', record=True)
Mvt = StateMonitor(IF, 'vt', record=True)

run(100 * ms)

plot(Mv.times / ms, Mv[0] / mV)
plot(Mvt.times / ms, Mvt[0] / mV)

show()

cortical neuron in vivo (V1)

By the way: the IF model is not a bad model of cortical neurons

Injected current (slice)

Fast spiking cortical cell

IF model with adaptive threshold

(from INCF competition)

The precision of spike timing

Mainen & Sejnowski (1995)

The same constant current is injected 25 times.

The timing of the first spike is reproducible.

The timing of the 10th spike is not.

In cortical neurons in vitro

With IF neurons

from brian import *

N = 25
tau = 20 * ms
sigma = .015
eqs_neurons = '''
dx/dt=(1.1-x)/tau+sigma*(2./tau)**.5*xi : 1
'''
neurons = NeuronGroup(N, model=eqs_neurons, threshold=1, reset=0, refractory=5 * ms)
spikes = SpikeMonitor(neurons)

run(500 * ms)
raster_plot(spikes)
show()

(xi = Gaussian white noise)

With fluctuating current

Mainen & Sejnowski (1995)

The same temporally variable current is injected 25 times.

Spike timing is reproducible even after 1 s.

(cortical neuron in vitro, somatic injection)

IF neurons and fluctuating current

tau_input = 5 * ms
input = NeuronGroup(1, model='dx/dt=-x/tau_input+(2./tau_input)**.5*xi : 1')

tau = 10 * ms
sigma = .015
eqs_neurons = '''
dx/dt=(0.9+.5*I-x)/tau+sigma*(2./tau)**.5*xi : 1
I : 1
'''
neurons = NeuronGroup(25, model=eqs_neurons, threshold=1, reset=0, refractory=5 * ms)
neurons.I = linked_var(input, 'x')
spikes = SpikeMonitor(neurons)

run(500 * ms)
raster_plot(spikes)
show()

Part II – Networks

A few examples

Localization of prey by scorpions

Inhibition of opposite neuron

→ more spikes on the source side

(polar representation of firing rates)

Conversion: temporal code → rate code

Code

Sturzl, W., R. Kempter, and J. L. van Hemmen (2000). Theory of arachnid prey localization. Physical Review Letters 84 (24), 5668

Sound localization by coincidence detection: the Jeffress model

Each coincidence detector receives the sound through an axonal delay δ; its inputs are synchronous, and the neuron fires, when δ compensates the interaural delay (δ + dleft = dright).

Code

defaultclock.dt = .02 * ms
sound = TimedArray(10 * randn(50000))  # white noise

max_delay = 20 * cm / (300 * metre / second)
angular_speed = 2 * pi * radian / second  # 1 turn/second
tau_ear = 1 * ms
sigma_ear = .1
eqs_ears = '''
dx/dt=(sound(t-delay)-x)/tau_ear+sigma_ear*(2./tau_ear)**.5*xi : 1
delay=distance*sin(theta) : second
distance : second  # distance to the centre of the head in time units
dtheta/dt=angular_speed : radian
'''
ears = NeuronGroup(2, model=eqs_ears, threshold=1, reset=0, refractory=2.5 * ms)
ears.distance = [-.5 * max_delay, .5 * max_delay]
traces = StateMonitor(ears, 'x', record=True)

N = 300
tau = 1 * ms
sigma = .1
eqs_neurons = '''
dv/dt=-v/tau+sigma*(2./tau)**.5*xi : 1
'''
neurons = NeuronGroup(N, model=eqs_neurons, threshold=1, reset=0)
synapses = Connection(ears, neurons, 'v', structure='dense',
                      delay=True, max_delay=1.1 * max_delay)
synapses.connect_full(ears, neurons, weight=.5)
synapses.delay[0, :] = linspace(0 * ms, 1.1 * max_delay, N)
synapses.delay[1, :] = linspace(0 * ms, 1.1 * max_delay, N)[::-1]
spikes = SpikeMonitor(neurons)

(diagram: sound → receptors at the two ears → delay lines → coincidence detectors)

Simulation of the Jeffress model

(coincidence detectors ordered by delay, with the sound source turning around the head)

« Synfire chains »: propagation of synchronous activity

(Diesmann et al, 1999)

Layers of excitatory neurons:

each neuron = integrate-and-fire + noise

Neurons in layer 1 are simultaneously activated: propagation

If fewer neurons are activated, no propagation

(« Synfire chains »: term introduced by Abeles)

Synfire chains

The pulse of activity reaching a layer is summarized by two numbers: a = number of spikes, σ = standard deviation of their times.

Trajectories in the (σ, a) space either converge to an attractor (synchronous propagation) or dissipate (the wave dies out).

Code

Vr = -70 * mV
Vt = -55 * mV
taum = 10 * ms
taupsp = 0.325 * ms
weight = 4.86 * mV
eqs = '''
dV/dt=(-(V-Vr)+x)*(1./taum) : volt
dx/dt=(-x+y)*(1./taupsp) : volt
dy/dt=-y*(1./taupsp)+25.27*mV/ms+\
      (39.24*mV/ms**0.5)*xi : volt
'''
# Neuron groups
P = NeuronGroup(N=1000, model=eqs, threshold=Vt, reset=Vr, refractory=1 * ms)
Pinput = PulsePacket(t=50 * ms, n=85, sigma=1 * ms)
# The network structure
Pgp = [P.subgroup(100) for i in range(10)]
C = Connection(P, P, 'y')
for i in range(9):
    C.connect_full(Pgp[i], Pgp[i + 1], weight)
Cinput = Connection(Pinput, Pgp[0], 'y')
Cinput.connect_full(weight=weight)
# Record the spikes
Mgp = [SpikeMonitor(p) for p in Pgp]
Minput = SpikeMonitor(Pinput)
monitors = [Minput] + Mgp
# Setup the network, and run it
P.V = Vr + rand(len(P)) * (Vt - Vr)
run(100 * ms)

Spontaneous activity in a ring

tau = 10 * ms
N = 100
v0 = 5 * mV
sigma = 4 * mV

group = NeuronGroup(N, model='dv/dt=(v0-v)/tau + sigma*xi/tau**.5 : volt',
                    threshold=10 * mV, reset=0 * mV)

C = Connection(group, group, 'v',
               weight=lambda i, j: .4*mV*cos(2*pi*(i-j)*1./N))

S = SpikeMonitor(group)
R = PopulationRateMonitor(group)
group.v = rand(N) * 10 * mV

run(5000 * ms)
subplot(211)
raster_plot(S)
subplot(223)
imshow(C.W.todense(), interpolation='nearest')
title('Synaptic connections')
subplot(224)
plot(R.times / ms, R.smooth_rate(2 * ms, filter='flat'))
title('Firing rate')
show()

Part III - Plasticity

Hebb’s rule

D. Hebb

When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased. (1949)

A B

Neuron A and neuron B are active: wAB increases

Physiologically: « synaptic plasticity »

PSP size is increased

(or: transmission probability is increased)

Synaptic plasticity at spike level

(experimental STDP data: weight change as a function of the delay between presynaptic and postsynaptic spikes; Dan & Poo, 2006)

pre → post: potentiation; post → pre: depression

• causal rule
• favors synchronous inputs

(STDP = Spike-Timing-Dependent Plasticity)

Phenomenological model

Each synapse keeps two exponentially decaying traces:

tau_pre dA_pre/dt = -A_pre
tau_post dA_post/dt = -A_post

Presynaptic spike: A_pre → A_pre + dA_pre and w → w + A_post
Postsynaptic spike: A_post → A_post + dA_post and w → w + A_pre

For an isolated pair of spikes, the weight change as a function of s = t_post - t_pre is

f(s) = dA_pre * exp(-s/tau_pre)    if s > 0 (potentiation)
f(s) = dA_post * exp(s/tau_post)   if s < 0 (depression)

and the total change over spike trains is Δw = Σi,j f(t_post_i - t_pre_j).
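A minimal pure-Python sketch of this trace-based update for a single pre/post spike pair (the parameter values are illustrative; the names tau_pre, dA_pre, etc. mirror the Brian code below):

from math import exp

tau_pre, tau_post = 20e-3, 20e-3      # trace time constants (s)
dA_pre, dA_post = 0.01, -0.0105       # increments (depression slightly stronger)

def pair_weight_change(s):
    # weight change produced by one pre spike at t=0 and one post spike at t=s
    w = 0.0
    A_pre = A_post = 0.0
    t_last = None
    for t, kind in sorted([(0.0, 'pre'), (s, 'post')]):
        if t_last is not None:
            # decay both traces since the previous event
            A_pre *= exp(-(t - t_last) / tau_pre)
            A_post *= exp(-(t - t_last) / tau_post)
        t_last = t
        if kind == 'pre':
            A_pre += dA_pre
            w += A_post           # depression if the post spike came first
        else:
            A_post += dA_post
            w += A_pre            # potentiation if the pre spike came first
    return w

for s in [-20e-3, -5e-3, 5e-3, 20e-3]:
    print s, pair_weight_change(s)
# reproduces f(s) = dA_pre*exp(-s/tau_pre) for s > 0 and dA_post*exp(s/tau_post) for s < 0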

Synaptic plasticity with Brian

synapses = Connection(input, neurons, 'ge')
eqs_stdp = '''
dA_pre/dt = -A_pre/tau_pre : 1
dA_post/dt = -A_post/tau_post : 1
'''
stdp = STDP(synapses, eqs=eqs_stdp,
            pre='A_pre+=dA_pre; w+=A_post',
            post='A_post+=dA_post; w+=A_pre',
            wmax=gmax)  # wmax = maximum weight

The same rule is available as a built-in shortcut:

synapses = Connection(input, neurons, 'ge')
stdp = ExponentialSTDP(synapses, tau_pre, tau_post, dA_pre, dA_post, wmax=gmax)

Here the increments are given relative to wmax: ΔA_pre = dA_pre * wmax.

Complete code

Song, S., K. D. Miller, and L. F. Abbott (2000). Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nature Neurosci 3, 919-26.

A few properties of STDP

Stability if dA_pre*tau_pre + dA_post*tau_post < 0 (depression > potentiation).

With Poisson spike trains as inputs, the stationary weight distribution is bimodal: STDP acts as a competitive mechanism.

N.B.: the distribution is not bimodal if the weight modification is multiplicative.

Properties (2): stationary synaptic weights. Correlated inputs end up with larger weights than uncorrelated ones: correlated inputs are favored.

Properties (3): after convergence, firing is irregular (balanced regime).

Short-term synaptic plasticity

Synaptic efficacy depends on recent activity:
• depression (typically: exc → exc)
• facilitation (typically: exc → inh)

Phenomenological model

tau_d dx/dt = 1 - x
tau_f du/dt = U - u

Presynaptic spike:
x → x(1 - u)        (x decreases by u·x: resource consumption)
u → u + U(1 - u)    (u increases: sensitization)

Synaptic efficacy ∝ u·x, with x, u ∈ [0, 1] and U ∈ [0, 1]:
• depression when the resources x are depleted
• facilitation when the utilization u builds up

With Brian:

synapses = Connection(input, neurons, 'ge')
stp = STP(synapses, taud=50*ms, tauf=1*ms, U=0.6)

x = synaptic « resources », u = proportion of resources consumed by a spike
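A small pure-Python sketch of these update rules for a regular presynaptic spike train, showing depression and facilitation side by side (the spike interval and time constants are illustrative):

from math import exp

def efficacies(tau_d, tau_f, U, isi=0.02, nspikes=10):
    # synaptic efficacy u*x at each spike of a regular train with interval isi
    x, u = 1.0, U
    out = []
    for _ in range(nspikes):
        out.append(u * x)
        x = x * (1 - u)                      # spike: resource consumption
        u = u + U * (1 - u)                  # spike: sensitization
        x = 1 - (1 - x) * exp(-isi / tau_d)  # recovery of x towards 1
        u = U + (u - U) * exp(-isi / tau_f)  # relaxation of u towards U
    return out

print 'depressing:  ', [round(e, 3) for e in efficacies(tau_d=0.5, tau_f=0.02, U=0.6)]
print 'facilitating:', [round(e, 3) for e in efficacies(tau_d=0.02, tau_f=0.5, U=0.1)]
# the efficacy decreases along the train for the depressing synapse
# and increases for the facilitating one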

Example: facilitation

tau_e = 3 * ms
taum = 10 * ms
A_SE = 250 * pA
Rm = 100 * Mohm
N = 10

eqs = '''
dx/dt=rate : 1
rate : Hz
'''

input = NeuronGroup(N, model=eqs, threshold=1., reset=0)
input.rate = linspace(5 * Hz, 30 * Hz, N)

eqs_neuron = '''
dv/dt=(Rm*i-v)/taum : volt
di/dt=-i/tau_e : amp
'''
neuron = NeuronGroup(N, model=eqs_neuron)

C = Connection(input, neuron, 'i')
C.connect_one_to_one(weight=A_SE)
stp = STP(C, taud=1 * ms, tauf=100 * ms, U=.1)
trace = StateMonitor(neuron, 'v', record=[0, N - 1])

run(1000 * ms)
subplot(211)
plot(trace.times / ms, trace[0] / mV)
subplot(212)
plot(trace.times / ms, trace[N - 1] / mV)
show()

regular spike trains

Python in 15 minutes

see also: http://docs.python.org/tutorial/

The Python console: on Windows, open IDLE; on Linux, type python.

• interpreted language
• dynamic typing
• garbage collector
• whitespace matters (indentation signals structure)
• object-oriented
• many libraries

Writing a script If you use IDLE, click on File>New window.

Otherwise use any text editor.

Press F5 to execute

A simple program

# Factorial function
def factorial(x):
    if x == 0:
        return 1
    else:
        return x * factorial(x-1)

print factorial(5)

Annotations on the program above:
• comment
• function definition, untyped argument
• condition
• function call, display
• structure by indentation (a block = aligned instructions)

Numerical objects Base types: int, long, float, complex

Other numerical types (vectors, matrices) defined in the Numpy library (in a few minutes)

x = 3+2
x += 1
y = 100L
z = x*(1+2j)
u = 2.3/7
x, y, z = 1, 2, 3
a = b = 123

Control structures

x = 12
if x < 5 or (x > 10 and x < 20):
    print "Good value."

if x < 5 or 10 < x < 20:
    print 'Good value as well.'

for i in [0,1,2,3,4]:
    print "Number", i

for i in range(5):
    print "Number", i

while x >= 0:
    print "x is not always negative."
    x = x-1

list

the same list

Lists

mylist = [1,7,8,3]
name = ["Jean","Michel"]
x = [1,2,3,[7,8],"fin"]

print name[0]
name[1] = "Robert"
print mylist[1:3]
print mylist[:3], mylist[:]
print mylist[-1]
print x[3][1]

name.append("Georges")
print mylist+name
print mylist*2

concatenate

heterogeneous list

first element = index 0

« slice »: index 3 not included

last element

x[3] is a list

method (list = object)

Other methods: extend, insert, reverse, sort, remove…

List comprehensions

carres = [i**2 for i in range(10)]

pairs = [i for i in range(10) if i % 2 == 0]

= list of squares of integers from 0 to 9

= list of even integers between 0 and 9

Strings

a = "Bonjour"
b = 'hello'
c = """Une phrase
qui n'en finit pas"""

print a+b
print a[3:7]

print b.capitalize()

multiline string

≈ list of characters

many methods (find, replace, split…)

Dictionaries

dico = {'one': 1, 'two': 2, 'three': 'several'}

print dico['three']

dico['four'] = 'many'
del dico['one']

key value

for key in dico:
    print key, '->', dico[key]

iterate all keys

Functions

def power(x, exponent=2):
    return x**exponent

print power(3,2)

print power(7)

print power(exponent=3,x=2)

carre = lambda x: x**2    (inline definition)

default value

call with named arguments

Modules

import math
print math.exp(1)

from math import exp
print exp(1)

from math import *
print exp(1)

import only the exp object

import everything

You can work with several files (= modules), each one can define any number of objects (= variables, functions, classes)

loads the file ‘math.py’ or ‘math.pyc’ (compiled)

Scipy & Pylab

Scipy & Numpy Scientific libraries

Syntax ≈ Matlab Many mathematical functions

from scipy import *
x = array([1,2,3])
M = array([[1,2,3],[4,5,6]])
M = ones((3,3))
z = 2*x
y = dot(M,x)

from scipy.optimize import *
print fsolve(lambda x: (x-1)*(x-3), 2)

vector

matrix

matrix product

Vectors and matrices. Base type in SciPy: array

(= vector or matrix)

from scipy import *
x = array([1,2,3])
M = array([[1,2,3],[4,5,6]])

M=ones((3,2))

z=2*x+1

y=dot(M,x)

vector (1,2,3)

matrix
1 2 3
4 5 6

matrix of ones
1 1
1 1
1 1

matrix product

Operations

x+y   x-y   x*y   x/y   x**2   exp(x)   sqrt(x)

dot(x,y)   dot(M,x)

M.T   M.max()   M.sum()

size(x)   M.shape

element-wise

dot productmatrix product

transpose

total number of elements

Indexing

x[i]        (i+1)th element
M[i,j]      element at row i, column j
x[i:j]      slice from x[i] to x[j-1]
M[i,:]      (i+1)th row
M[:,i]      (i+1)th column
x[[1,3]]    elements x[1] and x[3]

x[1:3] = [0,1]    assigns x[1]=0 and x[2]=1
M[1,:] += x       adds vector x to the 2nd row of M

Vector indexing works like list indexing (first element = index 0).

M[i,:] is a « view » on matrix M: not a copy, but a reference.

y = M[0,:]
y[2] = 5        # now M[0,2] is 5

x = z
x[1] = 3        # now z[1] is 3

To get an independent copy: x = z.copy()

Construction

x = array([1,2,3])
M = array([[1,2,3],[4,5,6]])

x = ones(5)
M = zeros((3,2))
M = eye(3)
M = diag([1,3,7])

x = rand(5)
x = randn(5)

x=arange(10)

x=linspace(0,1,100)

from lists

vector of ones
zero matrix
identity matrix
diagonal matrix

random vector in (0,1)
random Gaussian vector

0,1,2,...,9

100 numbers between 0 and 1

SciPy example: optimisation

Simple example, least squares polynomial fit

Vectorisation How to write efficient programs? Replace loops by vector operations

for i in range(1000000):
    X[i] = 1
→   X = ones(1000000)

for i in range(1000000):
    X[i] = X[i]*2
→   X = X*2

for i in range(999999):
    Y[i] = X[i+1]-X[i]
→   Y = X[1:]-X[:-1]

for i in range(1000000):
    if X[i]>0.5:
        Y[i] = 1
→   Y[X>.5] = 1
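A quick way to see the payoff is to time both versions (a sketch using the standard time module; the exact speedup depends on the machine):

from time import time
from numpy import ones, zeros

N = 1000000
X = zeros(N)

t0 = time()
for i in range(N):
    X[i] = 1
print 'loop:      ', time() - t0, 's'

t0 = time()
X = ones(N)
print 'vectorized:', time() - t0, 's'
# the vectorized version is typically orders of magnitude faster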

Pylab

Pylab: plotting library, syntax ≈ Matlab

Many plotting functions

from pylab import *
plot([1,2,3],[4,5,6])
show()

x y

last instruction of script

plot

polar

more examples:http://matplotlib.sourceforge.net/gallery.html

Plotting with PyLab

hist(randn(1000))

contour(gaussian_filter(randn(100, 100), 5))

imshow(gaussian_filter(randn(100, 100), 5))

specgram(sin(10000*2*pi*linspace(0,1,44100)), Fs=44100)

Brian

At last!

Online help

Books

Theoretical neuroscience, by Dayan & Abbott, MIT Press

Spiking neuron models, by Gerstner & Kistler, Cambridge University Press

Dynamical systems in neuroscience, by Izhikevich, MIT Press

Biophysics of computation, by Koch, Oxford University Press

Introduction to Theoretical Neurobiology, by Tuckwell, Cambridge University Press

romain.brette@ens.fr