Neural noise and neural signals - spontaneous activity and signal transmission
in models of single nerve cells
Benjamin Lindner
Theory of Complex Systems and Neurophysics
Institut für Physik, Humboldt-Universität zu Berlin
Information theory
[Figure: example spike train over time]
Information theory of neural spiking
• How do we quantify information?
• What is the max info a spike train can carry?
• How much info does the spike train carry about the sensory signal?
[Figure: spike train over time]
Information theory of neural spiking
• How do we quantify information?
Shannon entropy (in bits): the mean information (reduction of uncertainty) we obtain by measuring the state of a discrete system with probabilities $p_i$:

$H = -\sum_i p_i \log_2 p_i$
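A minimal sketch of this definition in Python (the function name and the example distributions are my own illustration, not from the slides):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # terms with p_i = 0 contribute nothing
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))    # biased coin: ~0.47 bits
```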
Last time: Information theory of neural spiking
• What is the max info a spike train can carry?
Specifically: a stationary spike train with rate $r$, discretized into $N$ bins of width $\Delta t$:
0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
Information rate of a Poisson process
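The Poisson-rate formula itself is not legible in the transcript; the standard small-$r\Delta t$ result (see Rieke et al., Spikes) reads:

```latex
% Maximal entropy rate of a spike train with fixed rate r, bin width \Delta t,
% attained by the Poisson process (valid for r\Delta t \ll 1):
\frac{S}{T} \approx r \log_2\!\left(\frac{e}{r\,\Delta t}\right) \quad \text{bits/s}
```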
Last time: Information theory of neural spiking
• General insight: the maximum-entropy distribution depends on the constraints!
Fixed finite range: uniform
Fixed mean, semi-infinite range: exponential
Fixed mean and variance, infinite range: Gauss
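For reference, the corresponding maximum-entropy densities (with range $[a,b]$, mean $\mu$, and variance $\sigma^2$ in my notation) are:

```latex
p(x) = \frac{1}{b-a}, \; x \in [a,b]                                % uniform
p(x) = \frac{1}{\mu}\, e^{-x/\mu}, \; x \ge 0                       % exponential
p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}   % Gaussian
```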
• Last time: How much info does the spike train carry about the sensory signal?
[Diagram: input signal → noisy neuron → output spike train]
1. Compute the full entropy of the output.
2. For a frozen stimulus, calculate the noise entropy unrelated to the stimulus.
3. Take the difference → mutual information (a numerical sketch follows below).
[Figure: spike rasters over time, time window = 0.2 s]
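A minimal sketch of this three-step procedure, under simplifying assumptions (binary words from binned spike trains; `trials` is a hypothetical array of responses to repeated presentations of the same frozen stimulus):

```python
import numpy as np
from collections import Counter

def word_entropy(words):
    """Entropy (bits) of the empirical distribution of a list of words."""
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def direct_method(trials, word_len):
    """trials: binary array (repetitions x time bins) for a frozen stimulus.
    Returns total entropy S, noise entropy N, and their difference (bits/word)."""
    n_rep, n_bins = trials.shape
    n_words = n_bins // word_len
    # 1. Cut each repetition into words of word_len bins.
    words = trials[:, :n_words * word_len].reshape(n_rep, n_words, word_len)
    # 2. Total entropy: distribution of words pooled over time and repetitions.
    S = word_entropy([tuple(w) for rep in words for w in rep])
    # 3. Noise entropy: entropy across repetitions at each fixed time, averaged.
    N = np.mean([word_entropy([tuple(w) for w in words[:, t]])
                 for t in range(n_words)])
    return S, N, S - N
```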
The direct method of determining mutual information for a spiking neuron
Entropy of bit sequences (Claude Shannon, 1948)

$S = -\sum_{\text{sequences}} p_i \log_2 p_i$

[Figure: raster of binary sequences over time]
Entropy with frozen signal
Variability without reference to the signal:

$N = -\left\langle \sum_{\text{sequences}} \tilde{p}_i \log_2 \tilde{p}_i \right\rangle_{\text{stimulus}}$

[Figure: raster of binary sequences over time for repeated presentations of a frozen stimulus]
Mutual information and mutual information rate
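The formulas on this slide are not legible in the transcript; from the preceding definitions they presumably read:

```latex
M = S - N, \qquad
R_{\text{info}} = \lim_{T\to\infty} \frac{M(T)}{T} \quad \text{bits/s}
```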
The direct method for determining the info rate
H1 neuron in the visual system of the blowfly (Strong et al. Phys. Rev. Lett. 1998)
Finite-size corrections to the entropy
Finite-window corrections to the entropy rate
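A minimal sketch of the finite-window extrapolation in the spirit of Strong et al. (all numbers are made-up placeholders): entropy-rate estimates $S(T)/T$ are fitted against $1/T$ and extrapolated to $1/T \to 0$, i.e. to infinitely long words.

```python
import numpy as np

dt = 0.002                                       # bin width in seconds (assumed)
word_lens = np.array([4, 6, 8, 10, 12])          # words of L bins each
entropies = np.array([3.1, 4.4, 5.6, 6.7, 7.8])  # measured S(T) in bits (placeholder)

T = word_lens * dt                               # word durations
rates = entropies / T                            # entropy-rate estimates S(T)/T
# Linear fit of S(T)/T against 1/T; the intercept at 1/T = 0 is the
# extrapolated entropy rate for infinitely long words.
slope, intercept = np.polyfit(1.0 / T, rates, 1)
print(f"extrapolated entropy rate: {intercept:.1f} bits/s")
```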
Information rates of real neurons
Borst & Theunissen Nat. Neurosci. (1999)
Last time: The Gaussian channel
[Diagram: signal + noise → output]
Assumptions: signal and noise are
• statistically independent
• Gaussian with zero mean, with variances $\sigma_s^2$ and $\sigma_n^2$
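The slide's result is not legible in the transcript; under these assumptions the standard mutual information between signal and signal-plus-noise (e.g. Dayan & Abbott) is:

```latex
I = \frac{1}{2}\,\log_2\!\left(1 + \frac{\sigma_s^2}{\sigma_n^2}\right) \quad \text{bits}
```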
The dynamic Gaussian channel
[Diagram: signal $s(t)$ + noise $n(t)$ → output]
Assumptions: signal and noise are
• statistically independent
• Gaussian with zero mean
• characterized by power spectra $S_{s,s}(f)$ and $S_{n,n}(f)$
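The corresponding information-rate formula is not legible in the transcript; the standard result (e.g. Rieke et al., Spikes) reads:

```latex
R_{\text{info}} = \int_0^\infty \! df \,
\log_2\!\left(1 + \frac{S_{s,s}(f)}{S_{n,n}(f)}\right) \quad \text{bits/s}
```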
The dynamic Gaussian channel (more general)
[Diagram: signal + noise → output]
Assumptions: signal and noise are
• statistically independent
• Gaussian with zero mean
• characterized by power spectra $S_{s,s}(f)$ and $S_{n,n}(f)$
A lower bound on the mutual information
[Diagram: stimulus → neuron → spike train → linear signal reconstruction → estimated stimulus]
Data processing inequality: $I(\text{stimulus}; \text{estimated stimulus}) \le I(\text{stimulus}; \text{spike train})$

Lower bound
Spectral coherence function:

$C(f) = \frac{|S_{x,s}(f)|^2}{S_{x,x}(f)\, S_{s,s}(f)}$

with cross-spectrum $S_{x,s}(f)$ (stimulus-spike train), stimulus power spectrum $S_{s,s}(f)$, and spike-train power spectrum $S_{x,x}(f)$.
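The lower-bound expression itself is not legible in the transcript; the standard form (e.g. Borst & Theunissen 1999) is, presumably:

```latex
R_{\text{LB}} = -\int_0^\infty \! df \, \log_2\!\left[\,1 - C(f)\,\right]
\;\le\; R_{\text{info}} \quad \text{bits/s}
```

A minimal numerical sketch of estimating this bound with SciPy; `stimulus` and `response` are placeholder signals standing in for a measured stimulus and a binned spike train:

```python
import numpy as np
from scipy.signal import coherence

fs = 1000.0                                    # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
stimulus = rng.standard_normal(100_000)        # placeholder stimulus
response = stimulus + rng.standard_normal(100_000)  # placeholder "spike train"

# Magnitude-squared coherence C(f) between stimulus and response.
f, C = coherence(stimulus, response, fs=fs, nperseg=1024)

# Lower bound on the information rate: -\int df log2(1 - C(f)).
C = np.clip(C, 0.0, 1.0 - 1e-12)               # guard against log2(0)
df = f[1] - f[0]
R_lb = -np.sum(np.log2(1.0 - C)) * df          # Riemann-sum approximation
print(f"lower bound: {R_lb:.1f} bits/s")
```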
Comparison of direct & lower-bound methods
Aldworth et al. PLoS Comp. Biol. (2011)
Using the coherence function to discuss frequency-dependent information transmission
Middleton et al. J. Neurophysiol. (2009); Chacron et al. Nature (2003)
Weakly electric fish: P-units and pyramidal cells
Using the coherence function to discuss frequency-dependent information transmission
Paddlefish
Neiman & Russell Chaos (2011)
Using the coherence function to discuss frequency-dependent information transmission
Monkey, vestibular system
Using the coherence function to discuss frequency-dependent information transmission
Bullfrog
Summary: information theory of spike trains
-information can be quantified by entropies and differences between them (mutual information)
-mutual information can be determined directly (which requires lots of data!) or estimated from below (lower-bound method)
-information rates are often not far from their theoretical limit; they are higher for natural stimuli
References
- Dayan & Abbott, Theoretical Neuroscience, MIT Press (2001)
- Gerstner & Kistler, Spiking Neuron Models, Cambridge University Press (2002)
- Rieke et al., Spikes: Exploring the Neural Code, MIT Press (1996)
- Pierce, An Introduction to Information Theory, Dover (1980)