Computational Modeling of Neural Systems

Transcript of eero/compNeuro05/simoncelliLecture1.pdf

Page 1:

Computational Modeling of Neural Systems

• Format (weekly lectures, background + case studies)

• Prerequisites (basic diff eq, lin alg, lin sys, prob/stats)

• Grades (notetaking, homework, project)

• Books / materials (Dayan/Abbott, slides/notes online)

• http://www.cns.nyu.edu/doctoral/courses/2005-2006/fall/compNeuro05/

Page 2:

Role of Models

• Descriptive (what?)
  - e.g., tuning curves, receptive fields, LNP

• Mechanistic (how?)
  - e.g., compartmental models, Hodgkin-Huxley

• Interpretive/Explanatory (why?)
  - e.g., efficient coding, optimal estimation/decision, wiring length, metabolic cost, etc.

- Dayan & Abbott

Page 3:

Interaction with Experiments

• Fit existing data

• Make predictions...
  - for other neurons, under other conditions
  - of mechanisms not yet understood (e.g., HH)
  - of behavior
  - in other animals/species
  - that can be tested with new experiments

• Develop new experiments...
  - to refine model
  - to differentiate models
  - with optimized stimuli

Page 4:

Descriptive Response Models (outline)

• Receptive fields and tuning curves

• Linear models

• Rate models

• Wiener/Volterra (polynomial) models

• LN models

• Poisson spiking

• Fitting/validating LNP models

Page 5:

- Dayan & Abbott

Page 6:

Rate coding

Page 7:

Receptive Fields

• Classical: A region of the retina (visual field) that must be stimulated directly in order to obtain a response in a neuron

- Sherrington (1906), Hartline (1938), Kuffler (1953)

Page 8:

- Kuffler & Nicholls

Page 9:

Receptive Fields

• Classical: A region of the retina (visual field) that must be stimulated directly in order to obtain a response in a neuron

- Sherrington (1906), Hartline (1938), Kuffler (1953)

• Modern generalization: Kernel that captures those attributes of the stimulus that generate/modulate responses. Often assumed linear.
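As a concrete reading of the "kernel" view, here is a minimal sketch in Python; the one-dimensional center-surround filter shape, the stimulus dimension, and all numbers are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Minimal sketch of a linear receptive-field model: the receptive field is a
# weight vector (kernel), and the modeled drive to the neuron is its inner
# product with the stimulus.

rng = np.random.default_rng(0)

# 1-D "space" with 20 sample points; an assumed center-surround style kernel
x = np.linspace(-2, 2, 20)
rf = np.exp(-x**2) - 0.5 * np.exp(-x**2 / 4)   # narrow center minus broader surround

stimulus = rng.standard_normal(20)              # one random stimulus frame

linear_drive = rf @ stimulus                    # "generator signal" for this frame
print(f"linear drive: {linear_drive:.3f}")
```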

Page 10:

- Dayan & Abbott, after Hubel & Wiesel ‘62

Page 11:

- Dayan & Abbott, after Georgopoulos et al. '82

Page 12:

Page 13:

- Dayan & Abbott, after Bair & Koch '96

Page 14:

Estimating firing rates

- Dayan & Abbott
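Dayan & Abbott's treatment of rate estimation uses binning and window (kernel) smoothing of the spike train; a minimal sketch follows, in which the spike times, bin size, and Gaussian window width are arbitrary choices for illustration.

```python
import numpy as np

# Estimate a time-varying firing rate from spike times by (1) binning the
# spike train and (2) smoothing with a Gaussian window normalized so the
# result is in spikes per second.

spike_times = np.array([0.012, 0.030, 0.034, 0.110, 0.240, 0.245, 0.250, 0.400])
dt = 0.001                                            # 1 ms bins
t = np.arange(0.0, 0.5, dt)
spike_train = np.histogram(spike_times, bins=np.append(t, 0.5))[0]

sigma = 0.020                                         # 20 ms smoothing window (assumed)
win_t = np.arange(-4 * sigma, 4 * sigma + dt, dt)
window = np.exp(-win_t**2 / (2 * sigma**2))
window /= window.sum() * dt                           # integrates to 1

rate = np.convolve(spike_train, window, mode="same")  # estimated rate (sp/s)
print(f"peak estimated rate: {rate.max():.1f} sp/s")
```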

Page 15:

Mapping stimulus -> rate

Page 16:

Polynomial Model (Volterra/Wiener Kernels)

$r(\vec{x}) = k_0 + \vec{k}_1 \cdot \vec{x} + \vec{x}^T K_2 \vec{x} + \ldots$

Number of parameters per term: constant (1), vector ($n$ = 20), matrix ($n^2$ = 400), 3-tensor ($n^3$ = 8000)

• estimate kernels using moments of spike-triggered stimuli (see the sketch below)

• in practice, insufficient data to go beyond 2nd order
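A minimal sketch of the kernel-from-moments idea: with Gaussian white-noise stimuli, the first- and second-order kernels are proportional to the rate-weighted (spike-triggered) mean and covariance. The simulated rectifying "neuron" and the stimulus dimension are assumptions for illustration, and the scale factors relating moments to kernels (which depend on the stimulus variance) are omitted.

```python
import numpy as np

# Estimate low-order Wiener/Volterra kernels from moments of the
# spike-triggered stimulus ensemble (up to scale factors).

rng = np.random.default_rng(1)
n, T = 20, 200_000                         # stimulus dimension, number of samples

X = rng.standard_normal((T, n))            # Gaussian white-noise stimulus frames
k_true = np.exp(-np.arange(n) / 4.0)       # a hypothetical linear filter
rate = np.maximum(X @ k_true, 0.0)         # rectified drive -> firing rate (arb. units)

p = rate / rate.sum()                      # rate-weighted (spike-triggered) probabilities

k0_hat = rate.mean()                       # 0th order: mean rate
k1_hat = p @ X                             # 1st order: spike-triggered average (up to scale)
Xc = X - p @ X
K2_hat = (Xc * p[:, None]).T @ Xc - np.eye(n)  # 2nd order: STC minus prior covariance (up to scale)

# The spike-triggered average should be roughly proportional to the true filter:
corr = np.corrcoef(k1_hat, k_true)[0, 1]
print(f"correlation between STA and true filter: {corr:.3f}")
```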

Page 17:

[Excerpt and figures from Chander & Chichilnisky (2001), J. Neurosci. 21(24):9904–9916, reproduced on the slide: LN-model characterization of salamander retinal ganglion cells. The spike-triggered average gives the linear filter; plotting spike rate against the generator signal (input strength) gives an accelerating nonlinearity, with some saturation at high contrast, that a purely linear analysis would miss. Suitably rescaled, the low- and high-contrast nonlinearities superimpose, so the effects of contrast adaptation are captured by changes in the linear filter.]

- Chander & Chichilnisky '01
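The LN characterization summarized above has two steps: estimate the linear filter as the spike-triggered average, then read off the static nonlinearity by binning spike rate against the generator signal (the stimulus weighted by the filter). A minimal sketch with a simulated cell follows; the temporal filter shape, saturating nonlinearity, bin size, and number of bins are assumptions, not the paper's values.

```python
import numpy as np

# LN-model fit: (1) linear filter from the spike-triggered average,
# (2) nonlinearity from binned spike rate vs. generator signal.

rng = np.random.default_rng(2)
n, T = 30, 100_000

X = rng.standard_normal((T, n))                     # stimulus history per time step
t = np.arange(n)
k_true = (t / 5.0) * np.exp(-t / 5.0)               # hypothetical temporal filter
k_true /= np.linalg.norm(k_true)

gen_true = X @ k_true
rate_true = 50.0 / (1.0 + np.exp(-3.0 * (gen_true - 1.0)))   # saturating nonlinearity (sp/s)
spikes = rng.poisson(rate_true * 0.01)              # counts in 10 ms bins

# (1) linear filter estimate (known only up to an arbitrary scale factor)
sta = spikes @ X / spikes.sum()

# (2) nonlinearity estimate: average spike rate within bins of the generator signal
gen_est = X @ (sta / np.linalg.norm(sta))
edges = np.quantile(gen_est, np.linspace(0, 1, 11))
which = np.clip(np.digitize(gen_est, edges) - 1, 0, 9)
nl = [spikes[which == b].mean() / 0.01 for b in range(10)]    # sp/s per bin
print(np.round(nl, 1))
```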

Page 18:

[Plot: firing rate (sp/s) vs. stimulus projection, comparing the true response with linear and quadratic model fits.]

Low-order polynomials do a poor job of representing the nonlinearities found in neurons
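A small numeric illustration of this point (the "true" response here is an assumed threshold-plus-saturation curve, standing in for the measured nonlinearity on the slide, not real data): degree-1 and degree-2 least-squares polynomial fits miss the flat region below threshold and the saturation above it, and can even predict negative rates.

```python
import numpy as np

# Fit degree-1 and degree-2 polynomials to a threshold-like rate curve
# and report the fit error and the (possibly negative) minimum prediction.

x = np.linspace(-3, 3, 200)                              # stimulus projection
true_rate = 150.0 / (1.0 + np.exp(-4.0 * (x - 1.0)))     # threshold + saturation (sp/s)

for deg in (1, 2):
    coeffs = np.polyfit(x, true_rate, deg)
    fit = np.polyval(coeffs, x)
    rmse = np.sqrt(np.mean((fit - true_rate) ** 2))
    print(f"degree {deg}: RMSE = {rmse:.1f} sp/s, min predicted rate = {fit.min():.1f} sp/s")
```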

Page 19:

LN cascade model

• Threshold-like nonlinearity => linear classifier

• Classic model for Artificial Neural Networks
  - McCulloch & Pitts (1943), Rosenblatt (1957), etc.

• No spikes (output is firing rate)
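A minimal forward sketch of the LN cascade as described on this slide: a linear stage followed by a pointwise nonlinearity, with a firing rate (not spikes) as output. The weights, bias, gain, and the choice of rectifier are illustrative assumptions.

```python
import numpy as np

# LN cascade: linear filter followed by a pointwise nonlinearity.
# With a hard threshold the unit acts as a linear classifier
# (McCulloch-Pitts / perceptron style); with a soft rectifier the
# output is a firing rate.

rng = np.random.default_rng(3)

w = rng.standard_normal(20)          # linear stage (receptive field / weights), assumed
b = -0.5                             # bias / threshold, assumed

def ln_rate(x, gain=40.0):
    """Rate output: rectified linear drive (no spikes)."""
    return gain * np.maximum(w @ x + b, 0.0)

def ln_classify(x):
    """Hard-threshold output: 1 if the linear drive exceeds threshold."""
    return int(w @ x + b > 0)

x = rng.standard_normal(20)
print(ln_rate(x), ln_classify(x))
```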

Page 20:

Rate model => Poisson spiking

• Central intuition: assume spikes in non-overlapping intervals are independent.

[on board:]

• Count distribution

• ISI distribution

• Inhomogeneous version

• Variability and Fano factor (simulation sketch below)
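A minimal sketch of the items above under simple assumptions (an arbitrary time-varying rate, 1 ms bins, and the Bernoulli approximation that the spike probability per small bin is rate × dt, with non-overlapping bins independent): generate inhomogeneous Poisson spikes, then compute spike counts, the Fano factor (≈ 1 for a Poisson process), and ISIs.

```python
import numpy as np

# Inhomogeneous Poisson spike generation via independent Bernoulli bins,
# plus count statistics (Fano factor) and ISIs. Rate profile and bin size
# are arbitrary choices for illustration.

rng = np.random.default_rng(4)

dt = 0.001                                           # 1 ms bins
t = np.arange(0.0, 1.0, dt)
rate = 20.0 + 15.0 * np.sin(2 * np.pi * 2 * t)       # assumed time-varying rate (sp/s)

n_trials = 1000
spikes = rng.random((n_trials, t.size)) < rate * dt  # P(spike in bin) ~= rate * dt

counts = spikes.sum(axis=1)                          # spike count per trial
fano = counts.var() / counts.mean()                  # ~1 for a Poisson process
isis = np.diff(np.flatnonzero(spikes[0])) * dt       # ISIs from one trial (s)
print(f"mean count = {counts.mean():.1f}, Fano factor = {fano:.2f}, "
      f"mean ISI = {isis.mean()*1000:.1f} ms")
```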

Page 21:

Spiking variability

$\sigma^2 = A \mu^B$

- Dayan & Abbott, after O'Keefe et al. '97
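The power-law relation above can be fit by regressing log variance on log mean across conditions. A minimal sketch with simulated Poisson counts (for which A ≈ 1 and B ≈ 1); the cortical data summarized on the slide would give different constants.

```python
import numpy as np

# Fit sigma^2 = A * mu^B by linear regression in log-log coordinates.
# Poisson-distributed counts are simulated here for illustration.

rng = np.random.default_rng(5)

means = np.linspace(2, 50, 12)                       # assumed range of mean counts
samples = [rng.poisson(m, size=500) for m in means]
mu = np.array([s.mean() for s in samples])
var = np.array([s.var() for s in samples])

B, logA = np.polyfit(np.log(mu), np.log(var), 1)     # log var = B * log mu + log A
print(f"A = {np.exp(logA):.2f}, B = {B:.2f}")
```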

Page 22:

ISIs

Left: ISI distribution from an MT neuron

Right: Poisson model, corrected with a stochastic refractory period

- Dayan & Abbott, after Bair et al. '94
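A minimal sketch of the corrected model described on this slide: constant-rate Poisson waiting times plus a stochastic refractory period, which suppresses very short intervals and makes the ISI distribution less variable than a pure exponential. The rate and the Gaussian refractory-period parameters are illustrative assumptions, not values fit to the MT data.

```python
import numpy as np

# ISIs from a Poisson model corrected with a stochastic refractory period:
# each ISI = refractory period (random, clipped at 0) + exponential waiting time.

rng = np.random.default_rng(6)

rate = 80.0                                    # underlying Poisson rate (sp/s), assumed
n_isi = 50_000

refractory = np.clip(rng.normal(0.003, 0.001, n_isi), 0.0, None)   # ~3 ms, stochastic
waiting = rng.exponential(1.0 / rate, n_isi)                        # Poisson waiting times
isi = refractory + waiting                                          # seconds

cv = isi.std() / isi.mean()                    # < 1, unlike a pure Poisson process (CV = 1)
print(f"mean ISI = {isi.mean()*1000:.1f} ms, CV = {cv:.2f}, "
      f"fraction of ISIs < 2 ms = {(isi < 0.002).mean():.3f}")
```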