
Learning Probabilistic Models of Connectivity from Multiple Spike Train Data

Debprakash Patnaik*, Srivatsan Laxman† and Naren Ramakrishnan*

*Dept. of Computer Science, Virginia Tech, Blacksburg, VA
†Microsoft Research, Bangalore, India

E-mail: [email protected]

Multi-Neuronal Data
• Neuronal circuits carry out brain function through complex coordinated firing patterns.
• Inferring the topology of neuronal circuits from spike train data is challenging.

[Figure 1 panels: MEA chip, Calcium Imaging, Spike Train Recording]

Figure 1: Simultaneously recorded multi-neuron data

Probabilistic Models
• Probabilistic models, such as Bayesian networks, provide compact factorizations of joint probability distributions.
• The probability of a neuron spiking is conditioned on the activity of a subset of relevant neurons in the recent past (its history window).
• Learning probabilistic models from spike train data is a hard problem; the most efficient methods are heuristic.

\[
\mathrm{Prob}(X_1 \ldots X_n = x_1 \ldots x_n) = \prod_{i=1}^{n} \mathrm{Prob}(X_i = x_i \mid \mathrm{Parent}(X_i))
\]

[Figure 2 network: node X6 with parents X1, X2, X5; its conditional probability table:]

X1  X2  X5 | P[X6=0]  P[X6=1]
 0   0   0 |   0.9      0.1
 0   0   1 |   0.9      0.1
 .   .   . |    .        .
 1   1   0 |   0.9      0.1
 1   1   1 |   0.1      0.9

Conditional Probability Table

Figure 2: A typical Bayesian Network
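To make the factorization concrete, here is a minimal Python sketch (our illustration, not code from the poster): it evaluates the joint probability of a full assignment as a product over conditional probability tables. The network structure, CPT rows, and root-node priors are assumptions mirroring Figure 2.

    # Minimal sketch of Bayesian-network factorization (illustrative;
    # structure, CPT rows, and root priors are assumed from Figure 2).
    parents = {"X6": ("X1", "X2", "X5")}

    # cpt[node][parent values] = P[node = 1 | parents]
    cpt = {"X6": {(0, 0, 0): 0.1, (0, 0, 1): 0.1,
                  (1, 1, 0): 0.1, (1, 1, 1): 0.9}}

    # Marginals for root nodes (assumed for illustration).
    prior = {"X1": 0.5, "X2": 0.5, "X5": 0.5}

    def joint(assignment):
        """Prob(X1..Xn = x1..xn) = prod_i Prob(Xi = xi | Parent(Xi))."""
        p = 1.0
        for node, value in assignment.items():
            if node in parents:
                pa = tuple(assignment[q] for q in parents[node])
                p1 = cpt[node][pa]
            else:
                p1 = prior[node]
            p *= p1 if value == 1 else 1.0 - p1
        return p

    print(joint({"X1": 1, "X2": 1, "X5": 1, "X6": 1}))  # 0.5**3 * 0.9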

Excitatory Dynamic Networks (EDNs)
We define a special class of models, Excitatory Dynamic Networks:
• Neurons can only exert excitatory influences on one another.

[Figure 3 raster: spike events of neurons A, B, C, D, F, G over time, with XA(t) marking A's firings.]

Independent parent components: {B, C, D} excite A; {F, G} also excite A.

B  C  D  F  G | Prob[A firing]
0  0  0  0  0 |     LOW
.  .  .  .  . |     LOW
.  .  .  1  1 |     HIGH
1  1  1  .  . |     HIGH

Excitatory Dynamic Network

Figure 3: Independent parent components in our Excitatory Dynamic Network (EDN) formulation.

Method
Our emphasis on excitatory networks enables:
• Learning of connectivity models by exploiting fast and efficient data mining algorithms [2].

EDN Structure Learning
• Structure learning requires identifying high mutual information parent sets (a naive search is sketched after Figure 4).
• We formally establish a connection between efficient frequent episode mining algorithms and learning probabilistic models for excitatory connections.
• Frequent Episode Mining is used to identify frequently repeating patterns of spiking activity [3].

[Figure 4 raster: spike events of neurons A, B, C, D over time; a history window immediately precedes each firing XA(t), and Parent(XA(t)) belongs to the history window.]

Figure 4: Search for high mutual information parent set restricted to the immediate history window.
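For concreteness, the sketch below (our illustration, with assumed helper names; a one-bin history window stands in for the general case) estimates I[XA; Π] from binned 0/1 spike trains and scores every candidate parent set exhaustively. This brute-force search is exactly what the frequent episode mining connection lets us sidestep.

    import math
    from collections import Counter
    from itertools import combinations

    def mutual_information(data, target, parents):
        """Estimate I[X_target(t); parent states in the previous bin]
        from data: a dict mapping neuron -> list of 0/1 bins."""
        n = len(data[target]) - 1
        joint = Counter()
        for t in range(1, n + 1):
            pa = tuple(data[p][t - 1] for p in parents)
            joint[(data[target][t], pa)] += 1
        px, ppa = Counter(), Counter()
        for (x, pa), c in joint.items():
            px[x] += c
            ppa[pa] += c
        return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (ppa[pa] / n)))
                   for (x, pa), c in joint.items())

    def best_parent_set(data, target, k):
        """Exhaustive search over all size-k parent sets (exponential!)."""
        others = [u for u in data if u != target]
        return max(combinations(others, k),
                   key=lambda pi: mutual_information(data, target, pi))

    data = {"A": [0, 1, 0, 1, 1], "B": [1, 0, 1, 1, 0], "C": [0, 0, 1, 0, 1]}
    print(best_parent_set(data, "A", 1))  # ('B',)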

Theorem 1. Consider node $X_A$ in an excitatory DBN with parent set $\Pi$. Let $\varepsilon^*$ be an upper bound for $P[X_A = 1 \mid \Pi = a]$ for all $a \neq a^*\,(= \mathbf{1})$. Then $I[X_A; \Pi] > \vartheta$ implies $P[X_A = 1, \Pi = a^*] \geq P_{\min}\,\Phi_{\min}$, where

\[
P_{\min} = \frac{P[X_A = 1] - \varepsilon^*}{1 - \varepsilon^*} \tag{1}
\]

\[
\Phi_{\min} = h^{-1}\!\left[\min\!\left(1,\; \frac{h(P[X_A = 1]) - \vartheta}{P_{\min}}\right)\right] \tag{2}
\]

and where $h(\cdot)$ denotes the binary entropy function $h(q) = -q \log q - (1 - q)\log(1 - q)$, $0 < q < 1$, and $h^{-1}[\cdot]$ denotes its pre-image greater than $\frac{1}{2}$.
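As a worked example of Theorem 1 (our illustration; the example numbers and the base-2 entropy are assumptions), the following computes P_min and Φ_min, whose product lower-bounds P[XA = 1, Π = a*], i.e. the minimum support of the corresponding episode:

    import math

    def h(q):
        """Binary entropy (base 2 assumed), 0 < q < 1."""
        return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

    def h_inv_upper(y, tol=1e-10):
        """Pre-image of y under h that is greater than 1/2, by bisection
        (h decreases from 1 to 0 on [1/2, 1])."""
        lo, hi = 0.5, 1.0 - 1e-12
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if h(mid) > y:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    def support_bound(p_a, eps_star, theta):
        """P_min (Eq. 1), Phi_min (Eq. 2), and their product."""
        p_min = (p_a - eps_star) / (1 - eps_star)
        phi_min = h_inv_upper(min(1.0, (h(p_a) - theta) / p_min))
        return p_min, phi_min, p_min * phi_min

    # Example numbers (assumed): P[X_A=1]=0.05, eps*=0.01, theta=0.25.
    print(support_bound(0.05, 0.01, 0.25))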

B  C  D | P[A=1 | Parents]
0  0  0 |   < ε*
0  0  1 |   < ε*
0  1  0 |   < ε*
0  1  1 |   < ε*
1  0  0 |   < ε*
1  0  1 |   < ε*
1  1  0 |   < ε*
1  1  1 |   > φ

{B=1, C=1, D=1} with P(A=1) > φ implies high support: {B, C, D} is a high mutual information parent set.

Figure 5: Search for high mutual information parent sets translates tofinding frequent episodes.

Frequent Episode Mining

Serial Episodes: Patterns of the form ⟨B →1 C →2 D →2 A⟩, where the numbers denote inter-event delays.

Frequent: σ(B →1 C →2 D →2 A) = count/T > ϑ, the support threshold.

Efficient Algorithm: Level-wise mining: candidate generation → counting → retain frequent episodes.

Counting: Maximum number of non-overlapped occurrences, as sketched below.
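Here is a minimal sketch of non-overlapped occurrence counting (our simplification; the actual mining algorithms of [2, 3] use per-episode automata and explicit per-edge delay constraints, collapsed here into a single assumed max_gap parameter):

    # Greedy left-to-right counting of non-overlapped occurrences of one
    # serial episode. Events are (time, neuron) pairs sorted by time.

    def count_non_overlapped(events, episode, max_gap):
        """Once an occurrence completes, tracking restarts, so counted
        occurrences never share events (hence "non-overlapped")."""
        count, pos, last_t = 0, 0, None
        for t, neuron in events:
            if pos > 0 and t - last_t > max_gap:
                pos, last_t = 0, None      # gap too large: drop partial match
            if neuron == episode[pos]:
                pos, last_t = pos + 1, t   # extend the partial occurrence
                if pos == len(episode):
                    count += 1             # one full occurrence found
                    pos, last_t = 0, None
        return count

    events = [(1, "B"), (2, "C"), (4, "D"), (6, "A"),
              (7, "B"), (8, "C"), (10, "D"), (12, "A")]
    print(count_non_overlapped(events, ("B", "C", "D", "A"), max_gap=2))  # 2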

[Figure 6 raster: spike events of neurons A, B, C, D with non-overlapped occurrences of the episode ⟨B →1 C →2 D →2 A⟩ highlighted.]

Figure 6: Frequent Episode Mining, a fast and efficient data mining algorithm.

[Figure 7 schematic: two overlapping frequent episodes over neurons A, B, C, D are combined into a longer candidate episode (Frequent + Frequent → Candidate).]

Figure 7: Level-wise candidate generation in frequent episode mining.
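Below is a minimal sketch of one common level-wise (Apriori-style) candidate generation scheme for serial episodes (our simplified illustration of the step depicted in Figure 7, not necessarily the exact scheme of [2]): two frequent k-node episodes whose (k-1)-suffix and (k-1)-prefix agree are glued into a (k+1)-node candidate.

    def generate_candidates(frequent_k):
        """frequent_k: set of k-tuples of neuron labels (frequent episodes).
        Returns the (k+1)-node candidate episodes."""
        candidates = set()
        for e1 in frequent_k:
            for e2 in frequent_k:
                if e1[1:] == e2[:-1]:            # overlap of length k-1
                    candidates.add(e1 + e2[-1:])  # extend e1 by e2's last node
        return candidates

    frequent_2 = {("B", "C"), ("C", "D"), ("D", "A")}
    print(generate_candidates(frequent_2))
    # {('B', 'C', 'D'), ('C', 'D', 'A')}

Each candidate is then counted against the data, and only those above the support threshold survive to seed the next level.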

Results

Synthetic Data Generation
• Synthetic data generation models each neuron as an inhomogeneous Poisson process.
• The firing rate is modulated by the spikes received by the neuron in the recent past.

[Figure 8 schematic: input spike trains with conduction delays of 5, 7, 8, 9 and 10 ms are summed (Σ) to modulate the neuron's firing rate λ(t).]

Figure 8: Simulation Model of a single neuron.
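A minimal sketch of this simulation model (our illustration; the base rate, gain, and window length are assumed parameters): at each time step the firing rate λ(t) is a base rate plus a contribution from input spikes that, after their conduction delays, land in the recent history window.

    import random

    DT = 0.001       # 1 ms time step
    BASE_RATE = 2.0  # background firing rate (Hz), assumed
    GAIN = 20.0      # rate increase per recent input spike (Hz), assumed

    def simulate(input_trains, delays_ms, t_end=1.0, window_ms=10):
        """input_trains: list of spike-time lists (s); delays_ms: one
        conduction delay per input, as in Figure 8."""
        spikes, t = [], 0.0
        while t < t_end:
            # Input spikes whose delayed arrival falls in [t - window, t].
            drive = sum(1
                        for train, d in zip(input_trains, delays_ms)
                        for s in train
                        if t - window_ms * 1e-3 <= s + d * 1e-3 <= t)
            rate = BASE_RATE + GAIN * drive   # lambda(t)
            if random.random() < rate * DT:   # Bernoulli approx. of Poisson
                spikes.append(t)
            t += DT
        return spikes

    inputs = [[0.100, 0.500], [0.102, 0.501]]  # two input neurons
    print(simulate(inputs, delays_ms=[5, 9])[:5])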

Synfire Chains
A volley of firing in one group of neurons causes the next group to fire, and activity propagates over the network. The gray boxes show the MEA view of the activity.

[Figure 9 panels: synfire activity at t=0, t=1 and t=3.]

Figure 9: Discovering Synfire network structure.

Polychronous Circuit
In polychronous circuits, neurons code information through precise spike timing and variable network delays. Complex patterns can be stored and processed by such networks [1].

Figure 10: Discovering Polychronous network structure.

Real MEA Data
We apply our method to multi-electrode array recordings from dissociated cortical cultures gathered by Steve Potter's laboratory at Georgia Tech [4].

Figure 11: Network structure discovered from the first 15 min of spike train recording on day 35 of culture 2-1.

Conclusion

Excitatory Dynamic Networks: We provide a formal basis for learning a special class of models from spike train data.

Efficient Learning: The excitatory network assumption allows the use of fast frequent episode mining algorithms to learn network structures.

Application to Spike Train Analysis: We show that network dynamics such as synfire chains and polychrony can be modeled as excitatory networks and unearthed using EDN learning.

[Figure 12 pipeline: Record Activity → Find Repeating Patterns → Infer Network Connectivity]

Figure 12: Framework for discovering Excitatory Dynamic Networks.

References

[1] E. M. Izhikevich. Polychronization: computation with spikes. Neural Comput, 18(2):245–282, Feb 2006.

[2] D. Patnaik, S. Laxman, and N. Ramakrishnan. Discovering excitatory networks from discrete event streams with applications to neuronal spike train analysis. In Proc. of Ninth IEEE Intl. Conf. on Data Mining, pages 407–416, 2009.

[3] D. Patnaik, P. S. Sastry, and K. P. Unnikrishnan. Inferring neuronal network connectivity from spike data: A temporal data mining approach. Scientific Programming, 16(1):49–77, January 2007.

[4] D. A. Wagenaar, J. Pine, and S. M. Potter. An extremely rich repertoire of bursting patterns during the development of cortical cultures. BMC Neurosci, 7:11, 2006.

9th Annual Computational Neuroscience Meeting CNS*2010, July 24–30, 2010, San Antonio, Texas