Microscopic approach of a time elapsed neural model
J. Chevallier, M. J. Cáceres, M. Doumic, P. Reynaud-Bouret
LJAD, University of Nice
Seminar, Rennes
9 March 2015
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
Outline
1 Introduction (neurobiological interest; modelling)
2 Point process (overview; examples of point processes; thinning)
3 Microscopic measure (technical construction; the system)
4 Expectation measure (technical construction; the system; population-based version)
5 Coming back to our examples
Neurobiological interest
Biological context
[Figure: action potential trace, voltage (mV) against time (ms): depolarization once the threshold (−55 mV) is crossed, peak near +40 mV, then repolarization, refractory period, and return to the resting state (−70 mV); sub-threshold stimuli give failed initiations.]
Action potential: brief and stereotyped phenomenon.
Physiological constraint: refractory period.
Modelling
Age-structured equations (K. Pakdaman, B. Perthame, D. Salort, 2010)
Age = delay since last spike.
n(t,s) = probability density of finding a neuron with age s at time t (equivalently, the ratio of the population with age s at time t).

$$\frac{\partial n(t,s)}{\partial t} + \frac{\partial n(t,s)}{\partial s} + p(s, X(t))\, n(t,s) = 0,$$
$$m(t) := n(t,0) = \int_0^{+\infty} p(s, X(t))\, n(t,s)\, ds. \tag{PPS}$$

Parameters
p represents the firing rate. For example, $p(s,X) = \mathbf{1}_{\{s > \sigma(X)\}}$.
$X(t) = \int_0^t d(x)\, m(t-x)\, dx$ (global neural activity), which accounts for the propagation time; d is the delay function, for example $d(x) = e^{-\tau x}$.
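As a quick illustration of how (PPS) evolves an age density, here is a minimal upwind discretization. The grid, the constant threshold σ = 0.5 and the kernel d(x) = e^{-x} are arbitrary illustrative choices, not values from the talk:

```python
import numpy as np

# Minimal upwind discretization of (PPS); sigma, d and the grid are
# illustrative choices, not taken from the talk.
ds = dt = 0.01                       # ages advance at unit speed, so dt = ds
ages = np.arange(0.0, 10.0, ds)
sigma = 0.5                          # firing rate p(s, X) = 1_{s > sigma}
d = lambda x: np.exp(-x)             # delay kernel d(x) = e^{-x}

n = np.exp(-ages)
n /= n.sum() * ds                    # normalized initial age density n(0, .)
m_hist = []                          # discharge rates m(t) = n(t, 0)

for k in range(1000):
    # Global activity X(t) = int_0^t d(x) m(t - x) dx (discrete convolution);
    # X is unused by p here since sigma is constant, shown for completeness.
    X = dt * sum(d(j * dt) * mv for j, mv in enumerate(reversed(m_hist)))
    rate = (ages > sigma).astype(float)      # p(s, X(t))
    m = (rate * n).sum() * ds                # m(t) = int p(s, X(t)) n(t, s) ds
    m_hist.append(m)
    survivors = n * (1.0 - dt * rate)        # neurons that did not fire
    n[1:] = survivors[:-1]                   # transport: age s -> s + ds
    n[0] = m                                 # reset at age 0: n(t, 0) = m(t)

total_mass = n.sum() * ds                    # mass is (almost) conserved
```

With dt = ds the transport is exact and the boundary term n(t,0) = m(t) exactly compensates the mass removed by firing, so the total mass stays close to 1.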
Modelling
Microscopic modelling
The spiking times are the relevant information.
Temporal point process: a random countable set N of times (points of R or R+), locally finite a.s.
Denote $\cdots < T_{-1} < T_0 \le 0 < T_1 < \cdots$ the ordered sequence of points of N.
N(A) = number of points of N in A.
Point measure: $N(dt) = \sum_{i\in\mathbb Z} \delta_{T_i}(dt)$. Hence $\int f(t)\, N(dt) = \sum_{i\in\mathbb Z} f(T_i)$.
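The point-measure notation can be made concrete: integrating a function against N(dt) just sums it over the points. A toy sketch with a hypothetical spike train:

```python
# Toy spike train illustrating N(A) and the point measure N(dt) = sum_i delta_{T_i}:
# integrating f against N(dt) means summing f over the points.
spikes = [-1.3, -0.2, 0.4, 1.1, 2.5]            # ... < T_0 <= 0 < T_1 < ...

def count(a, b, pts=spikes):
    """N((a, b]): number of points of N in the interval (a, b]."""
    return sum(1 for t in pts if a < t <= b)

def integrate(f, pts=spikes):
    """int f(t) N(dt) = sum_i f(T_i)."""
    return sum(f(t) for t in pts)

n_pos = count(0.0, 3.0)                # number of points in (0, 3]
total = integrate(lambda t: t)         # sum of the spike times
```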
Age process
Age = delay since last spike.
Microscopic age
We consider the left-continuous (hence predictable) version of the age.
The age at time 0 depends on the spiking times before time 0.
The dynamics after time 0 are characterized by the spiking times after time 0.
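The left-continuous age can be sketched as a small helper (hypothetical, for illustration): just before a spike, the age still refers to the previous spike, which is what makes the process predictable.

```python
# Left-continuous age S_{t-}: delay since the last spike *strictly* before t
# (illustrative helper; the age is infinite when no spike precedes t).
def age_left(t, spikes):
    past = [T for T in spikes if T < t]
    return t - max(past) if past else float("inf")

spikes = [-0.5, 1.0, 2.0]
just_before = age_left(1.0, spikes)          # spike at t = 1.0 not seen yet: 1.0 - (-0.5)
just_after = age_left(1.0 + 1e-9, spikes)    # now it is: age has just reset
```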
Overview
Framework
Dichotomy of the behaviour of N with respect to time 0:
$N_- = N \cap (-\infty, 0]$ is a point process with distribution $P_0$ (initial condition). The age at time 0 is finite ⇔ $N_- \neq \emptyset$.
$N_+ = N \cap (0, +\infty)$ is a point process admitting some intensity $\lambda(t, \mathcal F^N_{t-})$.

Stochastic intensity
Local behaviour: probability of finding a new point.
May depend on the past (e.g. refractory period).
Heuristically,
$$\lambda(t, \mathcal F^N_{t-}) = \lim_{\Delta t \to 0} \frac{1}{\Delta t}\, \mathbb P\left( N([t, t+\Delta t]) = 1 \;\middle|\; \mathcal F^N_{t-} \right),$$
where $\mathcal F^N_{t-}$ denotes the history of N before time t.
$\lambda \in L^1_{loc}$ a.s. ⇔ N locally finite a.s. (classic assumption).
$p(s, X(t))$ and $\lambda(t, \mathcal F^N_{t-})$ are analogous.
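For a homogeneous Poisson process the heuristic can be checked in closed form: P(exactly one point in [t, t+Δt]) = λΔt e^{−λΔt}, so dividing by Δt recovers λ as Δt → 0. A small sanity check of that limit (not from the talk):

```python
import math

# Sanity check of the intensity heuristic for a homogeneous Poisson process of
# rate lam: P(N([t, t+dt]) = 1) = lam*dt*exp(-lam*dt), hence P(...)/dt -> lam.
lam = 3.0
ratios = [lam * dt * math.exp(-lam * dt) / dt for dt in (1e-1, 1e-2, 1e-3)]
# ratios climb toward lam as dt shrinks
```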
Examples of point processes
Some classical point processes in neuroscience
Poisson process: $\lambda(t, \mathcal F^N_{t-}) = \lambda(t)$, a deterministic function.
Renewal process: $\lambda(t, \mathcal F^N_{t-}) = f(S_{t-})$ ⇔ i.i.d. ISIs.
Hawkes process: $\lambda(t, \mathcal F^N_{t-}) = \mu + \int_{-\infty}^{t-} h(t-v)\, N(dv) = \mu + \sum_{V \in N,\, V < t} h(t-V)$, with $h \ge 0$.
Correspondence with (PPS): $\int_{-\infty}^{t-} h(t-v)\, N(dv) \;\longleftrightarrow\; \int_0^t d(v)\, m(t-v)\, dv = X(t)$.
We use the SDE representation of these processes induced by thinning.
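A renewal process can be simulated directly from its ISI characterization: cumulate i.i.d. inter-spike intervals. A sketch; the exponential ISI law below (which gives back a homogeneous Poisson process) is an arbitrary choice:

```python
import random

# Renewal process: i.i.d. inter-spike intervals (ISIs). With exponential ISIs
# this is the homogeneous Poisson process; other ISI laws keep memory only
# through the current age, i.e. lambda(t, F_{t-}) = f(S_{t-}).
random.seed(0)

def renewal_times(n_spikes, sample_isi):
    times, t = [], 0.0
    for _ in range(n_spikes):
        t += sample_isi()            # i.i.d. ISI draws
        times.append(t)
    return times

spikes = renewal_times(10_000, lambda: random.expovariate(2.0))  # rate-2 Poisson
mean_isi = spikes[-1] / len(spikes)                              # close to 1/2
```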
Thinning
Lewis and Shedler's thinning, 1979
[Figure: points of a unit-intensity planar Poisson process Π; the points lying below the graph of λ are kept and projected onto the time axis, giving N.]
Π is a Poisson process with intensity 1: $\Pi(dt,dx) = \sum \delta_X$, $\mathbb E[\Pi(dt,dx)] = dt\, dx$, spatial independence.
λ is deterministic.
N admits λ as an intensity.
Ogata's thinning, 1981
[Figure: same construction, now with the intensity λ(t, F^N_{t−}) recomputed from the past of the accepted points of N.]
Π is a Poisson process with intensity 1: $\Pi(dt,dx) = \sum \delta_X$, $\mathbb E[\Pi(dt,dx)] = dt\, dx$, spatial independence.
λ is random.
N admits λ as an intensity.
Theorem
Let Π be an $(\mathcal F_t)$-Poisson process with intensity 1 on $\mathbb R_+^2$. Let $\lambda(t, \mathcal F_{t-})$ be a non-negative $(\mathcal F_t)$-predictable process which is $L^1_{loc}$ a.s., and define the point process $N_+$ (on $(0,\infty)$) by
$$N_+(C) = \int_{C \times \mathbb R_+} \mathbf 1_{[0,\, \lambda(t, \mathcal F_{t-})]}(x)\; \Pi(dt, dx),$$
for all $C \in \mathcal B(\mathbb R_+)$. Then $N_+$ admits $\lambda(t, \mathcal F_{t-})$ as an $(\mathcal F_t)$-predictable intensity.

Applications: simulation; stationarity of Hawkes processes (P. Brémaud, L. Massoulié, '96); mean-field limit of Hawkes processes (S. Delattre et al., '14).

What you should remember:
$$N_+(dt) = \int_{x=0}^{\lambda(t, \mathcal F^N_{t-})} \Pi(dt, dx).$$
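This theorem is also the standard simulation recipe (Ogata's algorithm): dominate λ locally by a constant M, draw candidate points from a rate-M Poisson process, and keep each candidate with probability λ/M. A sketch for a Hawkes process with an exponential kernel; μ = 1 and h(u) = 0.5 e^{−u} are illustrative choices:

```python
import math
import random

# Ogata's thinning for a Hawkes process with intensity
#   lambda(t) = mu + sum_{T < t} a * exp(-(t - T)).
# Between accepted points this intensity decreases, so its current value
# dominates it until the next point and can serve as the bound M.
random.seed(1)
mu, a = 1.0, 0.5                                  # illustrative; a < 1 for stationarity

def intensity(t, past):
    return mu + sum(a * math.exp(-(t - T)) for T in past)

def simulate(horizon):
    t, points = 0.0, []
    while True:
        M = intensity(t, points)                  # dominating bound beyond time t
        t += random.expovariate(M)                # candidate from a rate-M Poisson process
        if t >= horizon:
            return points
        if random.random() * M <= intensity(t, points):
            points.append(t)                      # accepted: a point of N

spikes = simulate(500.0)
rate = len(spikes) / 500.0                        # stationary mean rate mu/(1 - a) = 2
```

The acceptance step is exactly the indicator $\mathbf 1_{[0,\lambda(t,\mathcal F_{t-})]}(x)$ of the theorem, with the second Poisson coordinate x drawn uniformly under the bound M.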
Technical construction
A microscopic analogue of n
n(t, ·) is the probability density of the age at time t.
At fixed time t, we are looking at a Dirac mass at $S_{t-}$.

What we need
A random measure U on $\mathbb R^2$ whose action over test functions is, for all $\varphi \in \mathcal C^\infty_{c,b}(\mathbb R_+^2)$,
$$\int \varphi(t,s)\, U(dt,ds) = \int \varphi(t, S_{t-})\, dt.$$

What we define
We construct an ad hoc random measure U which satisfies a system of stochastic differential equations similar to (PPS).
The system
Random system
Theorem
Let Π be a Poisson measure and let $(\lambda(t, \mathcal F^N_{t-}))_{t>0}$ be a non-negative predictable process which is $L^1_{loc}$ a.s. Then the measure U satisfies the following system a.s., in the weak sense:
$$(\partial_t + \partial_s)\{U(dt,ds)\} + \left( \int_{x=0}^{\lambda(t, \mathcal F^N_{t-})} \Pi(dt,dx) \right) U(t,ds) = 0,$$
$$U(dt,0) = \int_{s\in\mathbb R} \left( \int_{x=0}^{\lambda(t, \mathcal F^N_{t-})} \Pi(dt,dx) \right) U(t,ds),$$
with initial condition $\lim_{t\to 0^+} U(t,\cdot) = \delta_{-T_0}$ ($-T_0$ is the age at time 0).

Compared with (PPS), $p(s, X(t))$ is replaced by $\int_{x=0}^{\lambda(t, \mathcal F^N_{t-})} \Pi(dt,dx)$, and
$$\mathbb E\left[ \int_{x=0}^{\lambda(t, \mathcal F^N_{t-})} \Pi(dt,dx) \;\middle|\; \mathcal F^N_{t-} \right] = \lambda(t, \mathcal F^N_{t-})\, dt.$$

Technical difficulty: product of measures.
Parametrized families of measures U(t,ds) and U(dt,s), e.g. $U(t,ds) = \delta_{S_{t-}}(ds)$.
Fubini property: $U(t,ds)\, dt = U(dt,s)\, ds = U(dt,ds)$.
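The defining action of U can be probed numerically: for any test function φ, ∫φ dU is a one-dimensional integral along the age trajectory. A crude Riemann-sum sketch with a hypothetical spike train:

```python
# Action of the microscopic measure U on a test function:
#   int phi(t, s) U(dt, ds) = int phi(t, S_{t-}) dt,
# approximated by a Riemann sum along the (left-continuous) age trajectory.
spikes = [-0.5, 1.0, 2.0]                         # hypothetical spike train

def age_left(t):
    past = [T for T in spikes if T < t]
    return t - max(past)

def U_action(phi, horizon, dt=1e-3):
    steps = int(horizon / dt)
    return sum(phi(i * dt, age_left(i * dt)) * dt for i in range(1, steps + 1))

mass = U_action(lambda t, s: 1.0, 3.0)            # total mass of U on [0,3] x R+ is 3
mean_age = U_action(lambda t, s: s, 3.0) / mass   # time-averaged age
```

With φ ≡ 1 the action returns the length of the time window, reflecting that U(t, ds) is a probability measure (a Dirac mass at S_{t−}) for each t.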
![Page 43: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/43.jpg)
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
The system
Random system
TheoremThe measure U satisfies the following system a.s.
(∂t + ∂s){U (dt,ds)}+
(∫λ(t,FN
t−)
x=0Π(dt,dx)
)U (t,ds) = 0,
U (dt,0) =∫s∈R
(∫λ(t,FN
t−)
x=0Π(dt,dx)
)U (t,ds) ,
in the weak sense with initial condition limt→0+ U(t, ·) = δ−T0 . (−T0 is the age attime 0)
p(s,X (t)) is replaced by∫
λ(t,FNt−)
x=0Π(dt,dx).
E
[∫λ(t,FN
t−)
x=0Π(dt,dx)
∣∣∣∣∣FNt−
]= λ
(t,FN
t−
)dt.
Technical difficulty
Product of measures
Parametrized families of measures U(t,ds) and U(dt,s), e.g.
U(t,ds) = δSt−(ds)
Fubini property: U(t,ds)dt = U(dt,s)ds = U(dt,ds).
![Page 44: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/44.jpg)
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
The system
Random system
TheoremThe measure U satisfies the following system a.s.
(∂t + ∂s){U (dt,ds)}+
(∫λ(t,FN
t−)
x=0Π(dt,dx)
)U (t,ds) = 0,
U (dt,0) =∫s∈R
(∫λ(t,FN
t−)
x=0Π(dt,dx)
)U (t,ds) ,
in the weak sense with initial condition limt→0+ U(t, ·) = δ−T0 . (−T0 is the age attime 0)
p(s,X (t)) is replaced by∫
λ(t,FNt−)
x=0Π(dt,dx).
E
[∫λ(t,FN
t−)
x=0Π(dt,dx)
∣∣∣∣∣FNt−
]= λ
(t,FN
t−
)dt.
Technical difficulty
Product of measures
Parametrized families of measures U(t,ds) and U(dt,s), e.g.
U(t,ds) = δSt−(ds)
Fubini property: U(t,ds)dt = U(dt,s)ds = U(dt,ds).
![Page 45: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/45.jpg)
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
The system
Random system
TheoremThe measure U satisfies the following system a.s.
(∂t + ∂s){U (dt,ds)}+
(∫λ(t,FN
t−)
x=0Π(dt,dx)
)U (t,ds) = 0,
U (dt,0) =∫s∈R
(∫λ(t,FN
t−)
x=0Π(dt,dx)
)U (t,ds) ,
in the weak sense with initial condition limt→0+ U(t, ·) = δ−T0 . (−T0 is the age attime 0)
Technical difficulty
Product of measures
Parametrized families of measures U(t,ds) and U(dt,s), e.g.
U(t,ds) = δSt−(ds)
Fubini property: U(t,ds)dt = U(dt,s)ds = U(dt,ds).
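The bound $\lambda(t,\mathcal{F}^N_{t-})$ under the Poisson measure $\Pi$ is exactly the acceptance rule of simulation by thinning: a candidate point of a dominating Poisson process is kept when its mark falls below the current intensity. A minimal sketch (our own illustrative names; a constant dominating rate `lam_max` bounding the intensity is assumed), for an intensity depending only on the age:

```python
import random

def simulate_by_thinning(rate, t_max, lam_max, seed=0):
    """Thinning: candidates of a dominating Poisson(lam_max) process are
    kept when an independent uniform mark x <= rate(current age)."""
    rng = random.Random(seed)
    t, age, points = 0.0, 0.0, []
    while True:
        dt = rng.expovariate(lam_max)   # next candidate of the dominating process
        t += dt
        age += dt
        if t >= t_max:
            return points
        if rng.random() * lam_max <= rate(age):  # mark below the intensity: spike
            points.append(t)
            age = 0.0                             # the age resets at each spike
```

With a rate enforcing a refractory period (`rate(s) = 1` for `s > 1`, else `0`), every inter-spike gap of the simulated process exceeds 1, as the model requires.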
![Page 48: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/48.jpg)
Outline

1 Introduction
2 Point process
3 Microscopic measure
4 Expectation measure: Technical construction; The system; Population-based version
5 Coming back to our examples
![Page 52: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/52.jpg)
Technical construction

Taking the expectation

Can we consider the expectation measure $u(dt,ds) = \mathbb{E}[U(dt,ds)]$?

Definition:
$$\int \varphi(t,s)\, u(t,ds) = \mathbb{E}\left[ \int \varphi(t,s)\, U(t,ds) \right], \qquad \int \varphi(t,s)\, u(dt,s) = \mathbb{E}\left[ \int \varphi(t,s)\, U(dt,s) \right].$$

Stronger assumption: we need the intensity to be $L^1_{loc}$ in expectation.

Property: $U(t,ds) = \delta_{S_{t-}}(ds)$, so $u(t,\cdot)$ is the distribution of $S_{t-}$.
![Page 54: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/54.jpg)
The system

System in expectation

Theorem. Let $(\lambda(t,\mathcal{F}^N_{t-}))_{t>0}$ be a nonnegative predictable process which is $L^1_{loc}$ a.s. The measure $U$ satisfies the system
$$(\partial_t + \partial_s)\, U(dt,ds) + \left( \int_{x=0}^{\lambda(t,\mathcal{F}^N_{t-})} \Pi(dt,dx) \right) U(t,ds) = 0,$$
$$U(dt,0) = \int_{s\in\mathbb{R}} \left( \int_{x=0}^{\lambda(t,\mathcal{F}^N_{t-})} \Pi(dt,dx) \right) U(t,ds),$$
in the weak sense with initial condition $\lim_{t\to 0^+} U(t,\cdot) = \delta_{-T_0}$.

There are two (highly correlated) random measures: $U$ and $\Pi$.
![Page 55: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/55.jpg)
The system

System in expectation

Theorem. Let $(\lambda(t,\mathcal{F}^N_{t-}))_{t>0}$ be a nonnegative predictable process which is $L^1_{loc}$ in expectation and admits a finite mean. The measure $u$ satisfies the system
$$(\partial_t + \partial_s)\, u(dt,ds) + \rho_{\lambda,P_0}(t,s)\, u(dt,ds) = 0,$$
$$u(dt,0) = \int_{s\in\mathbb{R}} \rho_{\lambda,P_0}(t,s)\, u(t,ds)\, dt,$$
in the weak sense, where $\rho_{\lambda,P_0}(t,s) = \mathbb{E}\left[\lambda(t,\mathcal{F}^N_{t-}) \,\middle|\, S_{t-} = s\right]$ for almost every $t$. The initial condition $\lim_{t\to 0^+} u(t,\cdot)$ is given by the distribution of $-T_0$.

There are two (highly correlated) random measures: $U$ and $\Pi$.
![Page 56: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/56.jpg)
The system

Idea of Proof

We deal with the equations in the weak sense. The terms that do not involve the spiking measure $\int_{x=0}^{\lambda(t,\mathcal{F}^N_{t-})} \Pi(dt,dx)$ are easy to deal with. For the spiking term, for a test function $\varphi$:
$$\mathbb{E}\left[ \int_t \int_s \varphi(t,s)\, U(t,ds) \left( \int_{x=0}^{\lambda(t,\mathcal{F}^N_{t-})} \Pi(dt,dx) \right) \right] = \mathbb{E}\left[ \int_t \varphi(t,S_{t-}) \left( \int_{x=0}^{\lambda(t,\mathcal{F}^N_{t-})} \Pi(dt,dx) \right) \right] \quad \text{(since } U(t,ds) = \delta_{S_{t-}}(ds)\text{)}$$
$$= \mathbb{E}\left[ \int_t \varphi(t,S_{t-})\, \lambda(t,\mathcal{F}^N_{t-})\, dt \right] \quad \text{(mean measure of } \Pi\text{)}$$
$$= \mathbb{E}\left[ \int_t \varphi(t,S_{t-})\, \mathbb{E}\left[\lambda(t,\mathcal{F}^N_{t-}) \,\middle|\, S_{t-}\right] dt \right] = \mathbb{E}\left[ \int_t \varphi(t,S_{t-})\, \rho_{\lambda,P_0}(t,S_{t-})\, dt \right] = \int_t \int_s \varphi(t,s)\, \rho_{\lambda,P_0}(t,s)\, u(t,ds)\, dt.$$
![Page 65: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/65.jpg)
Population-based version

Law of large numbers

Law of large numbers. Population-based approach.

Theorem. Let $(N^i)_{i\geq 1}$ be i.i.d. point processes on $\mathbb{R}$ with intensity $L^1_{loc}$ in expectation. For each $i$, let $(S^i_{t-})_{t>0}$ denote the age process associated with $N^i$. Then, for every test function $\varphi$,
$$\int \varphi(t,s) \left( \frac{1}{n} \sum_{i=1}^n \delta_{S^i_{t-}}(ds) \right) dt \xrightarrow[n\to\infty]{a.s.} \int \varphi(t,s)\, u(dt,ds),$$
with $u$ satisfying the deterministic system.

Idea of proof: thinning inversion theorem $\Rightarrow$ recover some Poisson measures $(\Pi^i)_{i\geq 1}$ and microscopic measures $(U^i)_{i\geq 1}$.
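The convergence of the empirical age distribution can be checked numerically in a simple case (our own illustrative sketch, not from the slides): for i.i.d. rate-1 Poisson processes started with age 0, $P(S_{t-}\geq s_0) = e^{-s_0}$ for $s_0 < t$, and the empirical average over $n$ copies approaches it.

```python
import math
import random

def age_at(t, rate, rng):
    """Age S_{t-} of a homogeneous Poisson process with rate `rate`,
    started with age 0 at time 0: time elapsed since the last spike."""
    last, s = 0.0, rng.expovariate(rate)
    while s < t:
        last, s = s, s + rng.expovariate(rate)
    return t - last

def empirical_survival(n, t, s0, rate, seed=0):
    """Empirical P(S_{t-} >= s0) over n i.i.d. copies; the n -> infinity
    limit is exp(-rate * s0) for s0 < t, by the law of large numbers."""
    rng = random.Random(seed)
    return sum(age_at(t, rate, rng) >= s0 for _ in range(n)) / n

print(empirical_survival(20000, 5.0, 1.0, 1.0))  # close to exp(-1) ≈ 0.368
```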
![Page 69: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/69.jpg)
Outline

1 Introduction
2 Point process
3 Microscopic measure
4 Expectation measure
5 Coming back to our examples: Direct application; Linear Hawkes process
![Page 70: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/70.jpg)
Direct application

Review of the examples

The system in expectation:
$$(\partial_t + \partial_s)\, u(dt,ds) + \rho_{\lambda,P_0}(t,s)\, u(dt,ds) = 0,$$
$$u(dt,0) = \int_{s\in\mathbb{R}} \rho_{\lambda,P_0}(t,s)\, u(t,ds)\, dt,$$
where $\rho_{\lambda,P_0}(t,s) = \mathbb{E}\left[\lambda(t,\mathcal{F}^N_{t-}) \,\middle|\, S_{t-} = s\right]$.

This result may seem satisfactory to a probabilist, but analysts need an explicit expression for $\rho$. In particular, the system may look linear, but it is nonlinear in general.

Poisson process $\to$ $\rho_{\lambda,P_0}(t,s) = f(t)$.
Renewal process $\to$ $\rho_{\lambda,P_0}(t,s) = f(s)$.
Hawkes process $\to$ $\rho_{\lambda,P_0}$ is much more complex.
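In the renewal case $\rho(t,s) = f(s)$, the system is a standard age-structured PDE and can be integrated with an explicit upwind scheme. A minimal sketch (our own illustrative grid and rate function; taking dt = ds makes the transport step exact):

```python
import numpy as np

# Explicit upwind integration of the age-structured (renewal) system
#   (d/dt + d/ds) u + f(s) u = 0,   u(t, 0) = \int f(s) u(t, s) ds,
# i.e. rho(t, s) = f(s).

def evolve(u, f, ds, dt, steps):
    """Advance the age density u by `steps` time steps."""
    for _ in range(steps):
        births = np.sum(f * u) * ds              # boundary flux u(t, 0)
        shifted = u[:-1] * (1.0 - dt * f[:-1])   # age by ds, die at rate f
        u = np.empty_like(u)
        u[1:] = shifted
        u[0] = births
    return u

ds = dt = 0.01
ages = np.arange(0.0, 10.0, ds)
f = np.where(ages > 1.0, 1.0, 0.0)         # refractory period of length 1
u0 = np.zeros_like(ages)
u0[0] = 1.0 / ds                           # all ages equal 0 at time t = 0
u = evolve(u0, f, ds, dt, 2000)            # integrate up to t = 20
print(float(np.sum(u) * ds))               # close to 1: total mass is conserved
```

The boundary term plays the role of $u(dt,0)$: mass removed by spiking reappears at age 0, so the total mass stays (approximately) constant.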
![Page 75: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/75.jpg)
Linear Hawkes process

Overview of the results

Recall that
$$\int_{-\infty}^{t-} h(t-x)\, N(dx) \;\longleftrightarrow\; \int_0^t d(x)\, m(t-x)\, dx = X(t).$$

What we expected: replacement of $p(s,X(t))$ by
$$\mathbb{E}\left[\lambda(t,\mathcal{F}^N_{t-})\right] = \mu + \int_0^t h(t-x)\, u(dx,0) \;\longleftrightarrow\; X(t).$$

What we find: $p(s,X(t))$ is replaced by $\rho_{\lambda,P_0}(t,s)$, which is the conditional expectation, not the full expectation.

Technical difficulty: $\rho_{\lambda,P_0}(t,s) = \mathbb{E}\left[\lambda(t,\mathcal{F}^N_{t-}) \,\middle|\, S_{t-} = s\right]$ is not so easy to compute.
![Page 79: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/79.jpg)
Linear Hawkes process

Cluster process

$$\lambda(t,\mathcal{F}^N_{t-}) = \mu + \int_0^{t-} h(t-v)\, N(dv) = \mu + \sum_{V\in N,\, V<t} h(t-V).$$

[Figure: branching construction of the process; points are labeled by their generation (0, 1, 2, 3).]

Cluster process $N_c$ associated with $h$: the set of points of generation greater than 1.

Number of children $\sim \mathcal{P}(\|h\|_1)$: $\|h\|_1 < 1 \Rightarrow N_c$ is finite a.s.
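Since each point has a Poisson($\|h\|_1$) number of children, the total cluster size is the total progeny of a Galton–Watson tree, with mean $1/(1-\|h\|_1)$ when $\|h\|_1 < 1$. A quick simulation sketch (our own illustrative helper names):

```python
import math
import random

def poisson(mean, rng):
    """Poisson sample via Knuth's multiplicative method (fine for small means)."""
    l, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def cluster_size(norm_h, rng, cap=10**6):
    """Total progeny of one cluster (ancestor included): each point has a
    Poisson(||h||_1) number of children; ||h||_1 < 1 gives a finite cluster a.s."""
    total = current = 1
    while current > 0 and total < cap:
        current = sum(poisson(norm_h, rng) for _ in range(current))
        total += current
    return total

rng = random.Random(0)
sizes = [cluster_size(0.5, rng) for _ in range(20000)]
print(sum(sizes) / len(sizes))  # close to 1 / (1 - 0.5) = 2
```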
![Page 87: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/87.jpg)
Linear Hawkes process

Cluster decomposition of the linear Hawkes process

Recall that $N_- = N \cap (-\infty,0]$ and $N_+ = N \cap (0,+\infty)$.

$N_-$ is a point process on $\mathbb{R}_-$ distributed according to $P_0$.
$(N^T_1)_{T\in N_-}$ is a sequence of independent Poisson processes with respective intensities $\lambda_T(v) = h(v-T)\,\mathbb{1}_{(0,\infty)}(v)$.
$(N^{T,V}_c)_{V\in N^T_1,\, T\in N_-}$ is a sequence of independent cluster processes associated with $h$.

$$N_{\leq 0} = N_- \cup \bigcup_{T\in N_-} \Big( N^T_1 \cup \bigcup_{V\in N^T_1} \big( V + N^{T,V}_c \big) \Big). \qquad (1)$$

The process $N_{\leq 0}$ admits $t \mapsto \int_{-\infty}^{t-} h(t-x)\, N_{\leq 0}(dx)$ as an intensity on $(0,\infty)$.

$\mu$ is a positive constant.
$N_{anc}$ is a Poisson process with intensity $\lambda(t) = \mu\,\mathbb{1}_{(0,\infty)}(t)$.
$(N^X_c)_{X\in N_{anc}}$ is a sequence of independent cluster processes associated with $h$.

$$N_{>0} = N_{anc} \cup \Big( \bigcup_{X\in N_{anc}} \big( X + N^X_c \big) \Big). \qquad (2)$$

The process $N_{>0}$ admits $t \mapsto \mu + \int_0^{t-} h(t-x)\, N_{>0}(dx)$ as an intensity on $(0,\infty)$.
![Page 94: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/94.jpg)
Linear Hawkes process

Cluster decomposition of the linear Hawkes process

Proposition (Hawkes, 1974). The processes $N_{\leq 0}$ and $N_{>0}$ are independent, and $N = N_{\leq 0} \cup N_{>0}$ has intensity on $(0,\infty)$ given by
$$\lambda(t,\mathcal{F}^N_{t-}) = \mu + \int_{-\infty}^{t-} h(t-x)\, N(dx).$$
![Page 96: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/96.jpg)
Linear Hawkes process

Coming back to the conditional expectation

$\rho_{\lambda, P_0}(t,s) = \rho_{P_0}^{\mu,h}(t,s)$ (in this case) is hard to compute directly. We prefer

$$\Phi_{P_0}^{\mu,h}(t,s) = \mathbb{E}\big[\lambda(t, \mathcal{F}_{t-}^N) \,\big|\, S_{t-} \ge s\big] = \mathbb{E}\Big[\mu + \int_0^{t-} h(t-x)\, N_{>0}(dx) \,\Big|\, E_{t,s}(N_{>0})\Big] + \mathbb{E}\Big[\int_{-\infty}^{t-} h(t-v)\, N_{\le 0}(dv) \,\Big|\, E_{t,s}(N_{\le 0})\Big] = \Phi_+^{\mu,h}(t,s) + \Phi_{-,P_0}^{\mu,h}(t,s).$$

For any point process $N$ and any real numbers $s < t$, let $E_{t,s}(N) = \{N \cap (t-s, t) = \emptyset\} = \{S_{t-} \ge s\}$.
![Page 97: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/97.jpg)
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
Linear Hawkes process
Coming back to the conditional expectation
ρλ ,P0(t,s) = ρµ,hP0
(t,s) (in this case) is hard to compute directly. We prefer
Φµ,hP0
(t,s) = E[λ (t,FN
t−)∣∣∣Et,s(N)
]
= E[
µ +∫ t−
0h(t−x)N>0(dx)
∣∣∣∣Et,s(N>0)
]+E[∫ t−
−∞
h(t−v)N≤0(dv)
∣∣∣∣Et,s(N≤0)
]= Φ
µ,h+ (t,s) + Φ
µ,h−,P0
(t,s).
For any point process N and any real numbers s < t, let
Et,s(N) = {N ∩ (t− s,t) = /0}= {St− ≥ s}.
![Page 98: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/98.jpg)
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
Linear Hawkes process
Coming back to the conditional expectation
ρλ ,P0(t,s) = ρµ,hP0
(t,s) (in this case) is hard to compute directly. We prefer
Φµ,hP0
(t,s) = E[λ (t,FN
t−)∣∣∣Et,s(N)
]= E
[µ +
∫ t−
0h(t−x)N>0(dx)
∣∣∣∣Et,s(N>0)
]+E[∫ t−
−∞
h(t−v)N≤0(dv)
∣∣∣∣Et,s(N≤0)
]
= Φµ,h+ (t,s) + Φ
µ,h−,P0
(t,s).
For any point process N and any real numbers s < t, let
Et,s(N) = {N ∩ (t− s,t) = /0}= {St− ≥ s}.
![Page 99: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/99.jpg)
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
Linear Hawkes process
Coming back to the conditional expectation
ρλ ,P0(t,s) = ρµ,hP0
(t,s) (in this case) is hard to compute directly. We prefer
Φµ,hP0
(t,s) = E[λ (t,FN
t−)∣∣∣Et,s(N)
]= E
[µ +
∫ t−
0h(t−x)N>0(dx)
∣∣∣∣Et,s(N>0)
]+E[∫ t−
−∞
h(t−v)N≤0(dv)
∣∣∣∣Et,s(N≤0)
]= Φ
µ,h+ (t,s) + Φ
µ,h−,P0
(t,s).
For any point process N and any real numbers s < t, let
Et,s(N) = {N ∩ (t− s,t) = /0}= {St− ≥ s}.
![Page 100: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/100.jpg)
Linear Hawkes process

Coming back to the conditional expectation

$$\Phi_{P_0}^{\mu,h}(t,s) = \Phi_+^{\mu,h}(t,s) + \Phi_{-,P_0}^{\mu,h}(t,s).$$

No general formula is available for $\Phi_{-,P_0}^{\mu,h}$. Two cases are studied in the article:

$N_-$ is a Poisson process.

$N_-$ is a one-point process ($N_- = \{T_0\}$).
![Page 101: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/101.jpg)
Linear Hawkes process

$$\Phi_+^{\mu,h}(t,s) = \mathbb{E}\Big[\underbrace{\mu + \int_0^{t-} h(t-x)\, N_{>0}(dx)}_{\text{intensity of } N_{>0}} \,\Big|\, E_{t,s}(N_{>0})\Big] = \mu + L_s^{\mu,h}(t).$$

Lemma

Let $N$ be a linear Hawkes process with

$$\lambda(t, \mathcal{F}_{t-}^N) = g(t) + \int_0^{t-} h(t-x)\, N(dx),$$

and $\|h\|_1 < 1$. Let

$$L_s^{g,h}(x) = \mathbb{E}\Big[\int_0^x h(x-z)\, N(dz) \,\Big|\, E_{x,s}(N)\Big], \qquad G_s^{g,h}(x) = \mathbb{P}\big(E_{x,s}(N)\big),$$

for any $x, s \ge 0$. Then, for any $x, s \ge 0$,

$$L_s^{g,h}(x) = \int_0^{0 \vee (x-s)} \big(h(x-z) + L_s^{h,h}(x-z)\big)\, G_s^{h,h}(x-z)\, g(z)\, dz,$$

$$G_s^{g,h}(x) = \exp\Big(-\int_{x-s}^{x} g(v)\, dv\Big) \exp\Big(-\int_0^{x-s} \big[1 - G_s^{h,h}(x-v)\big]\, g(v)\, dv\Big).$$
![Page 107: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/107.jpg)
Linear Hawkes process

Overview

$L_s$ and $G_s$ are characterized by their implicit equations.

$\Phi_+^{\mu,h}(t,s) = \mu + L_s^{\mu,h}(t)$ and $\Phi_{-,P_0}^{\mu,h}$ (at least in two cases) are known, and so $\Phi_{P_0}^{\mu,h} = \Phi_+^{\mu,h} + \Phi_{-,P_0}^{\mu,h}$.

Recall that $\Phi_{P_0}^{\mu,h}(t,s) = \mathbb{E}\big[\lambda(t, \mathcal{F}_{t-}^N) \,\big|\, S_{t-} \ge s\big]$. Hence $\rho_{P_0}^{\mu,h}(t,s) = \mathbb{E}\big[\lambda(t, \mathcal{F}_{t-}^N) \,\big|\, S_{t-} = s\big]$ can be recovered as the derivative of $\Phi_{P_0}^{\mu,h}$.
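The differentiation step can be made explicit. Writing $\bar F(t,s) = \mathbb{P}(S_{t-} \ge s)$ for the survivor function of the age, a standard conditioning computation (a sketch, not spelled out on the slides) gives:

```latex
% Multiplying \Phi by the survivor function removes the conditioning:
\Phi^{\mu,h}_{P_0}(t,s)\,\bar F(t,s)
   = \mathbb{E}\!\left[\lambda(t,\mathcal F^N_{t-})\,
        \mathbf 1_{\{S_{t-}\ge s\}}\right]
   = \int_s^{\infty} \rho^{\mu,h}_{P_0}(t,u)\,\mathbb{P}(S_{t-}\in du).
% Differentiating both sides in s (when S_{t-} has a density) yields
\rho^{\mu,h}_{P_0}(t,s)
   = \frac{\partial_s\!\left[\Phi^{\mu,h}_{P_0}(t,s)\,\bar F(t,s)\right]}
          {\partial_s \bar F(t,s)}.
```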
![Page 108: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/108.jpg)
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
Linear Hawkes process
Overview
Ls and Gs are characterized by their implicit equations.
Φµ,h+ (t,s) = µ +L
µ,hs (t) and Φ
µ,h−,P0
(at least in two cases) are known, and
so Φµ,hP0
= Φµ,h+ + Φ
µ,h−,P0
.
Remind that Φµ,hP0
(t,s) = E[λ(t,FN
t−)∣∣St− ≥ s
],
Hence ρµ,hP0
(t,s) = E[λ(t,FN
t−)∣∣St− = s
]can be recovered as the
derivative of Φµ,hP0
.
![Page 109: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/109.jpg)
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
Linear Hawkes process
Overview
Ls and Gs are characterized by their implicit equations.
Φµ,h+ (t,s) = µ +L
µ,hs (t) and Φ
µ,h−,P0
(at least in two cases) are known, and
so Φµ,hP0
= Φµ,h+ + Φ
µ,h−,P0
.
Remind that Φµ,hP0
(t,s) = E[λ(t,FN
t−)∣∣St− ≥ s
],
Hence ρµ,hP0
(t,s) = E[λ(t,FN
t−)∣∣St− = s
]can be recovered as the
derivative of Φµ,hP0
.
![Page 110: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/110.jpg)
Introduction Point process Microscopic measure Expectation measure Coming back to our examples Summary
Linear Hawkes process
Overview
Ls and Gs are characterized by their implicit equations.
Φµ,h+ (t,s) = µ +L
µ,hs (t) and Φ
µ,h−,P0
(at least in two cases) are known, and
so Φµ,hP0
= Φµ,h+ + Φ
µ,h−,P0
.
Remind that Φµ,hP0
(t,s) = E[λ(t,FN
t−)∣∣St− ≥ s
],
Hence ρµ,hP0
(t,s) = E[λ(t,FN
t−)∣∣St− = s
]can be recovered as the
derivative of Φµ,hP0
.
![Page 111: Microscopic approach of a time elapsed neural model](https://reader031.fdocuments.net/reader031/viewer/2022011817/61d53a94d595d9606a2961d4/html5/thumbnails/111.jpg)
Summary

Microscopic system.

System in expectation.

Population-based version. No dependence between neurons.

Outlook:
- Regularity of u.
- Mean field limit. Propagation of chaos.
- Multivariate Hawkes processes with weak interaction.