Spectral Clustering for Dynamic Block Models
Sharmodeep Bhattacharyya
Department of Statistics
Oregon State University
January 23, 2017
Research Computing Seminar, OSU, Corvallis
(Joint work with Shirshendu Chatterjee, City College, CUNY)
Sharmodeep Bhattacharyya (Oregon State) Dynamic Spectral Clustering January 23, 2017 1 / 53
Outline
1 Introduction and Motivation
2 Community Detection in Networks
Community Detection Algorithms
Community Detection Algorithms: Spectral Methods
3 Features and Models of Networks
Dynamic Network Models
Spectral Clustering Methods
Theoretical Results
4 Results
Simulation Results
Real Networks: Neuroscience Example
5 Summary
Introduction and Motivation
Networks
Network | Nodes | Edges
Social network | People | Friendship/kinship
Biological network | Genes/Proteins | Interactions
Citation network | Papers | Citations
Introduction and Motivation
Network Data
G = (V, E): undirected graph with V = {v_1, …, v_n} arbitrarily labeled vertices.
Adjacency matrices (symmetric), [A_ij]_{i,j=1}^n, numerically represent network data:
A_ij = 1 if node i links to node j, and 0 otherwise.
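The 0/1 encoding above can be sketched in code; a minimal numpy version, where the helper name `adjacency_matrix` and the 4-node path graph are mine, purely for illustration:

```python
import numpy as np

def adjacency_matrix(n, edges):
    """Build the symmetric 0/1 adjacency matrix of an undirected graph.

    edges: iterable of (i, j) pairs with 0-based node indices."""
    A = np.zeros((n, n), dtype=int)
    for i, j in edges:
        A[i, j] = 1
        A[j, i] = 1  # undirected graph: A_ij = A_ji
    return A

# A small path graph on 4 nodes.
A = adjacency_matrix(4, [(0, 1), (1, 2), (2, 3)])
```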
Introduction and Motivation
Drosophila protein interactions
Guruharsha et al., “A protein complex network of Drosophila melanogaster,” Cell, 147:690–703,
2011
Experimentally measured and scored protein interactions
1612 nodes; 10,421 edges (edge density ρ = 8.0 × 10^-3)
Introduction and Motivation
Political blogs
Understanding political patterns
Adamic, Lada A., and Natalie Glance, "The political blogosphere and the 2004 US election: divided they blog," Proceedings of the 3rd International Workshop on Link Discovery, ACM, 2005.
Introduction and Motivation
Online Social Network
Figure: Facebook Social Network
Introduction and Motivation
Dynamic/Time-varying Networks
Figure: Dynamic Network Examples
Introduction and Motivation
A Motivating Example: Electro-Corticograph Array Data for Speech
Figure: a. MRI reconstruction of a single subject's brain with vSMC electrodes (dots), colored according to distance from the Sylvian fissure (black and red are the most dorsal and ventral positions, respectively). b. Expanded view of vSMC anatomy: cs, central sulcus; PoCG, post-central gyrus; PrCG, pre-central gyrus; Sf, Sylvian fissure. c-e. Top: vocal tract schematics for three consonants (/b/, /d/, /g/), produced by occlusion at the lips, tongue tip and tongue body, respectively (red arrow). Middle: spectrograms of spoken consonant-vowel syllables (Bouchard et al., Nature, 2013).
Introduction and Motivation
Other Examples of Network Data
Biological Networks:
Biochemical pathway networks
Gene transcription networks
Epidemiological Networks
Social Networks:
Academic networks such as collaboration and citation networks
Networks arising from text-mining
Technological Networks:
Internet
Cell-phone tower and telephone exchange networks
Airport and Transport Networks
Introduction and Motivation
Two Main Classes of Problems for Networks
(I) Formation of networks given information on vertices as data.
(II) Inference on networks given complete network with node and edge
structure as data.
Introduction and Motivation
Commonly Asked Questions
Community Detection.
Link Prediction.
Covariate or Latent Variable Estimation.
Sampling of nodes and subgraphs.
Dynamic network inference and information exchange in networks.
Most of these questions can be answered by performing inference on
network models.
Introduction and Motivation
Community in Networks
Type | Definition | How to Find
Topological | Nodes within a community have, on average, more edges among themselves than with nodes outside the community | Community detection algorithms proposed by statisticians, computer scientists, and mathematicians
Physical | Nodes or edges within a community share some property | Verified by scientists
Community Detection in Networks Community Detection Algorithms
Community Detection Algorithms
Popular algorithms for community detection include:
1 Modularity maximizing methods (Newman and Girvan (2006)).
2 Spectral clustering based methods (McSherry (2001)).
3 Likelihood and approximate likelihood maximization:
(a) Profile likelihood maximization (Bickel and Chen (2009)).
(b) Variational likelihood maximization (Celisse et al. (2011)).
(c) Pseudo-likelihood maximization (Chen et al. (2012)).
Community Detection in Networks Community Detection Algorithms: Spectral Methods
General Spectral Clustering Algorithm
Community Detection in Networks Community Detection Algorithms: Spectral Methods
Well-known Examples of M_n
For community identification in networks, there are some well-known operators M_n:
Adjacency matrix M_n = A (Sussman et al. (2012)).
Normalized Laplacian matrices M_n = L_n^rw = D^{-1}L_n and L_n^sym = D^{-1/2}L_n D^{-1/2}, with L_n = D − A_n (Rohe et al. (2011)).
Although these operators perform well in regime (a), they fail to perform well in regimes (b) and (c) described previously.
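A minimal numpy sketch of the two normalized Laplacians named above, assuming no isolated nodes so that D is invertible; the triangle graph and the helper name `laplacian_operators` are mine, for illustration only:

```python
import numpy as np

def laplacian_operators(A):
    """Return (L_rw, L_sym) for an adjacency matrix A with no isolated nodes."""
    deg = A.sum(axis=1).astype(float)
    L = np.diag(deg) - A                      # unnormalized Laplacian L_n = D - A_n
    L_rw = np.diag(1.0 / deg) @ L             # L_n^rw  = D^{-1} L_n
    D_half = np.diag(1.0 / np.sqrt(deg))
    L_sym = D_half @ L @ D_half               # L_n^sym = D^{-1/2} L_n D^{-1/2}
    return L_rw, L_sym

A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)  # triangle graph
L_rw, L_sym = laplacian_operators(A)
```

Both operators share the same spectrum up to a similarity transform; the rows of L_rw sum to zero, and L_sym is symmetric with smallest eigenvalue 0.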
Community Detection in Networks Community Detection Algorithms: Spectral Methods
M_n for Sparse Networks
For community identification in sparse networks, there are some regularized variations of L_n or A_n:
Regularized adjacency matrix A_τ = A + τ11^T, where 1 is the all-ones vector of length n (Amini et al. (2012)).
Regularized Laplacian matrix L_n^τ = (D + τI)^{-1/2} A (D + τI)^{-1/2} (Chaudhuri et al. (2012)).
Trimmed adjacency matrix A_τ, where high-degree nodes are trimmed (Coja-Oghlan (2010)).
The theoretical performance of the first two regularized operators for sparse networks is under investigation.
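A sketch of the first two regularized operators; the function name is mine, and the choice τ = 0.5 is illustrative (the slide does not prescribe how to tune τ):

```python
import numpy as np

def regularized_operators(A, tau):
    """Return (A_tau, L_tau): the regularized adjacency and Laplacian above."""
    n = A.shape[0]
    A_tau = A + tau * np.ones((n, n))          # A + tau * 1 1^T
    deg = A.sum(axis=1).astype(float)
    D_tau = np.diag(1.0 / np.sqrt(deg + tau))
    L_tau = D_tau @ A @ D_tau                  # (D + tau I)^{-1/2} A (D + tau I)^{-1/2}
    return A_tau, L_tau

A = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)  # node 2 is isolated
A_tau, L_tau = regularized_operators(A, tau=0.5)
```

Note that τ > 0 keeps (D + τI) invertible even with isolated or very low-degree nodes, which is exactly the sparse regime these operators target.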
Features and Models of Networks Dynamic Models
Dynamic Network Models: A Myopic Review
Dynamic time-evolving formation of networks: Barabási and Albert (1999) and a large literature.
Extensions of static network models:
Latent space models: Sarkar and Moore (2005), Sewell and Chen (2014).
Mixed membership block models: Xing et al. (2010), Ho et al. (2011).
Random dot-product models: Tang et al. (2013).
Stochastic block models: Xu et al. (2013), Ghasemian et al. (2015).
Graphon models: Crane (2015).
Bayesian models: Ho et al. (2011), Durante et al. (2014).
Features and Models of Networks Dynamic Models
Nonparametric Latent Variable Models
Derived from the representation of exchangeable random infinite arrays by Aldous and Hoover (1983).
NP Model: define P({A_ij}_{i,j=1}^n) conditionally given latent variables {ξ_i}_{i=1}^n associated with the vertices {v_i}_{i=1}^n, respectively (Bickel and Chen (2009), Bollobás et al. (2007), Hoff et al. (2002)):
ξ_1, …, ξ_n ∼ iid U(0, 1),
Pr(A_ij = 1 | ξ_i = u, ξ_j = v) = h_n(u, v) = ρ_n w(u, v),
where w(u, v) is the conditional latent variable density given A_ij = 1.
Define λ_n ≡ nρ_n as the expected degree parameter and P = [P_ij]_{i,j=1}^n = [ρ_n w(ξ_i, ξ_j)]_{i,j=1}^n.
h_n is not uniquely defined: h_n(φ(u), φ(v)), with measure-preserving φ, gives the same model.
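A minimal simulation of this latent variable model; the kernel w below is a hypothetical bounded choice (so that ρ_n·w ≤ 1), not one from the talk, and the function name is mine:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_latent_variable_model(n, rho, w):
    """Draw one network: xi_i ~ U(0,1) iid, then edges with P_ij = rho * w(xi_i, xi_j)."""
    xi = rng.uniform(size=n)                       # latent positions
    P = rho * w(xi[:, None], xi[None, :])          # edge probability matrix
    A = np.triu((rng.uniform(size=(n, n)) < P).astype(int), k=1)
    return A + A.T, xi                             # symmetric, no self-loops

# Hypothetical kernel, chosen only for illustration: rho * w <= 0.75 <= 1.
w = lambda u, v: 0.5 + u * v
A, xi = sample_latent_variable_model(200, rho=0.5, w=w)
```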
Features and Models of Networks Dynamic Models
Stochastic Block Model (Holland, Laskey and Leinhardt 1983)
A K-block stochastic block model with parameters (π, P) is defined as follows. Consider latent variables corresponding to the vertices, z = (z_1, z_2, …, z_n), with
z_1, …, z_n ∼ iid Multinomial(1; (π_1, …, π_K)),
Pr(A_ij = 1 | z_i, z_j) = P_{z_i z_j},
where P = [P_ab] is a K × K symmetric matrix for undirected networks.
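Sampling from the K-block SBM above can be sketched as follows; the parameters π and P are illustrative, and the function name is mine:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_sbm(n, pi, P):
    """Draw (A, z) from a K-block SBM with parameters (pi, P)."""
    z = rng.choice(len(pi), size=n, p=pi)          # block labels z_i
    probs = P[z[:, None], z[None, :]]              # Pr(A_ij = 1 | z) = P_{z_i z_j}
    A = np.triu((rng.uniform(size=(n, n)) < probs).astype(int), k=1)
    return A + A.T, z                              # symmetric, no self-loops

pi = np.array([0.5, 0.5])
P = np.array([[0.5, 0.1], [0.1, 0.5]])             # illustrative assortative matrix
A, z = sample_sbm(300, pi, P)
```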
Features and Models of Networks Dynamic Models
Dynamic Nonparametric Latent Variable Models
We now introduce a time component into the exchangeable model. The most general version of the model becomes
ξ_i^0 ∼ iid U(0, 1),   (1)
ξ_i^t | (ξ_i^{t−1} = u) ∼ iid F(u),   (2)
P(A_ij^(t) = 1 | ξ_i^t = u, ξ_j^t = v, A_ij^(t−1) = z) = h_n(u, v, z, t) = ρ_n w(u, v, z, t),   (3)
where F is a univariate distribution, 0 ≤ h_n ≤ 1, and 0 ≤ t ≤ T is the time variable.
Random re-wiring mechanism: h_n depends on both t and z (Harry Crane, 2015).
Evolving communities: h_n depends on (u, v) only, and F is non-trivial (Ghasemian et al., 2015).
Features and Models of Networks Dynamic Models
Dynamic Stochastic Block Model (DSBM)
Specialize to the Dynamic Stochastic Block Model with parameters (π, B) and latent variables z:
z_1, …, z_n ∼ iid Mult(1; (π_1, …, π_K)),   (4)
P(A_ij^(t) = 1 | z_i, z_j) = B_{z_i z_j}^(t),   (5)
where each B^(t) = [B_ab^(t)] is a K × K symmetric matrix for undirected networks, and 0 ≤ t ≤ T is the time variable.
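A minimal DSBM simulator: the labels z are drawn once and held fixed while B^(t) may vary with t. The specific B^(t) sequence below is illustrative, and the function name is mine:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_dsbm(n, pi, B_list):
    """Draw T snapshots sharing one label vector z; B_list holds B^(1), ..., B^(T)."""
    z = rng.choice(len(pi), size=n, p=pi)
    A_list = []
    for B in B_list:                               # connectivity may change with t
        probs = B[z[:, None], z[None, :]]
        A = np.triu((rng.uniform(size=(n, n)) < probs).astype(int), k=1)
        A_list.append(A + A.T)
    return A_list, z

# Illustrative B^(t): a fixed assortative shape whose overall density drifts in t.
B_list = [np.array([[0.6, 0.1], [0.1, 0.6]]) * (0.5 + 0.1 * t) for t in range(5)]
A_list, z = sample_dsbm(200, np.array([0.5, 0.5]), B_list)
```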
Features and Models of Networks Dynamic Models
Dynamic Degree Corrected Block Model (DDCBM)
Specialize to the Dynamic Degree Corrected Block Model with parameters (π, B, ψ) and latent variables z:
z_1, …, z_n ∼ iid Mult(1; (π_1, …, π_K)),   (6)
P(A_ij^(t) = 1 | z_i, z_j, ψ) = ψ_i ψ_j B_{z_i z_j}^(t),   (7)
where each B^(t) = [B_ab^(t)] is a K × K symmetric matrix for undirected networks, and 0 ≤ t ≤ T is the time variable.
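The same sketch extends to the DDCBM by scaling with the degree parameters ψ. Here ψ is drawn uniformly for illustration (the talk does not specify a distribution), and probabilities are clipped so that ψ_i ψ_j B stays in [0, 1]:

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_ddcbm(n, pi, B_list, psi):
    """DDCBM sketch: per-node degree parameters psi scale the block probabilities."""
    z = rng.choice(len(pi), size=n, p=pi)
    scale = np.outer(psi, psi)                     # psi_i * psi_j
    A_list = []
    for B in B_list:
        probs = np.clip(scale * B[z[:, None], z[None, :]], 0.0, 1.0)
        A = np.triu((rng.uniform(size=(n, n)) < probs).astype(int), k=1)
        A_list.append(A + A.T)
    return A_list, z

psi = rng.uniform(0.5, 1.0, size=100)              # illustrative degree parameters
B_list = [np.array([[0.6, 0.1], [0.1, 0.6]])] * 3
A_list, z = sample_ddcbm(100, np.array([0.5, 0.5]), B_list, psi)
```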
Features and Models of Networks Spectral Methods
Dynamic Spectral Clustering Algorithms
Features and Models of Networks Theory
First Method: In Detail
In the first method, we sum the adjacency matrices to obtain
A = ∑_{t=1}^T A^(t).
We obtain the leading K eigenvectors of A, corresponding to its largest eigenvalues. Suppose U ∈ R^{n×K} contains those eigenvectors as columns.
We then use a (1 + ε)-approximate k-means clustering algorithm to obtain Ẑ ∈ M_{n,K} and Θ̂ ∈ R^{K×K} such that
‖Ẑ Θ̂ − U‖_F^2 ≤ (1 + ε) min_{Z ∈ M_{n,K}, Θ ∈ R^{K×K}} ‖ZΘ − U‖_F^2.
Ẑ is the estimate of Z = (z_1, …, z_n) from this method.
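The steps above can be sketched as follows. Plain Lloyd k-means with deterministic farthest-point initialization stands in for the (1 + ε)-approximate k-means step, so this is an illustration rather than the exact algorithm analyzed; the names `lloyd_kmeans` and `method_one` are mine:

```python
import numpy as np

def lloyd_kmeans(X, K, iters=50):
    """Plain Lloyd k-means on the rows of X with farthest-point initialization."""
    centers = [X[0]]
    for _ in range(1, K):
        d2 = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[d2.argmax()])             # next center: farthest row
    centers = np.array(centers)
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for k in range(K):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    return labels

def method_one(A_list, K):
    """Sum the adjacency snapshots, take leading K eigenvectors, cluster the rows."""
    A = sum(A_list).astype(float)                  # A = sum_t A^(t)
    vals, vecs = np.linalg.eigh(A)
    U = vecs[:, np.argsort(-np.abs(vals))[:K]]     # leading K eigenvectors of A
    return lloyd_kmeans(U, K)
```

On a planted example with two disjoint cliques, the rows of U collapse to one point per community, so the clustering step recovers the blocks exactly.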
Features and Models of Networks Theory
First Method: Consistency of Ẑ
The adjacency matrices A are generated from the DSBM with n nodes and K communities, with parameters (π, {B^(t)}_{t=1}^T).
Let γ_n be the smallest non-zero singular value of P.
Let d := max_{k,l ∈ [K], t ∈ [T]} B_{k,l}^(t) · n be the maximum expected degree of a node at any time.
Features and Models of Networks Theory
First Method: Consistency of Ẑ
Theorem. Let A be generated from the DSBM. Suppose γ_n is large enough so that (K/γ_n^2) max{Td, log^2 n/(Td)} = o(1). For any ε, c > 0, there is a constant C = C(ε, c) > 0 such that if Ẑ is the estimate of Z as described in Algorithm 1, and if f_i, i ∈ [K], is the fraction of nodes belonging to C_i which are misclassified in Ẑ, then
∑_{i=1}^K f_i ≤ C (K/γ_n^2) max{Td, log^2 n/(Td)}
with probability at least 1 − n^{−c}.
Features and Models of Networks Theory
First Method: Consistency of Ẑ
Corollary. In the special case of the Theorem when
(i) the minimum eigenvalue of (n/d) B^(t) is positive and uniformly bounded away from zero for all t ∈ [T], and
(ii) the community sizes are balanced, i.e. n_max/n_min = O(1),
consistency holds for Ẑ if either
Td ≥ log n and K = o(Td), or
(log n)^{2/3} ≪ Td < log n and K = o((Td)^3/(log n)^2).
Features and Models of Networks Theory
Algorithm 2: In Detail
In the second method, we sum the squares of the adjacency matrices to obtain A^[2], and then subtract its diagonal to obtain ⟨A^[2]⟩:
A^[2] := ∑_{t=1}^T (A^(t))^2,   ⟨A^[2]⟩ := ∑_{t=1}^T ⟨(A^(t))^2⟩.
We obtain the leading K eigenvectors of ⟨A^[2]⟩, corresponding to its largest eigenvalues. Suppose U ∈ R^{n×K} contains those eigenvectors as columns.
We then use a (1 + ε)-approximate K-means clustering algorithm to obtain Ẑ ∈ M_{n,K} and Θ̂ ∈ R^{K×K} such that
‖Ẑ Θ̂ − U‖_F^2 ≤ (1 + ε) min_{Z ∈ M_{n,K}, Θ ∈ R^{K×K}} ‖ZΘ − U‖_F^2.
Ẑ is the estimate of Z = (z_1, …, z_n) from this method.
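A sketch of the second method. The helper `two_farthest_clusters` is a hypothetical stand-in (mine, for K = 2 only) for the (1 + ε)-approximate K-means step, and `method_two` is my name for the routine:

```python
import numpy as np

def two_farthest_clusters(U, K=2):
    # Minimal stand-in for the approximate k-means step: assign each row to the
    # nearer of two far-apart anchor rows (illustration only, K = 2).
    c0 = U[0]
    d0 = ((U - c0) ** 2).sum(axis=1)
    c1 = U[d0.argmax()]
    d1 = ((U - c1) ** 2).sum(axis=1)
    return (d1 < d0).astype(int)

def method_two(A_list, K, cluster=two_farthest_clusters):
    """Sum the squared snapshots, zero the diagonal, then spectrally cluster."""
    S = sum(A.astype(float) @ A.astype(float) for A in A_list)  # A^[2] = sum_t (A^(t))^2
    np.fill_diagonal(S, 0.0)                                    # <A^[2]> removes the diagonal
    vals, vecs = np.linalg.eigh(S)
    U = vecs[:, np.argsort(-np.abs(vals))[:K]]                  # leading K eigenvectors
    return cluster(U, K)
```

Zeroing the diagonal after summing the per-snapshot squares is equivalent to summing the per-snapshot squares with their diagonals removed, since the diagonal entries add up.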
Features and Models of Networks Theory
Consistency of Ẑ
In order to prove consistency of Ẑ, we need some notation and observations. Let
B^[2] := ∑_{t=1}^T (ΔB^(t)Δ)^2,   P^[2] := ∑_{t=1}^T (P^(t))^2 = Z Δ^{−1} [∑_{t=1}^T (ΔB^(t)Δ)^2] Δ^{−1} Z^T.   (8)
The main assumption about the connection probabilities that we need is:
at least one B^(t), t ∈ [T], must be nonsingular.   (9)
Features and Models of Networks Theory
More Notation and Conditions for Consistency of Ẑ
A is generated from the DSBM with n nodes and K communities and the parameters (π, {B^(t)}_{t=1}^T).
Let γ_n be the smallest non-zero singular value of P^[2].
Let d := max_{k,l ∈ [K], t ∈ [T]} B_{k,l}^(t) · n be the maximum expected degree of a node at any time.
Features and Models of Networks Theory
Second Method: Consistency of Ẑ
Theorem. Let A be generated from the DSBM satisfying assumption (9). Suppose γ_n is large enough so that (K/γ_n^2)(Td^3 (1 ∨ T^{−1}d^{−1} log n) + log^{10} n) = o(1). For any ε, c > 0, there is a constant C = C(ε, c) > 0 such that if Ẑ is the estimate of Z as described in Algorithm 2, and if f_i, i ∈ [K], is the fraction of nodes belonging to C_i which are misclassified in Ẑ, then
∑_{i=1}^K f_i ≤ C K [Td^3 (1 ∨ T^{−1}d^{−1} log n) + (Td^2 log^2 n ∨ log^{10} n) ∧ (Td^2 ∨ log^{12} n)] / γ_n^2
with probability at least 1 − 4n^{−c}.
Features and Models of Networks Theory
Second Method: Consistency of Ẑ
Corollary. In the special case of the Theorem when
(i) the number of nonsingular matrices among {(n/d) B^(t) : t ∈ [T]} (whose singular values are bounded away from 0 uniformly) grows faster than max{d^{−2} log^5 n, √(T/d)}, and
(ii) the community sizes are balanced, i.e. n_max/n_min = O(1),
consistency holds for Ẑ.
Features and Models of Networks Theory
Algorithm 3: Spherical Spectral Clustering
In the third method, obtain the sum of the squared adjacency matrices without its diagonal, ⟨A^[2]⟩ := ∑_{t=1}^T ⟨(A^(t))^2⟩.
Obtain U ∈ R^{n×K} consisting of the leading K eigenvectors of ⟨A^[2]⟩, corresponding to its largest absolute eigenvalues.
Let n'_+ be the number of nonzero rows of U. Obtain U^+ ∈ R^{n'_+ × K} consisting of the normalized nonzero rows of U, i.e. U^+_{i,*} = U_{i,*}/‖U_{i,*}‖_2 for i such that ‖U_{i,*}‖_2 > 0.
Use a (1 + ε)-approximate K-median clustering algorithm on the row vectors of U^+ to obtain Ẑ^+ ∈ M_{n'_+,K} and X̂ ∈ R^{K×K}.
Extend Ẑ^+ to obtain Ẑ by (arbitrarily) adding n − n'_+ canonical unit row vectors at the end, e.g. Ẑ_i = (1, 0, …, 0) for i such that ‖U_{i,*}‖_2 = 0.
Ẑ is the estimate of Z.
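The row-normalization step that distinguishes Algorithm 3 can be sketched as follows (the K-median clustering and the extension of Ẑ^+ to Ẑ then proceed as described above); the name `spherical_step` and the 3-row example matrix are mine:

```python
import numpy as np

def spherical_step(U):
    """Normalize the nonzero rows of U to the unit sphere (Algorithm 3's U^+).

    Returns the normalized rows and the boolean mask of the n'_+ nonzero rows."""
    norms = np.linalg.norm(U, axis=1)
    keep = norms > 0                               # rows kept for clustering
    U_plus = U[keep] / norms[keep, None]           # U^+_{i,*} = U_{i,*} / ||U_{i,*}||_2
    return U_plus, keep

U = np.array([[3.0, 4.0], [0.0, 0.0], [1.0, 0.0]])
U_plus, keep = spherical_step(U)
```

Projecting rows to the sphere discards the row-norm information, which under the DDCBM absorbs the degree parameters ψ, so clustering on U^+ targets the block labels rather than the degrees.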
Results Simulation
Simulation Results: DSBM
Figure: (a) Sparse network, λ_n = 3. (b) Dense network, λ_n = 8.
Results Simulation
Simulation Results: DSBM
Figure: Dense network, λ_n = 10, with B nearly singular.
Results Simulation
Simulation Results: DDCBM
Figure: Dense networks: (a) B nearly singular; (b) B non-singular.
Results Real Networks
Neuroscience ECoG Example
Figure: Clustering of the network correctly identifies the lip region (upper right hand part of
the vSMC) involved in the production of /b/, which engages the lips. (a): Location of Electrode
Clusters based on BolBO-based graph Estimation (b): Organization of articulator
representations in the vSMC (black: larynx; red: lips; blue: tongue; green: jaw). (c): Estimated
graph of electrodes.
Conclusion
Summary and Future Works
Summary:
We consider two methods of spectral clustering for the dynamic SBM.
We give theoretical justification for each method.
Works in progress:
Extension to more general dynamic SBMs.
Extension of the dynamic models.
Conclusion
Future Problems in Networks
Methodological:
Detection of dynamic communities.
Detection of communities in the presence of covariates.
Comparison of networks and communities for multiple networks.
Theoretical:
Conditions for community recovery for general K and connectivity matrices.
Conditions for community recovery for dynamic networks.
Conditions for community recovery for networks with covariate information.