Transcript of:
Graph Sparsification III: Ramanujan Graphs, Lifts, and Interlacing Families
Nikhil Srivastava
Microsoft Research India
Simons Institute, August 27, 2014
The Last Two Lectures

Lecture 1. Every weighted undirected G has a weighted subgraph H with cn log n / ε² edges which satisfies
    L_G ⪯ L_H ⪯ (1 + ε) L_G.
(random sampling)

Lecture 2. Improved this to 4n/ε² edges. (greedy)

Suboptimal for K_n in two ways: weights, and 2n/ε².
Good Sparsifiers of K_n

G = K_n, |E_G| = O(n²);  H = (random d-regular graph) × (n/d), |E_H| = O(dn):
    L_{K_n}(1 − ε) ⪯ L_H ⪯ L_{K_n}(1 + ε),  i.e.  n(1 − ε) ⪯ L_H ⪯ n(1 + ε).

Regular Unweighted Sparsifiers of K_n

Rescale the weights back to 1: H = random d-regular graph, |E_H| = dn/2. Then [Friedman'08]:
    d − 2√(d−1) − o(1) ⪯ L_H ⪯ d + 2√(d−1) + o(1).

We will try to match this.
Why do we care so much about K_n?

Unweighted d-regular approximations of K_n are called expanders. They behave like random graphs:
    the right number of edges across cuts
    fast mixing of random walks
Prototypical “pseudorandom object”. Many uses in CS and math (routing, coding, complexity, …).
Switch to Adjacency Matrix

Let G be a graph and A be its adjacency matrix, with eigenvalues λ₁ ≥ λ₂ ≥ ⋯ ≥ λₙ.

    A = [ 0 1 0 0 1
          1 0 1 0 1
          0 1 0 1 0
          0 0 1 0 1
          1 1 0 1 0 ]      (vertices a, b, c, d, e)

    L = dI − A

If d-regular, then A𝟙 = d𝟙, so λ₁ = d (the “trivial” eigenvalue).
If bipartite, the eigenvalues are symmetric about zero, so λₙ = −d.
Spectral Expanders

Definition: G is a good expander if all non-trivial eigenvalues are small (clustered near 0 inside [−d, d]).

e.g. K_{d+1} and K_{d,d} have all nontrivial eigenvalues near 0 (−1's for K_{d+1}, exactly 0 for K_{d,d}).

Challenge: construct infinite families.

Alon-Boppana'86: Can't beat [−2√(d−1), 2√(d−1)].
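As a quick numerical check of the claim about K_{d,d} (my addition, not from the talk; assumes numpy is available), we can compute the adjacency spectrum directly:

```python
import numpy as np

# Spectrum of the complete bipartite graph K_{d,d}: a d-regular graph
# whose only nonzero eigenvalues are the "trivial" ones, +d and -d.
d = 4
J = np.ones((d, d))
A = np.block([[np.zeros((d, d)), J],
              [J, np.zeros((d, d))]])

eigs = np.sort(np.linalg.eigvalsh(A))
print(np.round(eigs, 8))   # smallest = -d, largest = +d, the rest are 0
```

The ±d pair also illustrates the bipartite symmetry of the spectrum mentioned above.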
The Meaning of 2√(d−1)

The infinite d-ary tree T has spectrum σ(A_T) = [−2√(d−1), 2√(d−1)].

Alon-Boppana'86: This is the best possible spectral expander.
Ramanujan Graphs

Definition: G is Ramanujan if all non-trivial eigenvalues have absolute value at most 2√(d−1).

Friedman'08: A random d-regular graph is almost Ramanujan: nontrivial eigenvalues at most 2√(d−1) + o(1).

Margulis, Lubotzky-Phillips-Sarnak'88: Infinite sequences of Ramanujan graphs exist for d = p + 1, p prime.

What about d ≠ p + 1?
[Marcus-Spielman-S'13]

Theorem. Infinite families of bipartite Ramanujan graphs exist for every d ≥ 3.

Proof is elementary, doesn't use number theory. Not explicit.

Based on a new existence argument: the method of interlacing families of polynomials, built around expected characteristic polynomials such as
    𝔼_s p_{A_s}(x) = Σ_{i≥0} (−1)^i m_i x^(n−2i).
Bilu-Linial'06 Approach

Find an operation which doubles the size of a graph without blowing up its eigenvalues: keep all new eigenvalues inside [−2√(d−1), 2√(d−1)], then repeat.
2-Lifts of Graphs

Duplicate every vertex: a becomes a₀ and a₁, b becomes b₀ and b₁, and so on.

For every edge of G there is now a pair of edges; for each pair:
    leave both on their own side (parallel), or make both cross.

2^m possibilities.
Eigenvalues of 2-Lifts (Bilu-Linial)

Given a 2-lift of G, create a signed adjacency matrix A_s with a −1 for crossing edges and a +1 for parallel edges:

    A_s = [  0 -1  0  0  1
            -1  0  1  0  1
             0  1  0 -1  0
             0  0 -1  0  1
             1  1  0  1  0 ]

Theorem: The eigenvalues of the 2-lift are the union of
    λ₁, …, λₙ = eigs(A)   (old)   and   λ₁′, …, λₙ′ = eigs(A_s)   (new).

Conjecture: Every d-regular adjacency matrix A has a signing A_s with ||A_s|| ≤ 2√(d−1).

Bilu-Linial'06: This is true with O(√(d log³ d)).

We prove the bipartite case:

Theorem: Every d-regular adjacency matrix A has a signing A_s with λ₁(A_s) ≤ 2√(d−1).

Theorem: Every d-regular bipartite adjacency matrix A has a signing A_s with ||A_s|| ≤ 2√(d−1).

Trick: eigenvalues of bipartite graphs are symmetric about 0, so we only need to bound the largest.
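The union-of-spectra theorem is easy to verify numerically. This sketch (my addition, not from the talk; assumes numpy) builds the 2-lift for the slide's example signing and compares spectra:

```python
import numpy as np

# Bilu-Linial: eigenvalues of a 2-lift = eigenvalues of A (old) together
# with eigenvalues of the signed matrix A_s (new).  Example graph and
# signing taken from the slides (-1 = crossing edge, +1 = parallel edge).
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 1],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 1, 0, 1, 0]], dtype=float)
As = np.array([[0, -1, 0, 0, 1],
               [-1, 0, 1, 0, 1],
               [0, 1, 0, -1, 0],
               [0, 0, -1, 0, 1],
               [1, 1, 0, 1, 0]], dtype=float)

Ap = (A + As) / 2          # parallel edges: stay on their own side
Am = (A - As) / 2          # crossing edges: go to the other copy
lift = np.block([[Ap, Am],
                 [Am, Ap]])   # adjacency matrix of the 2-lift

old_new = np.sort(np.concatenate([np.linalg.eigvalsh(A),
                                  np.linalg.eigvalsh(As)]))
lifted = np.sort(np.linalg.eigvalsh(lift))
print(np.allclose(old_new, lifted))   # True
```

The block structure [[Ap, Am], [Am, Ap]] is exactly why the spectrum splits: the symmetric and antisymmetric eigenvectors of the lift recover A = Ap + Am and A_s = Ap − Am.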
Random Signings

Idea 1: Choose s ∈ {−1, 1}^m randomly. Unfortunately, this does not achieve the bound in general. (Bilu-Linial showed it does when A is nearly Ramanujan.)

Idea 2: Consider the characteristic polynomials
    p_{A_s}(x) = det(xI − A_s),   s ∈ {±1}^m,
and observe that 𝔼_s p_{A_s}(x) is their coefficient-wise average. Usually useless, but not here: {p_{A_s}} is an interlacing family.
3-Step Proof Strategy

1. Show that some polynomial does as well as the expected polynomial: find s such that λ_max(p_{A_s}) ≤ λ_max(𝔼_s p_{A_s}).
   (NOT the statement λ_max(p_{A_s}) ≤ 𝔼_s λ_max(p_{A_s}); the comparison is to the largest root of the averaged polynomial itself.)
2. Calculate the expected polynomial.
3. Bound the largest root of the expected polynomial.
Step 2: The Expected Polynomial

Theorem [Godsil-Gutman'81]: For any graph G,
    𝔼_s det(xI − A_s) = μ_G(x),
the matching polynomial of G.

The Matching Polynomial (Heilmann-Lieb '72)

    μ_G(x) = Σ_{i≥0} (−1)^i m_i x^(n−2i),

where m_i = the number of matchings with i edges. (In the slide's example: one matching with 0 edges, 7 matchings with 1 edge, …)
Proof that 𝔼_s det(xI − A_s) = μ_G(x)

    xI − A_s = [  x ±1  0  0 ±1 ±1
                 ±1  x ±1  0  0  0
                  0 ±1  x ±1  0  0
                  0  0 ±1  x ±1  0
                 ±1  0  0 ±1  x ±1
                 ±1  0  0  0 ±1  x ]

Expand the determinant using permutations:
    the two entries of the same edge carry the same ±1 value;
    a permutation contributes 0 if it hits any off-diagonal 0;
    a term averages to 0 if it takes just one entry for any edge;
    so the only permutations that survive are involutions, which correspond to matchings.
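The Godsil-Gutman identity can be checked by brute force on a tiny graph. This sketch (my addition, not from the talk; assumes numpy) averages det(xI − A_s) over all 2³ signings of the triangle K₃:

```python
import numpy as np
from itertools import product

# Godsil-Gutman: averaging det(xI - A_s) over all 2^m signings of the
# triangle K_3 gives its matching polynomial mu(x) = x^3 - 3x
# (one empty matching, three 1-edge matchings, no larger matchings).
edges = [(0, 1), (1, 2), (0, 2)]
n = 3
avg = np.zeros(n + 1)
for signs in product([-1, 1], repeat=len(edges)):
    As = np.zeros((n, n))
    for (i, j), s in zip(edges, signs):
        As[i, j] = As[j, i] = s
    avg += np.poly(As)          # coefficients of det(xI - A_s)
avg /= 2 ** len(edges)

print(np.round(avg, 8))   # [1, 0, -3, 0], i.e. x^3 - 3x
```

Each signed charpoly here is x³ − 3x − 2·(product of signs); the odd product term cancels in the average, exactly as the involution argument predicts.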
The Matching Polynomial (Heilmann-Lieb '72)

Theorem (Heilmann-Lieb): All the roots of μ_G(x) are real, and (for G of maximum degree d) have absolute value at most 2√(d−1).

The number 2√(d−1) comes from comparing to the infinite d-ary tree [Godsil].
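A small sanity check of the Heilmann-Lieb bound (my addition, not from the talk; assumes numpy), counting matchings of the 6-cycle by brute force:

```python
import numpy as np
from itertools import combinations

# Heilmann-Lieb: all roots of the matching polynomial are real, with
# absolute value at most 2*sqrt(d-1).  Check on the 6-cycle (d = 2).
n, d = 6, 2
edges = [(i, (i + 1) % n) for i in range(n)]

def is_matching(es):
    verts = [v for e in es for v in e]
    return len(verts) == len(set(verts))   # no repeated endpoint

# mu(x) = sum_i (-1)^i m_i x^(n-2i); m_i = number of i-edge matchings
coeffs = np.zeros(n + 1)
for k in range(n // 2 + 1):
    m_k = sum(is_matching(es) for es in combinations(edges, k))
    coeffs[2 * k] += (-1) ** k * m_k   # coefficient of x^(n-2k)

roots = np.roots(coeffs)
print(np.max(np.abs(roots.imag)) < 1e-8)                  # True: all real
print(np.max(np.abs(roots)) <= 2 * np.sqrt(d - 1) + 1e-8) # True: <= 2
```

Here μ(x) = x⁶ − 6x⁴ + 9x² − 2, whose largest root is √(2+√3) ≈ 1.93, just inside the bound 2√(d−1) = 2.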
3-Step Proof Strategy

1. Show that some polynomial does as well as the expected one: find s such that λ_max(p_{A_s}) ≤ λ_max(𝔼_s p_{A_s}).
2. Calculate the expected polynomial. [Godsil-Gutman'81]
3. Bound the largest root of the expected polynomial. [Heilmann-Lieb'72]
Step 1 (some p_{A_s} does as well as the expected polynomial) is implied by:
    “{p_{A_s}} is an interlacing family.”
Averaging Polynomials

Basic Question: Given real-rooted polynomials p₁, …, p_k, when are the roots of Σᵢ pᵢ related to the roots of the pᵢ?

Answer: Certainly not always…

    p(x) = (x − 1)(x − 2) = x² − 3x + 2
    q(x) = (x − 3)(x − 4) = x² − 7x + 12
    ½ p(x) + ½ q(x) = x² − 5x + 7 = (x − 2.5 − (√3/2)i)(x − 2.5 + (√3/2)i)
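The counterexample is one line of numpy (my addition, not from the talk): averaging the coefficient arrays of two real-rooted quadratics produces complex roots.

```python
import numpy as np

# Averaging real-rooted polynomials can destroy real-rootedness:
# p = (x-1)(x-2) and q = (x-3)(x-4) are real-rooted, but their average
# (p+q)/2 = x^2 - 5x + 7 has roots 2.5 +- (sqrt(3)/2) i.
p = np.array([1, -3, 2])    # x^2 - 3x + 2
q = np.array([1, -7, 12])   # x^2 - 7x + 12
avg = (p + q) / 2           # x^2 - 5x + 7

print(np.roots(avg))        # complex conjugate pair
```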
But sometimes it works.

A Sufficient Condition

Answer: when the polynomials have a common interlacing.

Definition: g(x) = Π_{i=1}^{n−1} (x − αᵢ) interlaces f(x) = Π_{i=1}^{n} (x − βᵢ) if β₁ ≤ α₁ ≤ β₂ ≤ ⋯ ≤ α_{n−1} ≤ βₙ. Polynomials f₁, …, f_k have a common interlacing if a single g interlaces all of them.

Theorem: If f₁, …, f_k have a common interlacing, then some fᵢ satisfies λ_max(fᵢ) ≤ λ_max(Σⱼ fⱼ).

Proof sketch: Let α_{n−1} be the largest root of the common interlacer and r = λ_max(Σⱼ fⱼ) ≥ α_{n−1}. Since Σⱼ fⱼ(r) = 0, some fᵢ has fᵢ(r) ≥ 0; but each fᵢ has exactly one root in [α_{n−1}, ∞) and is negative just below it, so fᵢ(r) ≥ 0 forces λ_max(fᵢ) ≤ r.
Interlacing Family of Polynomials

Definition: {f_s} is an interlacing family if the f_s can be placed on the leaves of a tree so that every internal node is the sum of the leaves below it, and every set of siblings has a common interlacing.

Theorem: There is an s so that λ_max(f_s) ≤ λ_max(Σ_s f_s).

Proof: By the common interlacing among the root's children, one child g has λ_max(g) ≤ λ_max(Σ_s f_s); recurse down the tree until reaching a leaf.

An Interlacing Family

Theorem: Let p_{A_s}(x) = det(xI − A_s). Then {p_{A_s}}_{s ∈ {±1}^m} is an interlacing family.
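The existence guarantee is concrete enough to check by brute force on K₃ (my addition, not from the talk; assumes numpy). By Godsil-Gutman, 𝔼_s p_{A_s}(x) = x³ − 3x with largest root √3, so some signing must have λ_max ≤ √3:

```python
import numpy as np
from itertools import product

# Interlacing-family theorem: some signing s has
# lambda_max(p_{A_s}) <= lambda_max(E_s p_{A_s}).  On K_3 the expected
# polynomial is x^3 - 3x, whose largest root is sqrt(3) ~ 1.732.
edges = [(0, 1), (1, 2), (0, 2)]
n = 3
lmaxes = []
for signs in product([-1, 1], repeat=len(edges)):
    As = np.zeros((n, n))
    for (i, j), s in zip(edges, signs):
        As[i, j] = As[j, i] = s
    lmaxes.append(np.linalg.eigvalsh(As).max())

print(min(lmaxes))   # 1.0: the best signing beats sqrt(3)
```

Signings with sign product −1 give spectrum {1, 1, −2}, so λ_max = 1 ≤ √3; the other signings give λ_max = 2 > √3, which is why the minimum, not the average, matters.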
To Prove Interlacing Family

Let p_{A_s}(x) = det(xI − A_s).
Leaves of the tree = full signings (s₁, …, s_m); internal nodes = partial signings (s₁, …, s_k).
Need to find a common interlacing for the children of every internal node.

How to Prove Common Interlacing

Lemma (Fisk'08, folklore): Suppose f(x) and g(x) are monic and real-rooted. Then:
    ∃ a common interlacing of f and g
    ⟺ every convex combination αf + (1 − α)g has real roots.

So we need to prove that for every partial signing and every α ∈ [0, 1], the polynomial 𝔼 p_{A_s}(x) is real-rooted, where s₁, …, s_k are fixed, s_{k+1} is 1 with probability α and −1 with probability 1 − α, and s_{k+2}, …, s_m are uniformly ±1.
Generalization of Heilmann-Lieb: It suffices to prove that 𝔼_s det(xI − A_s) is real-rooted for every product distribution on the entries of s.
Transformation to PSD Matrices

It suffices to show real-rootedness of

    𝔼_s det(xI − dI − A_s) = 𝔼 det( xI − Σ_{ij ∈ E} v_{ij} v_{ij}^T ),

where, independently for each edge,

    v_{ij} = δᵢ − δⱼ with probability λᵢⱼ,   δᵢ + δⱼ with probability (1 − λᵢⱼ).

Why is this useful? The matrix is now a sum of independent rank-one PSD updates.
Master Real-Rootedness Theorem

Given any independent random vectors v₁, …, v_m ∈ ℝⁿ, their expected characteristic polynomial

    𝔼 det( xI − Σᵢ vᵢvᵢ^T )

has real roots.

How to prove this?
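Before seeing the proof, the theorem can be sanity-checked numerically. This sketch (my addition, not from the talk; assumes numpy, and uses arbitrary two-point distributions of my own choosing) averages the characteristic polynomial over every outcome:

```python
import numpy as np
from itertools import product

# Master theorem sanity check: for independent random vectors v_1,...,v_m,
# each uniform over a small finite support, the expected characteristic
# polynomial E det(xI - sum_i v_i v_i^T) is real-rooted.
rng = np.random.default_rng(0)
m, n, k = 3, 3, 2                     # m vectors in R^n, k outcomes each
supports = [rng.standard_normal((k, n)) for _ in range(m)]

avg = np.zeros(n + 1)
for choice in product(range(k), repeat=m):       # all k^m outcomes
    M = sum(np.outer(supports[i][c], supports[i][c])
            for i, c in enumerate(choice))
    avg += np.poly(M)                 # charpoly coefficients of this outcome
avg /= k ** m

print(np.max(np.abs(np.roots(avg).imag)))   # ~0: all roots are real
```

Note that individual outcomes need not share roots; only the average is guaranteed real-rooted.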
The Multivariate Method

A. Sokal, 90's-2005: “…it is often useful to consider the multivariate polynomial … even if one is ultimately interested in a particular one-variable specialization.”

Borcea-Brändén 2007+: prove that univariate polynomials are real-rooted by showing that they are nice transformations of real stable multivariate polynomials.
Real Stable Polynomials

Definition: p ∈ ℂ[x₁, …, x_m] is stable if every univariate restriction along a direction in the strictly positive orthant,

    f(t) ≔ p(x + t·y),   y > 0 entrywise,

is real-rooted. If p has real coefficients, it is called real stable.
A Useful Real Stable Polynomial

Borcea-Brändén '08: For PSD matrices A₁, …, A_m,

    det( Σᵢ xᵢAᵢ )

is real stable.

Proof: Every univariate restriction in a positive direction is (up to a positive constant) the characteristic polynomial of a symmetric matrix:

    det( Σᵢ xᵢAᵢ + t Σᵢ yᵢAᵢ ) ∝ det(tI + M)   for a symmetric M.
Excellent Closure Properties

Equivalent definition: p(z₁, …, z_m) is real stable if p has real coefficients and p ≠ 0 whenever Im(zᵢ) > 0 for all i. For univariate polynomials this is exactly real-rootedness.

If p is real stable, then so is:
1. p(α, z₂, …, z_m) for any α ∈ ℝ
2. (1 − ∂_{z_i}) p(z₁, …, z_m)   [Lieb-Sokal'81]
Plan: apply these closure properties to det( xI + Σᵢ zᵢAᵢ ) to show that 𝔼 det( xI − Σᵢ vᵢvᵢ^T ) is real stable, hence real-rooted.
Central Identity

Suppose v₁, …, v_m are independent random vectors with Aᵢ ≔ 𝔼 vᵢvᵢ^T. Then

    𝔼 det( xI − Σᵢ vᵢvᵢ^T ) = Πᵢ₌₁^m (1 − ∂_{z_i}) det( xI + Σᵢ zᵢAᵢ ) |_{z₁ = ⋯ = z_m = 0}

Key Principle: random rank-one updates ≡ (1 − ∂_z) operators.
Proof of Master Real-Rootedness Theorem

By the Central Identity,

    𝔼 det( xI − Σᵢ vᵢvᵢ^T ) = Πᵢ₌₁^m (1 − ∂_{z_i}) det( xI + Σᵢ zᵢAᵢ ) |_{z₁ = ⋯ = z_m = 0}.

The right-hand side starts from the real stable polynomial det( xI + Σᵢ zᵢAᵢ ) [Borcea-Brändén], applies the stability-preserving operators (1 − ∂_{z_i}) [Lieb-Sokal], and then substitutes each zᵢ = 0, a real substitution which also preserves stability. What remains is a univariate real stable polynomial in x, i.e. a real-rooted polynomial. ∎
The Whole Proof

𝔼 det( xI − Σᵢ vᵢvᵢ^T ) is real-rooted for all independent vᵢ
  ⟹ 𝔼 p_{A_s}(x) is real-rooted for all product distributions on signings
      (the rank-one structure naturally reveals interlacing)
  ⟹ { p_{A_s}(x) }_{s ∈ {±1}^m} is an interlacing family
  ⟹ there is an s such that λ_max(p_{A_s}) ≤ λ_max( 𝔼_s p_{A_s} ) = λ_max(μ_G) ≤ 2√(d−1).
Infinite Sequences of Bipartite Ramanujan Graphs

We now have an operation which doubles the size of a graph without blowing up its eigenvalues: starting from a bipartite Ramanujan graph such as K_{d,d} and repeatedly taking good 2-lifts gives an infinite family.
Main Theme

Reduced the existence of a good matrix to:
1. Proving real-rootedness of an expected polynomial. (rank-1 structure + real stability)
2. Bounding roots of the expected polynomial. (matching polynomial + combinatorics)
Beyond Complete Graphs

Unweighted sparsifiers of general graphs?

Weights are required in general: (figure: a graph G and sparsifier H in which one edge has high effective resistance; a sparsifier must keep that edge with weight O(n) while the others keep weight 1).

What if all edges are equally important?
Unweighted Decomposition Theorem

Theorem [MSS'13]: If all edges have resistance O(n/m), there is a partition of G into unweighted (1 + ε)-sparsifiers, each with O(n/ε²) edges.

Equivalently: if all edges have resistance ≤ α, there is a partition of G into unweighted O(1)-sparsifiers, each with O(mα) edges.

Key case: if all edges have resistance α, there is a partition of G into two unweighted (1 + O(√α))-approximations, each with half as many edges.
Unweighted Decomposition Theorem (vector form)

Theorem [MSS'13]: Given any vectors with Σᵢ vᵢvᵢ^T = I and ‖vᵢ‖² ≤ ε, there is a partition into two approximately ½-spherical quadratic forms:

    Σ_{i ∈ S₁} vᵢvᵢ^T ≈ I/2 ± O(√ε)   and   Σ_{i ∈ S₂} vᵢvᵢ^T ≈ I/2 ± O(√ε).

Proof: Analyze the expected characteristic polynomial of a random partition:

    𝔼 det( xI − Σ_{i ∈ S₁} vᵢvᵢ^T ) · det( xI − Σ_{i ∈ S₂} vᵢvᵢ^T ).

Other applications: the Kadison-Singer Problem; uncertainty principles.
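The existence claim can be illustrated by exhaustive search on a small isotropic set (my addition, not from the talk; assumes numpy, and this brute force is an illustration, not the proof technique):

```python
import numpy as np
from itertools import product

# Decomposition theorem illustration: for an isotropic set of small-norm
# vectors (sum_i v_i v_i^T = I), some partition into two halves has both
# quadratic forms close to I/2.
n, m = 2, 8
rng = np.random.default_rng(1)
V = rng.standard_normal((m, n))
# Transform so the rows become isotropic: sum_i v_i v_i^T = I.
W = np.linalg.cholesky(np.linalg.inv(V.T @ V))
V = V @ W

best = np.inf
for mask in product([0, 1], repeat=m):       # all 2^m partitions
    S = sum(np.outer(v, v) for v, b in zip(V, mask) if b)
    dev = np.linalg.norm(S - np.eye(n) / 2, 2)   # spectral distance to I/2
    best = min(best, dev)

print(best)   # noticeably below 0.5, the trivial (empty-side) deviation
```

The trivial partitions (everything on one side) give deviation exactly ‖I/2‖ = 0.5, so any `best` below that certifies a genuinely balanced split exists.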
Summary of Algorithms

Result                   Edges       Weights   Time
Spielman-S'08            O(n log n)  Yes       Õ(m)     (random sampling with effective resistances)
Batson-Spielman-S'09     O(n)        Yes       O(n⁴)
Marcus-Spielman-S'13     O(n)        No        O(2^m)   (via 𝔼 det(xI − Σ_{S₁} vᵢvᵢ^T) det(xI − Σ_{S₂} vᵢvᵢ^T))
Open Questions

Non-bipartite graphs.
Algorithmic construction (computing the generalized χ_G is hard).
More general uses of interlacing families.
Nearly linear time algorithm for 4n/ε²-size sparsifiers.
Improve to 2n/ε² for graphs?
Fast combinatorial algorithm for approximating resistances.