
An Algorithmic Proof of the Lopsided Lovász Local Lemma

Nick Harvey, University of British Columbia
Jan Vondrák, IBM Almaden

• Let E1, E2, …, En be events in a discrete probability space.

• Let G be a graph on {1, …, n}. Let Γ(i) be the neighbors of i, and let Γ⁺(i) = Γ(i) ∪ {i}.

• G is called a dependency graph if, for all i, Ei is jointly independent of { Ej : j ∉ Γ⁺(i) }.

• Erdős-Lovász '75: Suppose |Γ⁺(i)| ≤ d and P[Ei] ≤ p for all i. If epd ≤ 1, then P[∩i Ei^c] ≥ (1 − 1/d)^n.

Lovász Local Lemma

[Figure: a dependency graph on events E1, E2, E3, E4. E.g., pairwise independent events.]

Example: k-SAT
• Let φ be a k-SAT formula.
• Pick a uniformly random assignment to the variables.
• Let Ei be the event that clause i is unsatisfied. Note P[Ei] = 2^(−k) = p.
• Suppose that each variable appears in at most 2^k/(ek) clauses.
• Dependency graph: |Γ⁺(i)| ≤ 2^k/e = d.
• Since epd ≤ 1, the LLL implies P[∩i Ei^c] > 0, so φ is satisfiable. (A small numerical sketch follows the example below.)
• E.g.:

[Figure: the four events Ei = "clause i unsatisfied", with dependency-graph edges between clauses that share variables.]

φ = (x1∨x2∨x3) ∧ (x1∨x4∨x5) ∧ (x4∨x5∨x6) ∧ (x2∨x3∨x6)
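The arithmetic behind this example can be sketched in a few lines of Python. This is my own illustration, not part of the talk; the transcript does not preserve the signs of the literals, so they are all treated as positive here.

```python
# A minimal sketch (not from the talk) of the symmetric LLL calculation for
# k-SAT: p = 2**-k per clause, and if every clause shares a variable with at
# most d = 2**k / e clauses (itself included), then e*p*d <= 1.
import math
from itertools import combinations

def lll_applies(clauses, k):
    """Check e*p*d <= 1 for a k-SAT instance given as lists of literals
    (positive/negative ints denote variables); d = max |Γ⁺(i)|."""
    p = 2.0 ** (-k)
    n = len(clauses)
    neighbours = [1] * n                      # Γ⁺(i) includes i itself
    for i, j in combinations(range(n), 2):
        if {abs(x) for x in clauses[i]} & {abs(x) for x in clauses[j]}:
            neighbours[i] += 1
            neighbours[j] += 1
    d = max(neighbours)
    return math.e * p * d <= 1

# The 4-clause example from the slide (literal signs not shown there):
phi = [[1, 2, 3], [1, 4, 5], [4, 5, 6], [2, 3, 6]]
print(lll_applies(phi, k=3))   # d = 3 and e * 2**-3 * 3 ≈ 1.02 > 1, so False:
                               # k = 3 is too small for the symmetric bound.
```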

Stronger forms of LLL

• Symmetric LLL: P[Ei] ≤ 1/(e · maxj |Γ⁺(j)|) ∀ i.
• Asymmetric LLL: Σ_{j∈Γ⁺(i)} P[Ej] ≤ 1/4 ∀ i.
• General LLL: ∃ x1, …, xn ∈ (0,1) such that P[Ei] ≤ xi · Π_{j∈Γ(i)} (1−xj) ∀ i.
• Equivalently (substituting yi = xi/(1−xi)): ∃ y1, …, yn > 0 such that P[Ei] ≤ yi / Π_{j∈Γ⁺(i)} (1+yj) ∀ i.

Stronger forms of LLL

• General LLL, rewritten: ∃ y1, …, yn > 0 such that P[Ei] ≤ yi / Σ_{J⊆Γ⁺(i)} y^J ∀ i, where y^J = Π_{j∈J} yj.
  (This just expands Π_{j∈Γ⁺(i)} (1+yj) = Σ_{J⊆Γ⁺(i)} y^J.)

• Let Ind be the independent sets of the dependency graph.

• Cluster Expansion: ∃ y1, …, yn > 0 such that P[Ei] ≤ yi / Σ_{J⊆Γ⁺(i), J∈Ind} y^J ∀ i. (A brute-force checker is sketched below.)
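To make the Cluster Expansion condition concrete, here is a small brute-force checker I have added (not from the talk). `graph[i]` plays the role of Γ(i); the General LLL corresponds to dropping the independence restriction on J.

```python
from itertools import chain, combinations
from math import prod

def cluster_expansion_holds(p, y, graph):
    """Check P[Ei] <= yi / (sum of y^J over independent J ⊆ Γ⁺(i)) for all i.
    p[i] = P[Ei], y[i] > 0, graph[i] = Γ(i) as a set of indices."""
    def independent(J):
        return all(b not in graph[a] for a in J for b in J)
    for i in range(len(p)):
        gamma_plus = sorted(graph[i] | {i})
        subsets = chain.from_iterable(
            combinations(gamma_plus, r) for r in range(len(gamma_plus) + 1))
        denom = sum(prod(y[j] for j in J) for J in subsets if independent(J))
        if p[i] > y[i] / denom:
            return False
    return True

# E.g. two mutually dependent events with P = 1/4 and y1 = y2 = 1/2:
# the denominator is 1 + y1 = 3/2 restricted to independent J, i.e. {∅,{i},{j}}
# minus the dependent pair, giving 2; and 1/4 <= (1/2)/2, so the criterion holds.
print(cluster_expansion_holds([0.25, 0.25], [0.5, 0.5], [{1}, {0}]))  # True
```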

Stronger forms of LLL

• Let Ind be the independent sets of the dependency graph.

• Let pi = P[Ei].

• For S ⊆ V, let q_S = Σ_{I∈Ind, I⊇S} (−1)^{|I∖S|} Π_{i∈I} pi   (the "multivariate independence polynomial").

• Shearer '85: If q_I > 0 ∀ I ∈ Ind, then P[∩i Ei^c] ≥ q_∅ > 0. Moreover, for every dependency graph, this is the optimal criterion under which the LLL is true. (A brute-force computation is sketched below.)
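For tiny dependency graphs, Shearer's polynomials can be computed by brute force. This sketch is my own addition and uses the q_S definition as reconstructed above.

```python
from itertools import chain, combinations
from math import prod

def shearer(p, graph):
    """Return (criterion holds?, q_∅). p[i] = P[Ei]; graph[i] = Γ(i) as a set."""
    n = len(p)
    def independent(I):
        return all(b not in graph[a] for a in I for b in I)
    ind_sets = [frozenset(I)
                for I in chain.from_iterable(combinations(range(n), r)
                                             for r in range(n + 1))
                if independent(I)]
    def q(S):
        # q_S = sum over independent I ⊇ S of (-1)^{|I \ S|} * prod_{i in I} p_i
        return sum((-1) ** (len(I) - len(S)) * prod(p[i] for i in I)
                   for I in ind_sets if S <= I)
    return all(q(I) > 0 for I in ind_sets), q(frozenset())

# Single event with P[E1] = 0.3: Ind = {∅, {0}}, q_{0} = 0.3 and q_∅ = 0.7.
print(shearer([0.3], [set()]))   # (True, 0.7)
```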


Algorithmic Results
• The LLL gives a non-constructive proof that ∩i Ei^c ≠ ∅.
• Can we efficiently find a point in that set?
• Long history: Beck '91, Alon '91, Molloy-Reed '98, Czumaj-Scheideler '00, Srinivasan '08.
• Can find a satisfying assignment for k-SAT instances where each variable appears in ≤ 2^(k/7) clauses.

Algorithmic Results
• Efficiently finding a point in ∩i Ei^c.
• Moser-Tardos '10: Algorithm for the General LLL in the "variable model":
  – Probability space has a product distribution on variables.
  – Each event is a function of some subset of the variables.
  – Dependency graph connects events that share variables.
• Many extensions (but still in the variable model):
  – Cluster Expansion criterion [Pegden '13]
  – Shearer's criterion [Kolipaka-Szegedy '11]

Algorithmic Results
• Efficiently finding a point in ∩i Ei^c.
• Moser-Tardos '10: Algorithm for the General LLL in the "variable model" (as above).
• Many extensions in other directions:
  – Derandomization [Chandrasekaran et al. '10]
  – Exponentially many events [Haeupler, Saha, Srinivasan '10]
  – Column-sparse IPs [Harris, Srinivasan '13]
  – Distributed algorithms [Chung, Pettie, Su '14]

Extending LLL Beyond Dependency Graphs

• The definition of a dependency graph is equivalent to
  P[Ei | ∩_{j∈J} Ej^c] = P[Ei]   ∀ J ⊆ [n] ∖ Γ⁺(i)   (J = any set of non-neighbors).

• Define a lopsidependency graph to be one satisfying
  P[Ei | ∩_{j∈J} Ej^c] ≤ P[Ei]   ∀ J ⊆ [n] ∖ Γ⁺(i).

• Intuitively, edges go between events that are "negatively dependent".

• (Easy) Theorem [Erdős-Spencer '91]: All existential forms of the LLL remain true with lopsidependency graphs.

Extending LLL Beyond Dependency Graphs

• (Dependency and lopsidependency graphs defined as above; J ranges over sets of non-neighbors.)

• E.g.: Let π be a permutation on {1,2}. Let E1 be "π(1)=1" and E2 be "π(2)=2". Then P[E1 | E2^c] = 0 < 1/2 = P[E1], so E1 and E2 are "positively dependent" and the lopsidependency condition tolerates them as non-neighbors. (Verified numerically in the sketch below.)
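The {1,2} example can be checked by enumerating both permutations; this tiny verification is my own addition.

```python
# Verify P[E1 | E2^c] = 0 < 1/2 = P[E1] for a uniform permutation of {1,2}.
from itertools import permutations

perms = list(permutations([1, 2]))                 # the two permutations of {1,2}
E1 = [pi for pi in perms if pi[0] == 1]            # event "π(1) = 1"
E2c = [pi for pi in perms if pi[1] != 2]           # event "π(2) ≠ 2"
p_E1 = len(E1) / len(perms)                        # = 1/2
p_E1_given_E2c = len([pi for pi in E2c if pi[0] == 1]) / len(E2c)   # = 0
print(p_E1_given_E2c, "<", p_E1)
```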

Extending LLL Beyond Dependency Graphs

• (Definitions as above.)

• E.g. [Erdős-Spencer '91, Harris-Srinivasan '14]: Let π be a permutation on [n]. Let Ei be "π(ai) = bi". Add edges between Ei and Ej if ai = aj or bi = bj.

Lopsided Association

• Define a lopsidependency graph to be one satisfying
  P[Ei | ∩_{j∈J} Ej^c] ≤ P[Ei]   ∀ J ⊆ [n] ∖ Γ⁺(i).

• Definition: A lopsided association graph is a graph where
  P[Ei | F] ≥ P[Ei]   ∀ F ∈ Fi,
  where Fi contains all monotone functions of { Ej : j ∉ Γ⁺(i) }.

• Easy: Lopsided association ⇒ Lopsidependency.

• Analogous to "negative association" vs. "negative cylinder dependence".

Lopsided Association

• (Lopsidependency and lopsided association graphs defined as above; lopsided association ⇒ lopsidependency.)

• Main Result: For every probability space with a lopsided association graph, all existential forms of the LLL have an algorithmic proof.

Algorithmic LLL

[Chart: LLL criteria (Symmetric, General, Cluster Expansion, Shearer) versus settings (variable model, dependency graph, lopsided association graph, lopsidependency graph), marking the known algorithms: Moser-Tardos (General), Pegden (Cluster Expansion), Kolipaka-Szegedy (Shearer), all in the variable model.]

Other Distributions

[Chart: the same table with a row for permutations, covered by Harris-Srinivasan.]

Algorithmic LLL beyond variable model

[Chart: the same table extended beyond the variable model — permutations and perfect matchings in Kn (Achlioptas-Iliopoulos; * needs slack; + Hamilton cycles in Kn, ...), with spanning trees in Kn marked as a question.]

Algorithmic LLL beyond variable model: Our Results

[Chart: our results — the Symmetric, General, Cluster Expansion, and Shearer criteria for all probability spaces with a lopsided association graph or lopsidependency graph, including permutations and spanning trees in Kn. Notes: * need slack in Shearer's criterion; * intractability?]

Resampling Oracles
• Need some algorithmic access to the probability space μ.
• Def: A resampling oracle for Ei is a map ri : Ω → Ω satisfying
  – (R1) If ω ∼ μ|Ei then ri(ω) ∼ μ.   (Removes the conditioning on Ei.)
  – (R2) If j ∉ Γ⁺(i) and Ej does not occur in ω, then Ej does not occur in ri(ω).   (Cannot cause non-neighbors to occur.)
• Roughly, ri implements a coupling between μ|Ei and μ such that (R2) holds. (An interface sketch follows below.)
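As a way to fix ideas, here is how a resampling oracle might be typed in code. The interface is my own framing (the talk specifies no code), with `State` standing in for the sample space Ω.

```python
from typing import Protocol, TypeVar

State = TypeVar("State")

class ResamplingProblem(Protocol[State]):
    def sample(self) -> State:
        """Draw ω ∼ μ."""
        ...
    def occurs(self, i: int, omega: State) -> bool:
        """Does event Ei occur in ω?"""
        ...
    def resample(self, i: int, omega: State) -> State:
        """r_i(ω): given ω ∼ μ|Ei, return a state distributed as μ (R1),
        without making any event Ej with j ∉ Γ⁺(i) start to occur (R2)."""
        ...
```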

Resampling Oracles
• (Resampling oracle ri defined as above.)
• E.g., the Moser-Tardos variable model: ri(ω) resamples all the variables in clause i. (Sketched in code below.)

  φ = (x1∨x2∨x3) ∧ (x1∨x4∨x5) ∧ (x4∨x5∨x6) ∧ (x2∨x3∨x6)
  [Figure: the events Ei = "clause i unsatisfied", as before.]
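In the variable model, the resampling oracle for a clause event is just "redraw the clause's variables". A minimal sketch of that, my own code rather than the authors':

```python
import random

def resample_clause(assignment, clause):
    """r_i(ω): return a copy of the assignment with clause i's variables redrawn
    uniformly at random; variables outside the clause are untouched, so events
    on disjoint variable sets cannot start occurring."""
    new_assignment = dict(assignment)
    for lit in clause:
        new_assignment[abs(lit)] = random.random() < 0.5
    return new_assignment

def clause_unsatisfied(assignment, clause):
    """Event Ei: every literal of the clause is false under the assignment."""
    return all(assignment[abs(lit)] != (lit > 0) for lit in clause)
```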

Resampling Oracles
• (Definition as above.)
• Theorem: For any probability space and any events, the following are equivalent:
  – existence of a lopsided association graph;
  – existence of resampling oracles.
• There is no guarantee that the oracle is compactly representable or efficient. What if the events involve some NP-hard problem?

Spanning Trees in Kn

• Let T be a uniformly random spanning tree in Kn.

• For an edge set A, let EA = { A ⊆ T }.

• Lopsidependency graph: make EA a neighbor of EB unless A and B are vertex-disjoint.

• Resampling oracle rA(T):
  – If T is uniform conditioned on EA, we want rA(T) to be uniform.
  – But rA should not disturb edges of T that are vertex-disjoint from A.
  – Contract the edges of T that are vertex-disjoint from A.
  – Delete the remaining edges that are vertex-disjoint from A.
  – Let rA(T) be a uniformly random spanning tree of the resulting graph (see the sketch below).

  [Figure: A, T, and rA(T) illustrated.]
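Here is a structural sketch of rA, added by me. It is not the authors' implementation: edges are frozensets of two vertices, and `uniform_spanning_tree(nodes, multi_edges)` is a hypothetical helper (e.g. Wilson's algorithm) assumed to return a uniformly random spanning tree of the contracted multigraph as a subset of `multi_edges`.

```python
from itertools import combinations

def resample_spanning_tree(T, A, n, uniform_spanning_tree):
    """T: edge set of a spanning tree of K_n containing all edges of A."""
    verts_A = {v for e in A for v in e}
    touches_A = lambda e: bool(e & verts_A)

    # 1. Contract the edges of T vertex-disjoint from A (via union-find);
    #    these edges are the ones rA must not disturb.
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    kept = {e for e in T if not touches_A(e)}
    for u, v in (tuple(e) for e in kept):
        parent[find(u)] = find(v)

    # 2. Delete the remaining edges of K_n vertex-disjoint from A; what is left
    #    is a multigraph on the contracted components whose edges all touch A.
    multi_edges = []                       # (component_u, component_v, original edge)
    for u, v in combinations(range(n), 2):
        e = frozenset((u, v))
        if touches_A(e):
            multi_edges.append((find(u), find(v), e))

    # 3. Resample: a uniform spanning tree of the contracted multigraph, mapped
    #    back to edges of K_n and combined with the contracted edges.
    components = {find(v) for v in range(n)}
    new_tree_edges = uniform_spanning_tree(components, multi_edges)
    return kept | {e for (_, _, e) in new_tree_edges}
```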

Main Theorem

• Given:
  – any probability space and events,
  – a lopsided association graph,
  – resampling oracles for the events.

• Suppose that the General LLL, Cluster Expansion, or Shearer* criterion holds.

• Then there is an algorithm to find a point in ∩i Ei^c, running in oracle-polynomial† time, whp.

  * Need slack for Shearer's condition.
  † Actually depends on the parameters in the General LLL, etc.

Main Theorem
• Given: any probability space and events, a lopsided association graph, and resampling oracles for the events.
• General LLL: ∃ x1, …, xn ∈ (0,1) such that P[Ei] ≤ xi · Π_{j∈Γ(i)} (1−xj) ∀ i.
• There is an algorithm to find a point in ∩i Ei^c, using [bound omitted] oracle calls, whp.

Main Theorem
• Given: any probability space and events, a lopsided association graph, and resampling oracles for the events.
• Cluster Expansion: ∃ y1, …, yn > 0 such that P[Ei] ≤ yi / Σ_{J⊆Γ⁺(i), J∈Ind} y^J ∀ i.
• There is an algorithm to find a point in ∩i Ei^c, using [bound omitted] oracle calls, whp.

The Algorithm

MaximalSetResample:
  Draw ω from μ
  Repeat:
    J ← ∅
    While there is i ∉ Γ⁺(J) such that Ei occurs in ω   (Γ⁺(J) = ∪_{j∈J} Γ⁺(j)):
      Pick the smallest such i
      J ← J ∪ {i}
      ω ← ri(ω)
    End
  Until J = ∅

• Similar to the parallel form of Moser-Tardos. (A Python transcription follows below.)
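A direct Python transcription of the pseudocode above, added by me. It assumes the probability space is exposed through `sample`, `occurs`, and `resample` callables and that `gamma_plus[i]` is the inclusive neighbourhood Γ⁺(i).

```python
def maximal_set_resample(n, gamma_plus, sample, occurs, resample):
    omega = sample()                       # draw ω from μ
    while True:
        J = set()
        while True:
            # Γ⁺(J) = union of Γ⁺(j) over j in J (empty union = ∅)
            blocked = set().union(*(gamma_plus[j] for j in J))
            candidates = [i for i in range(n)
                          if i not in blocked and occurs(i, omega)]
            if not candidates:
                break
            i = min(candidates)            # pick the smallest such i
            J.add(i)
            omega = resample(i, omega)     # ω ← r_i(ω)
        if not J:                          # until J = ∅
            return omega                   # ω now lies in ∩_i Ei^c
```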

Analysis: Review of Moser-Tardos

[Figure: the "log" or "history" of resampling operations (E3, E9, E2, E5, E1, E7, E1, E3, E9 from the start), with dependency-graph edges drawn between entries, and the witness tree extracted from it.]

• MaximalSetResample (as above) produces a resampling sequence J1, J2, J3, … satisfying Ji ∈ Ind and Ji+1 ⊆ Γ⁺(Ji).

• Trivially, E[# iterations] = Σ_{(I1,I2,…)} Pr[I1, I2, … is a prefix of the resampling sequence]. (Spelled out below.)

• Claim: Pr[I1, …, Ik is a prefix] ≤ Π_{1≤t≤k} p_{It}, where p_I = Π_{i∈I} pi.

• Like Moser-Tardos, this is a coupling argument. But instead of using the fact that the variables have "fresh samples" whenever examined, we use the fact that ri returns the state to the underlying distribution μ.
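Writing the counting step out explicitly (my restatement, reading p_I as Π_{i∈I} pi, in line with the y^J notation above):

```latex
\mathbb{E}[\#\text{iterations}]
  \;=\; \sum_{k \ge 1} \sum_{(I_1,\dots,I_k)} \Pr\bigl[(I_1,\dots,I_k)\text{ is a prefix}\bigr]
  \;\le\; \sum_{k \ge 1} \sum_{(I_1,\dots,I_k)} \prod_{t=1}^{k} \prod_{i \in I_t} p_i
```

Here the sums run over candidate sequences with I_t ∈ Ind and I_{t+1} ⊆ Γ⁺(I_t).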

• Claim: Suppose P[Ei] ≤ (1−ε) · xi · Π_{j∈Γ(i)} (1−xj) (General LLL). Then Σ_{(I1,…,Ik)} Π_{1≤t≤k} p_{It} ≤ [bound omitted].

• Proof: Key idea: inductively show that Σ_{(I1,…,Ik) : I1 = J} Π_{1≤t≤k} p_{It} ≤ [bound omitted].

Comparison to Fotis' work

• Pros for Fotis:
  – Don't need to sample from μ (there is no μ).
  – Can handle scenarios that are not even a lopsidependency graph, e.g., Hamilton cycles, vertex coloring.

• Pros for us:
  – Characterize when resampling operations exist.
  – Gives a new proof of the General LLL with dependency graphs, in full generality (even for Shearer and lopsided association graphs).
  – Don't need any slack in the General LLL or Cluster Expansion conditions.
  – "Fractional transitions" seem easier in our setting.