Transcript of "Max-Cut Property Testing" by Ori Rosen, on the work of Oded Goldreich, Shafi Goldwasser, and Dana Ron (February 13, 1998).
- Slide 1
- Oded Goldreich, Shafi Goldwasser, Dana Ron. February 13, 1998.
Max-Cut Property Testing, by Ori Rosen
- Slide 2
- Table of Contents 1. Introduction 2. Definitions 3. Graph
Partitioning Algorithm 4. Max-Cut Approximation Algorithm 5.
Property Testing MC_ρ
- Slide 3
- Introduction Definition of the max-cut problem: for a graph, a
maximum cut is a cut whose size is not smaller than the size of any
other cut. Cut size definition: for a graph G, the size of a cut is
the number of edges between S, a subset of V(G), and the
complementary subset V(G) \ S.
- Slide 4
- Introduction We will see a property tester for Max-Cut. First,
we are shown an algorithm that ideally finds the max-cut by
"cheating" with additional information. This algorithm is then
modified to approximate the size of the Max-Cut without the extra
information. Last, we'll see how the Max-Cut approximation
algorithm is used to test for MC_ρ. * We will consider our graphs
to be dense (with at least εN² edges); the input graph is given in
terms of its adjacency matrix, and a query consists of checking
whether the edge (u,v) belongs to E(G).
- Slide 5
- Definitions Edge density definition: for a given partition
(V_1,V_2) of V(G), we define μ(V_1,V_2) = |E(V_1,V_2)| / N² to be
the edge density of the cut defined by (V_1,V_2), where N = |V(G)|
and E(V_1,V_2) is the set of ordered pairs (u,v) ∈ E(G) with u and
v on opposite sides. Let μ(G) denote the edge density of the
largest cut in G, meaning it is the largest μ(V_1,V_2) taken over
all partitions (V_1,V_2) of V(G).
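As a concrete sketch of these definitions (assuming the ordered-pair convention above, so each crossing edge contributes twice to |E(V_1,V_2)|), with an exhaustive μ(G) that is only feasible for tiny graphs:

```python
from itertools import combinations

def edge_density(edges, part1, n):
    """mu(V1,V2): ordered pairs crossing the cut, divided by N^2."""
    crossing = sum((u in part1) != (v in part1) for u, v in edges)
    return 2 * crossing / n ** 2          # each edge gives two ordered pairs

def mu(vertices, edges):
    """mu(G): the largest cut density, by trying every subset of V(G)."""
    n = len(vertices)
    return max(edge_density(edges, set(s), n)
               for r in range(n + 1)
               for s in combinations(vertices, r))

# A 4-vertex path 1-2-3-4: the cut {1,3} vs {2,4} contains all 3 edges,
# so mu(G) = 2*3/4**2 = 6/16.
print(mu([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)]))  # → 0.375
```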
- Slide 6
- Definitions Example: μ(G) = 6/4² = 6/16. For the second graph: μ(G) = ?
- Slide 7
- Graph Partitioning Algorithm We will now see a graph
partitioning algorithm whose running time is quadratic in N (times
a factor exponential in 1/ε): given a graph G and parameters
ε, δ > 0, the algorithm returns a cut with edge density at least
μ(G) − (3/4)ε, with probability at least 1 − δ.
- Slide 8
- Graph Partitioning Algorithm Let l = 4/ε, and let (V_1,…,V_l) be
a fixed partition of V(G) into l sets of (roughly) equal size. The
algorithm constructs a partition of V(G) into two sides in l
iterations. In the i-th iteration we construct a partition
(V_1^i,V_2^i) of V_i.
- Slide 9
- Graph Partitioning Algorithm Observation: let (H_1,H_2) be a
partition of V(G). Let v ∈ H_1, and assume that the number of
neighbors of v in H_1 is the same or more than in H_2, i.e.,
|Γ(v) ∩ H_1| ≥ |Γ(v) ∩ H_2|. Then, if we move v from H_1 to H_2,
we cannot decrease the edge density, but we might increase it:
the edges from v into H_1 are brought into the cut, while only the
edges from v into H_2 leave it.
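A minimal sketch of this observation: in the star below, vertex 0 sits on the same side as all of its neighbors, so switching its side can only grow the cut.

```python
def cut_size(adj, side):
    """Cut size of a partition; adj maps vertices to neighbour sets,
    side maps each vertex to 1 or 2."""
    return sum(1 for u in adj for v in adj[u] if u < v and side[u] != side[v])

# Star graph: center 0 with leaves 1..4, everything on side 1.
adj = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
side = {v: 1 for v in adj}
before = cut_size(adj, side)  # 0: no edge crosses

# Vertex 0 has 4 neighbours on its own side and 0 on the other,
# so moving it to the other side cannot decrease the cut.
side[0] = 2
after = cut_size(adj, side)
print(before, after)  # → 0 4
```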
- Slide 10
- Graph Partitioning Algorithm Expanding on the previous
observation, we will see what happens when we move Θ(εN) vertices
at once. In contrast to moving a single vertex, the cut may now
decrease, by O(ε²N²) per stage. The algorithm is oracle-aided and
will work in O(1/ε) stages. It will be viewed as starting from a
partition that matches a Max-Cut, and every stage moves O(εN)
vertices. The total decrease in cut size is bounded by
O((1/ε)·ε²N²) = O(εN²).
- Slide 11
- Graph Partitioning Algorithm Let X be a subset of V(G) of size
N/l (assumed to be an integer). Let W = V(G) \ X, and let
(W_1,W_2) be the partition of W induced by (H_1,H_2). ⇒ Remember:
consider (H_1,H_2) to be a Max-Cut. Assume we know, for every
vertex x ∈ X, the neighbor counts |Γ(x) ∩ W_1| and |Γ(x) ∩ W_2|.
Define X_UB (UnBalanced) to be the set of vertices that have many
more (specifically, at least (ε/8)N more) neighbors on one side of
the partition than on the other, with respect to (W_1,W_2). In
contrast, we define X_B = X \ X_UB to be the set of balanced
vertices.
- Slide 12
- [Figure: a 16-vertex example graph and its partition.] SIMPLE!
- Slide 13
- Graph Partitioning Algorithm Assume we partition X into
(X_1,X_2) in the following way: vertices in X_UB that have more
neighbors in W_1 are put in X_2, and vice versa (more neighbors in
W_2 → X_1). Vertices in X_B are placed arbitrarily in X_1 and X_2.
Next we create a new partition (H_1',H_2') = (W_1 ∪ X_1, W_2 ∪ X_2).
This partition differs from (H_1,H_2) only in the placement of the
vertices in X.
- Slide 14
- Graph Partitioning Algorithm [Figure: the 16-vertex example after repartitioning X.]
- Slide 15
- Graph Partitioning Algorithm Reminder:
(H_1',H_2') = (W_1 ∪ X_1, W_2 ∪ X_2). ⇒ The difference between
(H_1',H_2') and (H_1,H_2) lies only in the change in the number of
edges between vertices in X and vertices in W, and between pairs of
vertices in X. From the construction of X_UB, the number of edges
crossing the cut between vertices in X_UB and W cannot decrease.
- Slide 16
- Graph Partitioning Algorithm Reminder: every balanced vertex has
neighbor counts on the two sides that differ by less than (ε/8)N.
From the construction of X_B, the number of cut edges between X_B
and W can decrease by at most |X_B|·2·(ε/8)N ≤ εN²/(4l). The
number of cut edges between pairs of vertices in X can decrease by
at most |X|² = N²/l².
- Slide 17
- Graph Partitioning Algorithm μ(H_1,H_2) = 34/16²,
μ(H_1',H_2') = 32/16². [Figure: the 16-vertex example before and after.]
- Slide 18
- Graph Partitioning Algorithm Let X be V_1, let
(H_1,H_2) = (H_1^0,H_2^0) define a Max-Cut, and let the partition
resulting from the process we just saw be defined by (H_1^1,H_2^1).
Assume we continue this process iteratively: during the i-th
iteration, we process V_i, given the partition (H_1^{i-1},H_2^{i-1})
calculated in iteration i−1. The result of this process is that
μ(H_1^l,H_2^l) is smaller than μ(H_1,H_2) = μ(G) by no more than
(3/4)ε.
- Slide 19
- Graph Partitioning Algorithm [Figure: the iterative process on the 16-vertex example.] And so on…
- Slide 20
- 1. Choose l = 4/ε sets U_1,…,U_l, each of size
t = Θ(ε⁻² · log(1/(εδ))), where U_i is chosen uniformly in
V \ V_i. Let Ū = (U_1,…,U_l). 2. For each sequence of partitions
σ(Ū) = ((U_1^1,U_2^1),…,(U_1^l,U_2^l)) (where for each i,
(U_1^i,U_2^i) is a partition of U_i) do: 1. For i = 1,…,l,
partition V_i into two disjoint sets V_1^i and V_2^i as follows:
for each v ∈ V_i, 1. If |Γ(v) ∩ U_1^i| ≥ |Γ(v) ∩ U_2^i|, then put
v in V_2^i. 2. Else put v in V_1^i. 2. Let V_1(σ) = ∪_i V_1^i, and
let V_2(σ) = ∪_i V_2^i. 3. Among all partitions (V_1(σ),V_2(σ))
created in step (2), let (V_1(σ̂),V_2(σ̂)) be the one which defines
the largest cut, and output it.
- Slide 21
- Graph Partitioning Algorithm The algorithm we just saw has one
little problem: we don't know the Max-Cut to start from, meaning
(H_1^0,H_2^0). Because of this, we don't know whether the vertices
in V_1 are balanced or not. What we can do is approximate the
number of neighbors v has on each side of (W_1^0,W_2^0) by
sampling.
- Slide 22
- Graph Partitioning Algorithm We will see that if we uniformly
choose a set of vertices U_1 of size t = poly(log(1/δ)/ε) in W^0,
then with high probability over the choice of U_1 there exists a
partition (U_1^1,U_2^1) of U_1 which is representative with
respect to (W_1^0,W_2^0) and V_1: for all but a small fraction of
the vertices v in V_1, the number of neighbors v has in U_1^1,
relative to the size of U_1, is approximately the same as the
number of neighbors v has in W_1^0, relative to the size of V(G).
This approximation is good enough, since when placing the vertices
of V_1, the most important factor is the location of the
unbalanced vertices.
- Slide 23
- Graph Partitioning Algorithm If U_1 has a representative
partition, then we say that U_1 is good. How do we know which of
the 2^t partitions of U_1 is the representative one (if one
exists)? Easy: we try them all.
- Slide 24
- Graph Partitioning Algorithm [Figure: the 16-vertex example with a sampled set U_1.]
- Slide 25
- Graph Partitioning Algorithm [Figure: the 16-vertex example, continued.] And so on…
- Slide 26
- Out of all the partitions of U_1, namely the candidates
(U_1^1,U_2^1), we only need the one induced by the Max-Cut, i.e.,
the one with U_j^1 = U_1 ∩ W_j^0 for j ∈ {1,2}. Denote this
(hopefully representative) partition by (U_1^1,U_2^1). Let
(V_1^1,V_2^1) be the partition of V_1 which is determined by this
partition of U_1. Let (H_1^1,H_2^1) be the resulting partition of
V(G). ⇒ (H_1^1,H_2^1) is the same as (H_1^0,H_2^0) except for the
placement of the vertices in V_1, which is as in (V_1^1,V_2^1).
- Slide 27
- Graph Partitioning Algorithm If (U_1^1,U_2^1) is the
representative one (with respect to (W_1^0,W_2^0) and V_1), then
μ(H_1^1,H_2^1) is not much smaller than μ(H_1^0,H_2^0) = μ(G).
Continuing like this, in the i-th stage we randomly pick a set
U_i, and we determine a partition of V_i for each of its
partitions. ⇒ We are actually constructing (2^t)^l = 2^{l·t}
possible partitions of V(G), one for each sequence of partitions
of all the U_i's.
- Slide 28
- Graph Partitioning Algorithm To show that at least one of these
partitions defines a cut close to the Max-Cut, we only need to
make sure that for each i, with high probability, U_i is good with
respect to (W_1^{i-1},W_2^{i-1}), where the latter partition is
determined by the choice of U_1,…,U_{i-1} and their representative
partitions (U_1^1,U_2^1),…,(U_1^{i-1},U_2^{i-1}). We'll see a
lemma that formalizes the intuition we saw before on why the
algorithm works.
- Slide 29
- Graph Partitioning Algorithm Lemma 1: let (H_1,H_2) be a fixed
partition of V(G). Then with probability at least 1 − δ/2 over the
choice of Ū = (U_1,…,U_l), there exists a sequence of partitions
σ(Ū) such that: μ(V_1(σ),V_2(σ)) ≥ μ(H_1,H_2) − (3/4)ε. Proof
follows.
- Slide 30
- Graph Partitioning Algorithm Lemma proof: for a given sequence
of partitions σ(Ū), we consider the following l+1 hybrid
partitions. The hybrid (H_1^0,H_2^0) is simply (H_1,H_2). The i-th
hybrid partition, (H_1^i,H_2^i), has the vertices in V_{i+1},…,V_l
partitioned as in (H_1,H_2), and the vertices in V_1,…,V_i as
placed by the algorithm.
- Slide 31
- Graph Partitioning Algorithm More precisely, the hybrid
partition (H_1^i,H_2^i) is defined by
H_j^i = V_j^1 ∪ … ∪ V_j^i ∪ (H_j ∩ (V_{i+1} ∪ … ∪ V_l)) for
j ∈ {1,2}. Note that in particular (H_1^l,H_2^l) is the partition
(V_1(σ),V_2(σ)). Since the partition of each V_i is determined by
the choice of U_i and its partition, the i-th hybrid partition is
determined by the choice of U_1,…,U_i and their partitions, but
not by the choice nor the partitions of U_{i+1},…,U_l.
- Slide 32
- We shall show that for every 1 ≤ i ≤ l, for any fixed choice and
partitions of U_1,…,U_{i-1}, with probability at least 1 − δ/(2l)
over the choice of U_i, there exists a partition (U_1^i,U_2^i) of
U_i such that: μ(H_1^i,H_2^i) ≥ μ(H_1^{i-1},H_2^{i-1}) − 3ε/(4l).
- Slide 33
- Graph Partitioning Algorithm For the (i−1)-st hybrid partition
(H_1^{i-1},H_2^{i-1}), or more precisely, for the partition
(W_1^{i-1},W_2^{i-1}) it induces on W^{i-1} = V \ V_i, and a
sample set U_i, let U_j^i = U_i ∩ W_j^{i-1} for j ∈ {1,2}. We say
that U_i is good with respect to (W_1^{i-1},W_2^{i-1}) and V_i if
(U_1^i,U_2^i) is representative with respect to
(W_1^{i-1},W_2^{i-1}) and V_i.
- Slide 34
- Graph Partitioning Algorithm That is, (U_1^i,U_2^i) is such that
for all but a fraction ε/8 of the vertices v in V_i the following
holds for j ∈ {1,2}:
| (1/t)·|Γ(v) ∩ U_j^i| − (1/N)·|Γ(v) ∩ W_j^{i-1}| | ≤ ε/32. (*)
Assume that for each i, the set U_i is good with respect to
(W_1^{i-1},W_2^{i-1}) and V_i. As previously defined, we say that
a vertex v is unbalanced with respect to (W_1^{i-1},W_2^{i-1}) if
| |Γ(v) ∩ W_1^{i-1}| − |Γ(v) ∩ W_2^{i-1}| | ≥ (ε/8)·N.
- Slide 35
- Graph Partitioning Algorithm Thus, if v ∈ V_i is an unbalanced
vertex with respect to (W_1^{i-1},W_2^{i-1}) for which (*) is
satisfied, then the larger of its two sample counts identifies the
larger of its two true neighbor counts (the (ε/8)-gap exceeds the
two ε/32 sampling errors). We are then guaranteed that when the
partition (U_1^i,U_2^i) is used, v is put opposite the majority of
its neighbors in W^{i-1}. If v is balanced, it might be placed on
either side. The same is true for the (at most (ε/8)·N/l) vertices
for which (*) does not hold.
- Slide 36
- Graph Partitioning Algorithm The decrease in the size of the cut
is affected only by the change of edges between V_i and W^{i-1},
and between pairs of vertices in V_i. In particular: the number of
cut edges between vertices in W^{i-1} and unbalanced vertices in
V_i for which (*) is satisfied cannot decrease. The number of cut
edges between vertices in W^{i-1} and unbalanced vertices in V_i
for which (*) is not satisfied can decrease by at most
(ε/8)·|V_i|·2N ≤ εN²/(4l). The number of cut edges between
balanced vertices in V_i and vertices in W^{i-1} can decrease by
at most |V_i|·2·(ε/8)·N ≤ εN²/(4l). The number of cut edges
between pairs of vertices in V_i can decrease by at most
|V_i|² = N²/l² ≤ εN²/(4l) (using l = 4/ε).
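The per-stage accounting can be sanity-checked numerically: with l = 4/ε and |V_i| = N/l, each of the three loss terms equals εN²/(4l).

```python
# Per-stage cut-loss accounting, with l = 4/eps and |V_i| = N/l.
# eps is chosen as an exact binary fraction so the float arithmetic is exact.
eps, N = 0.25, 1000.0
l = 4 / eps                                 # number of chunks
Vi = N / l                                  # |V_i|
unbalanced_bad = (eps / 8) * Vi * 2 * N     # unbalanced vertices failing (*)
balanced_vs_W = Vi * 2 * (eps / 8) * N      # balanced vertices vs. W
within_Vi = Vi ** 2                         # edge pairs inside V_i
total = unbalanced_bad + balanced_vs_W + within_Vi
print(total == 3 * eps * N ** 2 / (4 * l))  # → True: loss is 3*eps*N^2/(4l)
```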
- Slide 37
- Graph Partitioning Algorithm The total decrease is thus bounded
by 3εN²/(4l). It remains to prove that with high probability a
chosen set U_i is good (with respect to (W_1^{i-1},W_2^{i-1}) and
V_i). We first fix a vertex v ∈ V_i. Let U_i = {u_1,…,u_t}
(reminder: U_i is chosen uniformly in W^{i-1} = V \ V_i). For
j ∈ {1,2} and 1 ≤ k ≤ t, define a 0/1 random variable η_j^k which
is 1 if u_k is a neighbor of v and u_k ∈ W_j^{i-1}, and is 0
otherwise.
- Slide 38
- By definition, for each j, the sum of the η_j^k's is simply the
number of neighbors v has in U_j^i (= U_i ∩ W_j^{i-1}), and the
probability that η_j^k = 1 is (1/N)·|Γ(v) ∩ W_j^{i-1}|. By an
additive Chernoff bound, and our choice of t, for each j ∈ {1,2}:
Pr[ | (1/t)·Σ_k η_j^k − (1/N)·|Γ(v) ∩ W_j^{i-1}| | > ε/32 ]
≤ 2·exp(−2·(ε/32)²·t) ≤ εδ/(32l).
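The quantity the Chernoff step controls, the sampled neighbor fraction versus the true one, can be illustrated directly (toy numbers of my choosing; sampling with replacement, as in the analysis):

```python
import random

def sampled_neighbor_fraction(neighbors, pool, t, rng):
    """(1/t) * |Gamma(v) ∩ sample|: the empirical estimate of
    |Gamma(v) ∩ W_j| / |pool| from t uniform draws."""
    sample = rng.choices(pool, k=t)
    return sum(u in neighbors for u in sample) / t

rng = random.Random(1)
pool = list(range(1000))          # stand-in for one side of W^{i-1}
neighbors = set(range(300))       # v has 300 neighbours there: true fraction 0.3
est = sampled_neighbor_fraction(neighbors, pool, t=2000, rng=rng)
print(round(est, 2))              # concentrated around 0.3 for t this large
```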
- Slide 39
- Graph Partitioning Algorithm By Markov's inequality, for each
j ∈ {1,2}, with probability at least 1 − δ/(4l) over the choice of
U_i, equation (*) holds (for that j) for all but an ε/8 fraction
of the vertices in V_i; thus with probability at least 1 − δ/(2l),
U_i is good as required. Applying Lemma 1 to a Max-Cut of G, we
get: with probability at least 1 − δ/2 over the choice of Ū we
have μ(V_1(σ̂),V_2(σ̂)) ≥ μ(G) − (3/4)ε, where (V_1(σ̂),V_2(σ̂)) is
as defined in step 3 of the algorithm.
- Slide 40
- Max-Cut Approx Algorithm Armed with the GPA, the Max-Cut
approximation algorithm is quite straightforward. We uniformly
choose a set S of vertices of size m = Θ((l·t + log(1/δ))/ε²), and
run the GPA restricted to this set. Instead of returning the
largest cut, the algorithm views S = {s_1,…,s_m} as a multiset of
m/2 ordered pairs {(s_1,s_2),…,(s_{m-1},s_m)}, and returns the
(density of the) cut that maximizes the number of such pairs that
are edges crossing the cut. The pairing is done for technical
reasons: it yields m/2 independent pair samples.
- Slide 41
- Max-Cut Approx algorithm 1. As in step 1 of the GPA. 2.
Uniformly choose a set S = {s_1,…,s_m} of size
m = Θ((l·t + log(1/δ))/ε²). For 1 ≤ i ≤ l, let S_i = V_i ∩ S. 3.
Similarly to step 2 of the GPA, for each of the sequences of
partitions σ(Ū), partition each S_i into two disjoint sets S_1^i
and S_2^i, and let S_j(σ) = ∪_i S_j^i (for j = 1,2). 4. For each
partition (S_1(σ),S_2(σ)), compute the fraction of cut edges
between the pairs of vertices (s_{2k-1},s_{2k}). More precisely,
define μ̂(S_1(σ),S_2(σ)) = (2/m)·|{k : (s_{2k-1},s_{2k}) ∈ E(G),
with s_{2k-1} and s_{2k} on opposite sides}|. Let (S_1(σ̂),S_2(σ̂))
be a partition for which this fraction is maximized, and output
μ̂(S_1(σ̂),S_2(σ̂)).
- Slide 42
- Max-Cut Approx Algorithm Lemma 2: for any fixed Ū, with
probability at least 1 − δ/2 over the choice of S,
μ̂(S_1(σ̂),S_2(σ̂)) ≥ μ(V_1(σ̃),V_2(σ̃)) − ε/4, where
(S_1(σ̂),S_2(σ̂)) is as defined in step 4 of the Max-Cut
approximation algorithm and (V_1(σ̃),V_2(σ̃)) is the best partition
found by the GPA. Proof follows.
- Slide 43
- Max-Cut Approx Algorithm Lemma 2 proof: consider first a
particular sequence of partitions σ(Ū). The key observation is
that for every s ∈ S, and for j ∈ {1,2}, s ∈ S_j(σ) ⟺ s ∈ V_j(σ).
Thus for each sequence of partitions σ we are effectively sampling
from (V_1(σ),V_2(σ)). Furthermore, by viewing S as consisting of
m/2 pairs of vertices (s_{2k-1},s_{2k}), and counting the number
of pairs which are on opposite sides of the partition and have an
edge in between, we are able to approximate the density of the cut
edges.
- Slide 44
- Max-Cut Approx Algorithm For 1 ≤ k ≤ m/2, let χ_k be a 0/1
random variable which is 1 if (s_{2k-1},s_{2k}) ∈ E(G) and, for
j ≠ j', s_{2k-1} ∈ S_j(σ) and s_{2k} ∈ S_{j'}(σ). Then, by
definition, μ̂(S_1(σ),S_2(σ)) = (2/m)·Σ_{k=1}^{m/2} χ_k, and the
probability that χ_k = 1 is μ(V_1(σ),V_2(σ)). Hence, by an
additive Chernoff bound and our choice of m:
Pr[ |μ̂(S_1(σ),S_2(σ)) − μ(V_1(σ),V_2(σ))| > ε/8 ]
≤ 2·exp(−2·(ε/8)²·(m/2)) ≤ (δ/2)·2^{−l·t}.
- Slide 45
- Max-Cut Approx Algorithm Since there are 2^{l·t} sequences of
partitions of Ū, with probability at least 1 − δ/2, for every
sequence of partitions σ:
μ̂(S_1(σ),S_2(σ)) = μ(V_1(σ),V_2(σ)) ± ε/8, and hence
μ̂(S_1(σ̂),S_2(σ̂)) ≥ μ(V_1(σ̃),V_2(σ̃)) − ε/4.
- Slide 46
- Property Testing MC_ρ Armed with the Max-Cut approximation
algorithm, we have in hand a property tester for the class MC_ρ,
the graphs with cut density at least ρ: MC_ρ = {G : μ(G) ≥ ρ}. For
example, for ρ = 1/4, MC_{1/4} is the class of graphs that contain
a cut of density at least 1/4. Note that for ε > ρ, testing for a
ρ-cut is trivial. (Why?) For all constants ρ, ε > 0, there exists
a property testing algorithm for MC_ρ.
- Slide 47
- Property Testing MC_ρ - Proof Let ε' = ε²/16 and let δ = 1/3.
The testing algorithm runs the Max-Cut approximation algorithm
shown earlier, with ε' and δ as input. Graph G is accepted if and
only if the returned estimate is at least ρ − ε'. If μ(G) ≥ ρ,
then by the Max-Cut approximation algorithm, G is accepted with
probability ≥ 1 − δ = 2/3. If G is accepted with probability
> 1/3, then μ(G) ≥ ρ − 2ε'. This implies that G is ε-close to some
G' in MC_ρ.
- Slide 48
- Property Testing MC_ρ - Proof Let (V_1,V_2) be a partition of
V(G) such that μ(V_1,V_2) ≥ ρ − 2ε'. ⇒ 2|V_1|·|V_2| ≥ (ρ − 2ε')·N².
If 2|V_1|·|V_2| ≥ ρN²: to obtain G' we simply add edges between
vertices in V_1 and vertices in V_2 until μ(V_1,V_2) = ρ. (Why can
we do this?) In this case, dist(G,G') ≤ 2ε' < ε.
- Slide 49
- Property Testing MC_ρ - Proof Else, meaning 2|V_1|·|V_2| < ρN²:
we cannot obtain G' by simply adding more edges (why?). Instead,
we will move vertices from the larger set (assume it is V_1) to
the smaller set (assume it is V_2). Then we will have enough room
for the extra edges.
- Slide 50
- Property Testing MC_ρ - Proof Assume |V_1| ≥ |V_2|. Moving at
most √ε'·N = (ε/4)·N vertices from V_1 to V_2 yields sets V_1',
V_2' with 2|V_1'|·|V_2'| ≥ ρN², while removing at most
2√ε'·N² = (ε/2)·N² ordered pairs from the cut. ⇒ And now we can
proceed adding edges between V_1' and V_2' until we reach the cut
density required; the total distance is at most 2ε' + 2√ε' ≤ ε.
- Slide 51
- The End.