Princeton University • COS 423 • Theory of Algorithms • Spring 2001 • Kevin Wayne
Linear Programming
Some of these slides are adapted from The Allocation of Resources by Linear Programming by Bob Bland in Scientific American.
Linear Programming

Significance. Quintessential tool for optimal allocation of scarce resources among a number of competing activities. Powerful model that generalizes many classic problems:
– shortest path, max flow, multicommodity flow, MST, matching, 2-person zero-sum games

Ranked among the most important scientific advances of the 20th century.
– accounts for a major proportion of all scientific computation

Helps find "good" solutions to NP-hard optimization problems.
– optimal solutions (branch-and-cut)
– provably good solutions (randomized rounding)
Brewery Problem

A small brewery produces ale and beer. Production is limited by scarce resources: corn, hops, barley malt. Recipes for ale and beer require different proportions of resources.

Beverage    Corn (pounds)   Hops (ounces)   Malt (pounds)   Profit ($)
Ale              5               4              35              13
Beer            15               4              20              23
Quantity       480             160            1190

How can the brewer maximize profits?
– Devote all resources to beer: 32 barrels of beer ⇒ $736.
– Devote all resources to ale: 34 barrels of ale ⇒ $442.
– 7½ barrels of ale, 29½ barrels of beer ⇒ $776.
– 12 barrels of ale, 28 barrels of beer ⇒ $800.
Brewery Problem

(P)  max  13A + 23B
     s.t.   5A + 15B ≤  480   (corn)
            4A +  4B ≤  160   (hops)
           35A + 20B ≤ 1190   (malt)
            A, B ≥ 0
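Since an optimal solution lies at an extreme point (made precise a few slides below), this tiny two-variable LP can be solved by brute force: intersect every pair of constraint boundaries (including the axes A = 0 and B = 0), keep the feasible intersection points, and take the most profitable one. This is a sketch for this 2-variable instance only, not a general LP solver.

```python
from itertools import combinations

# Constraints a*A + b*B <= c, including the nonnegativity bounds.
cons = [
    (5, 15, 480),    # corn
    (4, 4, 160),     # hops
    (35, 20, 1190),  # malt
    (-1, 0, 0),      # A >= 0
    (0, -1, 0),      # B >= 0
]

def vertices(cons):
    """Intersect each pair of constraint boundaries; keep feasible points."""
    pts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue  # parallel boundaries never meet
        A = (c1 * b2 - c2 * b1) / det   # Cramer's rule
        B = (a1 * c2 - a2 * c1) / det
        if all(a * A + b * B <= c + 1e-9 for a, b, c in cons):
            pts.append((A, B))
    return pts

best = max(vertices(cons), key=lambda p: 13 * p[0] + 23 * p[1])
profit = 13 * best[0] + 23 * best[1]
print(best, profit)   # optimum: A = 12, B = 28, profit $800
```

The feasible vertices it finds are exactly the extreme points listed on the next slide, and the best one reproduces the $800 answer above.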
Brewery Problem: Feasible Region

The feasible region is the intersection of the half-planes:
– Corn: 5A + 15B ≤ 480
– Hops: 4A + 4B ≤ 160
– Malt: 35A + 20B ≤ 1190
– A ≥ 0, B ≥ 0

Its extreme points (Ale, Beer) are (0, 0), (34, 0), (26, 14), (12, 28), and (0, 32).
Brewery Problem: Objective Function

The profit lines 13A + 23B = 442, 800, 1600 are parallel. Sliding the line in the direction of increasing profit, the last point of the feasible region it touches is the extreme point (12, 28), where 13A + 23B = 800.
Brewery Problem: Geometry

Brewery problem observation. Regardless of the coefficients of the linear objective function, there exists an optimal solution that is an extreme point.
Linear Programming

LP "standard" form. Input data: rational numbers cj, bi, aij. Maximize a linear objective function subject to linear inequalities.

(P)  max  Σj cj xj
     s.t. Σj aij xj ≤ bi    for i = 1, ..., m
          xj ≥ 0            for j = 1, ..., n

Equivalently, in matrix form:

(P)  max  cᵀx
     s.t. Ax ≤ b
          x ≥ 0
LP: Geometry

Geometry. The feasible region of (P) forms an n-dimensional polyhedron.
– Convex: if y and z are feasible solutions, then so is ½y + ½z.
– Extreme point: a feasible solution x that can't be written as ½y + ½z for any two distinct feasible solutions y and z.

[Figure: a convex set vs. a non-convex set; extreme points marked.]
LP: Geometry

Extreme Point Theorem. If there exists an optimal solution to standard form LP (P), then there exists one that is an extreme point.
– Only need to consider finitely many possible solutions.

Greed. Local optima are global optima.
LP: Algorithms

Simplex. (Dantzig 1947) Developed shortly after WWII in response to logistical problems; used for the 1948 Berlin airlift. A practical solution method that moves from one extreme point to a neighboring extreme point. Finite (exponential) complexity, but no polynomial implementation known.
LP: Polynomial Algorithms

Ellipsoid. (Khachiyan 1979, 1980) Solvable in polynomial time: O(n^4 L) bit operations.
– n = # variables
– L = # bits in input
Theoretical tour de force; not remotely practical.

Karmarkar's algorithm. (Karmarkar 1984) O(n^3.5 L). Polynomial, and reasonably efficient implementations are possible.

Interior point algorithms. O(n^3 L). Competitive with simplex!
– will likely dominate on large problems soon
Extends to even more general problems.
LP Duality

Primal problem.

(P)  max  4x1 + x2 + 5x3 + 3x4
     s.t.  x1 −  x2 −  x3 + 3x4 ≤  1
          5x1 +  x2 + 3x3 + 8x4 ≤ 55
          −x1 + 2x2 + 3x3 − 5x4 ≤  3
           x1, x2, x3, x4 ≥ 0

Find a lower bound on the optimal value z*.
– (x1, x2, x3, x4) = (0, 0, 1, 0) ⇒ z* ≥ 5.
– (x1, x2, x3, x4) = (2, 1, 1, 1/3) ⇒ z* ≥ 15.
– (x1, x2, x3, x4) = (3, 0, 2, 0) ⇒ z* ≥ 22.
– (x1, x2, x3, x4) = (0, 14, 0, 5) ⇒ z* ≥ 29.
LP Duality

Find an upper bound on the optimal value.
– Multiply the 2nd inequality by 2: 10x1 + 2x2 + 6x3 + 16x4 ≤ 110.
  ⇒ z* = 4x1 + x2 + 5x3 + 3x4 ≤ 10x1 + 2x2 + 6x3 + 16x4 ≤ 110.
– Add the 2nd and 3rd inequalities: 4x1 + 3x2 + 6x3 + 3x4 ≤ 58.
  ⇒ z* = 4x1 + x2 + 5x3 + 3x4 ≤ 4x1 + 3x2 + 6x3 + 3x4 ≤ 58.
LP Duality

Find an upper bound on the optimal value.
– Add 11 times the 1st inequality to 6 times the 3rd inequality:
  ⇒ z* = 4x1 + x2 + 5x3 + 3x4 ≤ 5x1 + x2 + 7x3 + 3x4 ≤ 29.

Recall. (x1, x2, x3, x4) = (0, 14, 0, 5) ⇒ z* ≥ 29. So z* = 29.
LP Duality

General idea: add a nonnegative linear combination (y1, y2, y3) of the primal constraints:

(y1 + 5y2 − y3) x1 + (−y1 + y2 + 2y3) x2 + (−y1 + 3y2 + 3y3) x3 + (3y1 + 8y2 − 5y3) x4 ≤ y1 + 55y2 + 3y3.

This upper-bounds z* whenever each coefficient on the left dominates the corresponding objective coefficient. Finding the best such bound is itself an LP, the dual problem.

(D)  min   y1 + 55y2 + 3y3
     s.t.   y1 + 5y2 −  y3 ≥ 4
           −y1 +  y2 + 2y3 ≥ 1
           −y1 + 3y2 + 3y3 ≥ 5
           3y1 + 8y2 − 5y3 ≥ 3
            y1, y2, y3 ≥ 0
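The bounds above pin down z* = 29: the primal point (0, 14, 0, 5) and the dual point (11, 0, 6) (the multipliers from the "11 times the 1st plus 6 times the 3rd" combination) are both feasible and have equal objective value, so by weak duality both are optimal. A quick check in plain Python:

```python
# Primal: max 4x1 + x2 + 5x3 + 3x4 subject to three <= constraints, x >= 0.
A = [[1, -1, -1, 3],
     [5, 1, 3, 8],
     [-1, 2, 3, -5]]
b = [1, 55, 3]
c = [4, 1, 5, 3]

x = [0, 14, 0, 5]    # candidate primal solution
y = [11, 0, 6]       # candidate dual solution (combination multipliers)

dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

# Primal feasibility: Ax <= b, x >= 0.
assert all(dot(row, x) <= bi for row, bi in zip(A, b)) and min(x) >= 0
# Dual feasibility: y^T A >= c (columnwise), y >= 0.
cols = list(zip(*A))
assert all(dot(y, col) >= cj for col, cj in zip(cols, c)) and min(y) >= 0
# Equal objectives => both optimal, by weak duality.
assert dot(c, x) == dot(b, y) == 29
```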
LP Duality

Primal and dual linear programs: given rational numbers aij, bi, cj, find values xj and yi that optimize (P) and (D).

(P)  max  Σj cj xj
     s.t. Σj aij xj ≤ bi    for i = 1, ..., m
          xj ≥ 0            for j = 1, ..., n

(D)  min  Σi bi yi
     s.t. Σi aij yi ≥ cj    for j = 1, ..., n
          yi ≥ 0            for i = 1, ..., m

Duality Theorem (Gale-Kuhn-Tucker 1951, Dantzig-von Neumann 1947). If (P) and (D) are nonempty, then max = min.

A dual solution provides a certificate of optimality ⇒ the decision version is in NP ∩ co-NP.
Special case: the max-flow min-cut theorem. Also the basis of sensitivity analysis.
LP Duality: Economic Interpretation

Brewer's problem: find the optimal mix of beer and ale to maximize profits.

(P)  max  13A + 23B
     s.t.   5A + 15B ≤  480
            4A +  4B ≤  160
           35A + 20B ≤ 1190
            A, B ≥ 0

A* = 12, B* = 28, OPT = 800.

Entrepreneur's problem: buy the individual resources from the brewer at minimum cost. C, H, M = unit prices for corn, hops, malt. The brewer won't agree to sell resources if 5C + 4H + 35M < 13 (the profit from a barrel of ale) or if 15C + 4H + 20M < 23 (a barrel of beer).

(D)  min  480C + 160H + 1190M
     s.t.   5C + 4H + 35M ≥ 13
           15C + 4H + 20M ≥ 23
            C, H, M ≥ 0

C* = 1, H* = 2, M* = 0, OPT = 800.
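The claimed prices fit a pattern worth checking by hand (or by machine): the corn and hops constraints are tight at (12, 28) while malt is slack, so malt gets price 0 (a property known as complementary slackness, not named on the slide). A sketch:

```python
# Brewery primal optimum and the claimed resource prices.
A_star, B_star = 12, 28
C, H, M = 1, 2, 0

use = {
    "corn": (5 * A_star + 15 * B_star, 480),
    "hops": (4 * A_star + 4 * B_star, 160),
    "malt": (35 * A_star + 20 * B_star, 1190),
}
# Scarce resources are used up exactly; the slack one gets price 0.
assert use["corn"] == (480, 480)
assert use["hops"] == (160, 160)
assert use["malt"][0] < 1190 and M == 0   # complementary slackness

# The price vector covers the profit of each recipe (dual feasibility)...
assert 5 * C + 4 * H + 35 * M >= 13
assert 15 * C + 4 * H + 20 * M >= 23
# ...and total resource value equals total profit (optimality).
assert 480 * C + 160 * H + 1190 * M == 13 * A_star + 23 * B_star == 800
```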
LP Duality: Economic Interpretation

Sensitivity analysis. How much should the brewer be willing to pay (marginal price) for additional supplies of scarce resources? Corn $1, hops $2, malt $0.

Suppose a new product, "light beer," is proposed. It requires 2 corn, 5 hops, 24 malt. How much profit must be obtained from light beer to justify diverting resources from the production of beer and ale?
At least 2 ($1) + 5 ($2) + 24 ($0) = $12 / barrel.
Standard Form

Standard form.

(P)  max  Σj cj xj
     s.t. Σj aij xj ≤ bi    for i = 1, ..., m
          xj ≥ 0            for j = 1, ..., n

Easy to handle variants.
– ≥ constraint: x + 2y − 3z ≥ 17 ⇔ −x − 2y + 3z ≤ −17.
– equality: x + 2y − 3z = 17 ⇔ x + 2y − 3z ≤ 17, −x − 2y + 3z ≤ −17.
– minimization: min x + 2y − 3z ⇔ max −x − 2y + 3z.
– x unrestricted: x = y − z, y ≥ 0, z ≥ 0.
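The variant rules are mechanical. As a sketch (the little problem and its solution are my own, not from the slides), here is a minimization with a ≥ constraint, an equality, and a free variable, transformed by hand, plus a check that a candidate solution transfers between the two forms:

```python
# Original: min x + 2y  s.t.  x + y >= 3,  x - y = 1,  x >= 0, y free.
# Standard form over (x, yp, ym) >= 0 with y = yp - ym:
#   max -x - 2*yp + 2*ym
#   s.t. -x - yp + ym <= -3       (from x + y >= 3, negated)
#         x - yp + ym <=  1       (from x - y = 1, first direction)
#        -x + yp - ym <= -1       (second direction)

def feasible_std(x, yp, ym):
    return (min(x, yp, ym) >= 0
            and -x - yp + ym <= -3
            and x - yp + ym <= 1
            and -x + yp - ym <= -1)

# The original optimum is (x, y) = (2, 1) with objective 4:
# x = y + 1 and x + y >= 3 force y >= 1, and 3y + 1 is minimized at y = 1.
x, y = 2, 1
yp, ym = max(y, 0), max(-y, 0)          # split the free variable
assert feasible_std(x, yp, ym)
# The max-form objective is the negated min-form objective:
assert -(x + 2 * y) == -x - 2 * yp + 2 * ym == -4
```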
LP Application: Weighted Bipartite Matching

Assignment problem. Given a complete bipartite network Kn,n and edge weights cij, find a perfect matching of minimum weight.

(LP)  min  Σi Σj cij xij
      s.t. Σj xij = 1    for i = 1, ..., n
           Σi xij = 1    for j = 1, ..., n
           xij ≥ 0       for i, j = 1, ..., n

Birkhoff-von Neumann theorem (1946, 1953). All extreme points of the above polyhedron are {0, 1}-valued.

Corollary. Can solve the assignment problem using LP techniques, since LP algorithms return an optimal solution that is an extreme point.

Remark. Polynomial combinatorial algorithms also exist.
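By the corollary, the LP optimum equals the minimum over the n! permutation matrices, so for a tiny instance the LP value can be found by direct enumeration (a sketch with a made-up cost matrix; real instances would use an LP solver or a combinatorial algorithm):

```python
from itertools import permutations

# Hypothetical 3x3 cost matrix for K_{3,3}.
c = [[4, 1, 3],
     [2, 0, 5],
     [3, 2, 2]]
n = len(c)

# Minimum-weight perfect matching = min over permutation matrices,
# which by Birkhoff-von Neumann equals the LP optimum.
best = min(permutations(range(n)),
           key=lambda p: sum(c[i][p[i]] for i in range(n)))
weight = sum(c[i][best[i]] for i in range(n))
print(best, weight)   # (1, 0, 2) with weight 1 + 2 + 2 = 5
```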
LP Application: Weighted Bipartite Matching

Birkhoff-von Neumann theorem (1946, 1953). All extreme points of the above polytope are {0, 1}-valued.

Proof (by contradiction). Suppose x is a fractional feasible solution. Consider A = { (i, j) : 0 < xij }.

Claim: there exists a perfect matching y in (V, A).
– a fractional flow gives a fractional perfect matching
– apply the integrality theorem for max flow

Define ε = min { xij : xij > 0 }, and let

x1 = (1 − ε) x + ε y,
x2 = (1 + ε) x − ε y.

Both x1 and x2 are feasible, and x = ½ x1 + ½ x2 ⇒ x is not an extreme point. ▪
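The proof is constructive and easy to run on a small example. The sketch below finds a perfect matching in the support of a fractional doubly stochastic matrix with a standard augmenting-path routine (a stand-in for the max-flow argument on the slide), then forms x1 and x2 exactly as in the proof:

```python
def perfect_matching(support, n):
    """Kuhn's augmenting-path algorithm on the bipartite support graph."""
    match = [-1] * n                      # match[j] = row matched to column j

    def try_row(i, seen):
        for j in range(n):
            if support[i][j] and j not in seen:
                seen.add(j)
                if match[j] == -1 or try_row(match[j], seen):
                    match[j] = i
                    return True
        return False

    for i in range(n):
        assert try_row(i, set())          # support of a feasible x has one
    return match

# A fractional feasible solution (doubly stochastic matrix).
x = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.5, 0.5]]
n = 3
match = perfect_matching([[v > 0 for v in row] for row in x], n)
y = [[1.0 if match[j] == i else 0.0 for j in range(n)] for i in range(n)]

eps = min(v for row in x for v in row if v > 0)
x1 = [[(1 - eps) * x[i][j] + eps * y[i][j] for j in range(n)] for i in range(n)]
x2 = [[(1 + eps) * x[i][j] - eps * y[i][j] for j in range(n)] for i in range(n)]

# x is the midpoint of two distinct feasible solutions => not extreme.
for i in range(n):
    for j in range(n):
        assert abs(0.5 * (x1[i][j] + x2[i][j]) - x[i][j]) < 1e-12
        assert x1[i][j] >= 0 and x2[i][j] >= 0
```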
LP Application: Multicommodity Flow

Multicommodity flow. Given a network G = (V, E) with edge capacities u(e) ≥ 0, edge costs c(e) ≥ 0, a set of commodities K, and supply / demand dk(v) for commodity k at node v, find a minimum cost flow that satisfies all of the demand.

Applications. Transportation networks. Communication networks (Akamai). Solving Ax = b with Gaussian elimination, preserving sparsity. VLSI design.

(LP)  min  Σk Σe c(e) xk(e)
      s.t. Σ_{e out of v} xk(e) − Σ_{e in to v} xk(e) = dk(v)   for all v ∈ V, k ∈ K
           Σk xk(e) ≤ u(e)                                      for all e ∈ E
           xk(e) ≥ 0                                            for all e ∈ E, k ∈ K
Weighted Vertex Cover

Weighted vertex cover. Given an undirected graph G = (V, E) with vertex weights wv ≥ 0, find a minimum weight subset of nodes S such that every edge is incident to at least one vertex in S. NP-hard even if all weights are 1.

Integer programming formulation.

(ILP)  min  Σv wv xv
       s.t. xv + xw ≥ 1    for (v, w) ∈ E
            xv ∈ {0, 1}    for v ∈ V

If x* is an optimal solution to (ILP), then S = {v ∈ V : x*v = 1} is a min weight vertex cover.

[Figure: example graph with vertex weights; a vertex cover of total weight 55 is highlighted.]
Integer Programming

INTEGER-PROGRAMMING. Given rational numbers aij, bi, cj, find integers xj that solve:

(IP)  min  Σj cj xj
      s.t. Σj aij xj ≥ bi         for i = 1, ..., m
           xj ≥ 0, xj integral    for j = 1, ..., n

Claim. INTEGER-PROGRAMMING is NP-hard.
Proof. VERTEX-COVER ≤P INTEGER-PROGRAMMING:

min  Σv xv
s.t. xv + xw ≥ 1            for (v, w) ∈ E
     xv ≥ 0, xv integral    for v ∈ V
Weighted Vertex Cover

Linear programming relaxation.

(LP)  min  Σv wv xv
      s.t. xv + xw ≥ 1    for (v, w) ∈ E
           xv ≥ 0         for v ∈ V

Note: the optimal value of (LP) is ≤ the optimal value of (ILP), and may be strictly less.
– a clique on n nodes requires n − 1 nodes in a vertex cover
– the LP solution x*v = ½ for all v has value n / 2

Provides a lower bound for an approximation algorithm. How can solving the LP help us find a good vertex cover? Round the fractional values.
Weighted Vertex Cover

Theorem. If x* is an optimal solution to (LP), then S = {v ∈ V : x*v ≥ ½} is a vertex cover within a factor of 2 of the best possible.
– Provides a 2-approximation algorithm: solve the LP, and then round.

S is a vertex cover. Consider an edge (v, w) ∈ E. Since x*v + x*w ≥ 1, either x*v ≥ ½ or x*w ≥ ½ ⇒ (v, w) is covered.

S has small cost. Let S* be the optimal vertex cover. Then

Σ_{v ∈ S*} wv ≥ Σ_{v ∈ V} wv x*v ≥ ½ Σ_{v ∈ S} wv

(the first inequality because the LP is a relaxation; the second because x*v ≥ ½ for v ∈ S).
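On a toy instance the whole algorithm can be run without an LP solver by exploiting a known fact not stated on the slide: this particular LP always has a half-integral optimal solution, so enumerating x ∈ {0, ½, 1}^V finds the true LP optimum. A brute-force sketch (the graph and weights are made up), not how one would solve it in practice:

```python
from itertools import product

# Toy instance: 4-cycle plus a chord, with vertex weights.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
w = [3, 2, 4, 1]
n = len(w)

def cost(x):
    return sum(wi * xi for wi, xi in zip(w, x))

def lp_feasible(x):
    return all(x[u] + x[v] >= 1 for u, v in edges)

# The vertex-cover LP has a half-integral optimum (known fact), so
# searching {0, 1/2, 1}^n finds the true LP value on this instance.
x_star = min((x for x in product((0, 0.5, 1), repeat=n) if lp_feasible(x)),
             key=cost)
S = [v for v in range(n) if x_star[v] >= 0.5]

# The rounded set is a cover, within a factor 2 of the LP lower bound.
assert all(u in S or v in S for u, v in edges)
assert sum(w[v] for v in S) <= 2 * cost(x_star)
```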
Weighted Vertex Cover

Good news. The 2-approximation algorithm is the basis for most practical heuristics.
– can solve the LP with min cut ⇒ faster
– primal-dual schema ⇒ linear time 2-approximation
PTAS for planar graphs. Solvable on bipartite graphs using network flow.

Bad news. NP-hard even on 3-regular planar graphs with unit weights. If P ≠ NP, then there is no ρ-approximation for ρ < 4/3, even with unit weights. (Dinur-Safra, 2001)
Maximum Satisfiability

MAX-SAT. Given clauses C1, C2, ..., Cm in CNF over Boolean variables x1, x2, ..., xn, and integer weights wj ≥ 0 for each clause, find a truth assignment for the xi that maximizes the total weight of satisfied clauses. NP-hard even if all weights are 1.

Ex. Five weighted clauses C1, ..., C5 over x1, x2, x3; the assignment x1 = 0, x2 = 0, x3 = 1 achieves weight = 14.
Maximum Satisfiability: Johnson's Algorithm

Randomized polynomial time (RP). Polynomial time, extended to allow calls to random() at unit cost.
– polynomial algorithm A with one-sided error:
  – if x is a YES instance: Pr[A(x) = YES] ≥ ½
  – if x is a NO instance: Pr[A(x) = YES] = 0
Fundamental open question: does P = RP?

Johnson's algorithm. Flip a coin for each variable, and set it true with probability ½, independently for each variable.

Theorem. The "dumb" algorithm is a 2-approximation for MAX-SAT.
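Johnson's algorithm is easy to derandomize by the method of conditional expectations, a standard technique not covered on these slides: fix the variables one at a time, always choosing the value that keeps the conditional expected weight from falling. Since the initial expectation is at least half the total weight (the calculation on the next slide), the deterministic assignment found this way is also a 2-approximation. A sketch, with clauses written as lists of signed variable indices and a made-up instance:

```python
def expected_weight(clauses, assign):
    """E[weight] when unset variables are set true/false with prob 1/2."""
    total = 0.0
    for w, lits in clauses:
        satisfied, unset = False, 0
        for lit in lits:                 # lit = +v means x_v, -v means not x_v
            v = abs(lit)
            if v in assign:
                satisfied |= assign[v] == (lit > 0)
            else:
                unset += 1
        total += w if satisfied else w * (1 - 0.5 ** unset)
    return total

def johnson_derandomized(clauses, nvars):
    assign = {}
    for v in range(1, nvars + 1):
        # Keep whichever setting has the larger conditional expectation.
        if expected_weight(clauses, {**assign, v: True}) >= \
           expected_weight(clauses, {**assign, v: False}):
            assign[v] = True
        else:
            assign[v] = False
    return assign

# Hypothetical instance: (weight, clause) pairs over x1, x2, x3.
clauses = [(3, [1, 2]), (2, [-1, 3]), (5, [-2, -3]), (4, [1, -3])]
assign = johnson_derandomized(clauses, 3)
weight = expected_weight(clauses, assign)   # all vars set => actual weight
assert weight >= 0.5 * sum(w for w, _ in clauses)   # 2-approximation bound
```

On this instance the method finds x1 = true, x2 = false, x3 = true, satisfying every clause (weight 14).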
Maximum Satisfiability: Johnson's Algorithm

Proof. Consider the random variable Yj = 1 if clause Cj is satisfied, 0 otherwise. Let W = Σj wj Yj, let OPT = weight of the optimal assignment, and let ℓj be the number of distinct literals in clause Cj.

E[W] = Σj wj E[Yj]                           (linearity of expectation)
     = Σj wj Pr[clause Cj is satisfied]
     = Σj wj (1 − (½)^ℓj)
     ≥ ½ Σj wj
     ≥ ½ OPT                                 (weights are ≥ 0)
Maximum Satisfiability: Johnson's Algorithm

Corollary. If every clause has at least k literals, Johnson's algorithm is a 1 / (1 − (½)^k)-approximation algorithm.
– 8/7-approximation algorithm for MAX E3SAT.

Theorem (Håstad, 1997). If MAX E3SAT has a ρ-approximation for ρ < 8/7, then P = NP.
– Johnson's algorithm is best possible in some cases.
Maximum Satisfiability: Randomized Rounding

Idea 1. Use biased coin flips, not 50-50.
Idea 2. Solve a linear program to determine the coin biases.

Pj = indices of variables that occur un-negated in clause Cj.
Nj = indices of variables that occur negated in clause Cj.

In the intended integral solution, yi = 1 if xi is true (0 otherwise), and zj = 1 if clause Cj is satisfied (0 otherwise).

(LP)  max  Σj wj zj
      s.t. Σ_{i ∈ Pj} yi + Σ_{i ∈ Nj} (1 − yi) ≥ zj    for each clause Cj
           0 ≤ yi ≤ 1, 0 ≤ zj ≤ 1

Randomized rounding: solve (LP), then set each xi true with probability yi*, independently.

Theorem (Goemans-Williamson, 1994). The algorithm is an e / (e − 1) ≈ 1.582-approximation algorithm.
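The heart of the analysis, worked out on the following slides, is the per-clause inequality Pr[Cj satisfied] ≥ (1 − 1/e) zj*. For a clause with all-positive literals this can be spot-checked numerically over a grid of biases (a sanity check of the bound, not a proof):

```python
from itertools import product
from math import e

# For a clause x_1 or ... or x_l with biases y_i, the clause fails only
# if every coin comes up false, so Pr[satisfied] = 1 - prod(1 - y_i).
# The LP allows z up to min(1, sum(y_i)); the analysis guarantees
# Pr[satisfied] >= (1 - 1/e) * z for every feasible (y, z).
def pr_satisfied(y):
    p = 1.0
    for yi in y:
        p *= 1 - yi
    return 1 - p

grid = [0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0]
for l in range(1, 4):                     # clause lengths 1..3
    for y in product(grid, repeat=l):
        z = min(1.0, sum(y))              # tightest feasible z
        assert pr_satisfied(y) >= (1 - 1 / e) * z - 1e-12
print("inequality holds on the grid")
```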
Maximum Satisfiability: Randomized Rounding

Fact 1. For any nonnegative a1, ..., ak, the geometric mean is ≤ the arithmetic mean:

(a1 a2 ··· ak)^(1/k) ≤ (a1 + a2 + ··· + ak) / k.

Theorem (Goemans-Williamson, 1994). The algorithm is an e / (e − 1) ≈ 1.582-approximation algorithm for MAX-SAT.

Proof. Consider an arbitrary clause Cj with ℓj literals.

Pr[clause Cj is not satisfied]
  = Π_{i ∈ Pj} (1 − yi*) · Π_{i ∈ Nj} yi*
  ≤ [ ( Σ_{i ∈ Pj} (1 − yi*) + Σ_{i ∈ Nj} yi* ) / ℓj ]^ℓj      (geometric-arithmetic mean)
  = [ 1 − ( Σ_{i ∈ Pj} yi* + Σ_{i ∈ Nj} (1 − yi*) ) / ℓj ]^ℓj
  ≤ (1 − zj* / ℓj)^ℓj                                          (LP constraint)
Maximum Satisfiability: Randomized Rounding

Proof (continued).

Pr[clause Cj is satisfied] ≥ 1 − (1 − zj* / ℓj)^ℓj
                           ≥ [1 − (1 − 1/ℓj)^ℓj] zj*
                           ≥ (1 − 1/e) zj*.

The second inequality holds because f(z) = 1 − (1 − z/ℓj)^ℓj is concave with f(0) = 0 and f(1) = 1 − (1 − 1/ℓj)^ℓj, so f(z) ≥ f(1) · z on [0, 1]. The third holds because (1 − 1/x)^x converges to e^(−1) from below.
Maximum Satisfiability: Randomized Rounding

Proof (continued). From the previous slide, Pr[clause Cj is satisfied] ≥ (1 − 1/e) zj*. Let W = weight of the clauses that are satisfied.

E[W] = Σj wj Pr[clause Cj is satisfied]
     ≥ (1 − 1/e) Σj wj zj*
     = (1 − 1/e) OPT_LP
     ≥ (1 − 1/e) OPT

Corollary. If all clauses have length at most k,

E[W] ≥ [1 − (1 − 1/k)^k] OPT.
Maximum Satisfiability: Best of Two

Observation. The two approximation algorithms are complementary. Johnson's algorithm works best when clauses are long; the LP rounding algorithm works best when clauses are short.

How can we exploit this? Run both algorithms and output the better of the two. Re-analyzing yields a 4/3-approximation algorithm: better performance than either algorithm individually!

Best-of-Two (C1, C2, ..., Cm)
   (x1, W1) ← Johnson(C1, ..., Cm)
   (x2, W2) ← LPround(C1, ..., Cm)
   IF (W1 > W2) RETURN x1
   ELSE RETURN x2
Maximum Satisfiability: Best of Two

Theorem (Goemans-Williamson, 1994). The Best-of-Two algorithm is a 4/3-approximation algorithm for MAX-SAT.

Proof.

E[max(W1, W2)] ≥ E[½ W1 + ½ W2]
  = ½ Σj wj Pr[Cj satisfied by Alg 1] + ½ Σj wj Pr[Cj satisfied by Alg 2]
  ≥ ½ Σj wj (1 − (½)^ℓj) + ½ Σj wj [1 − (1 − 1/ℓj)^ℓj] zj*
  ≥ ½ Σj wj (3/2) zj*                                        (lemma, next slide)
  = ¾ Σj wj zj* = ¾ OPT_LP ≥ ¾ OPT.
Maximum Satisfiability: Best of Two

Lemma. For any integer ℓ ≥ 1,

(1 − (½)^ℓ) + (1 − (1 − 1/ℓ)^ℓ) z* ≥ (3/2) z*.

Proof.
Case 1 (ℓ = 1): ½ + z* ≥ (3/2) z*, since z* ≤ 1.
Case 2 (ℓ = 2): ¾ + ¾ z* ≥ (3/2) z*.
Case 3 (ℓ ≥ 3): (1 − (½)^ℓ) + (1 − (1 − 1/ℓ)^ℓ) z* ≥ 7/8 + (1 − 1/e) z* ≥ 7/8 + (5/8) z* ≥ (3/2) z*.
Maximum Satisfiability: State of the Art

Observation. Can't get better than a 4/3-approximation using our LP. If all weights = 1, then OPT_LP = 4 but OPT = 3 for:

C1 = x1 ∨ x2
C2 = x1 ∨ x̄2
C3 = x̄1 ∨ x2
C4 = x̄1 ∨ x̄2

Lower bound. Unless P = NP, can't do better than 8/7 ≈ 1.142.

Semidefinite programming. 1.275-approximation algorithm; 1.2-approximation algorithm if a certain conjecture is true.

(SDP)  max  C • X
       s.t. Ai • X = bi
            X ⪰ 0    (positive semidefinite)

where C, X, Ai are symmetric n × n matrices and X • Y = Σi Σj xij yij.

Open research problem. A 4/3-approximation algorithm without solving the LP or SDP.
Princeton University • COS 423 • Theory of Algorithms • Spring 2001 • Kevin Wayne
Appendix: Proof of LP Duality Theorem
(P)  max  cᵀx          (D)  min  yᵀb
     s.t. Ax ≤ b            s.t. Aᵀy ≥ c
          x ≥ 0                  y ≥ 0

LP Duality Theorem. For A ∈ ℝ^(m×n), b ∈ ℝ^m, c ∈ ℝ^n, if (P) and (D) are nonempty then max = min.
LP Weak Duality

LP Weak Duality. For A ∈ ℝ^(m×n), b ∈ ℝ^m, c ∈ ℝ^n, if (P) and (D) are nonempty, then max ≤ min.

Proof (easy). Suppose x ∈ ℝ^n is feasible for (P) and y ∈ ℝ^m is feasible for (D).
– x ≥ 0 and yᵀA ≥ cᵀ ⇒ yᵀAx ≥ cᵀx
– y ≥ 0 and Ax ≤ b ⇒ yᵀAx ≤ yᵀb
– combining the two inequalities: cᵀx ≤ yᵀAx ≤ yᵀb ▪
Closest Point

Weierstrass' Theorem. Let X be a compact set, and let f(x) be a continuous function on X. Then min {f(x) : x ∈ X} exists.

Lemma 1. Let X ⊆ ℝ^m be a nonempty closed convex set, and let y ∉ X. Then there exists x* ∈ X with minimum distance from y. Moreover, for all x ∈ X we have (y − x*)ᵀ(x − x*) ≤ 0.

Proof (existence). Define f(x) = ||y − x||. Want to apply Weierstrass:
– f is continuous
– X is closed, but maybe not bounded

X ≠ ∅ ⇒ there exists x' ∈ X. The set X' = {x ∈ X : ||y − x|| ≤ ||y − x'||} is closed and bounded, and min {f(x) : x ∈ X} = min {f(x) : x ∈ X'}.
Closest Point

Proof (moreover). x* has minimum distance ⇒ ||y − x*||² ≤ ||y − x||² for all x ∈ X. By convexity, if x ∈ X then x* + ε(x − x*) ∈ X for all 0 ≤ ε ≤ 1, so

||y − x*||² ≤ ||y − x* − ε(x − x*)||² = ||y − x*||² + ε² ||x − x*||² − 2ε (y − x*)ᵀ(x − x*)

Thus (y − x*)ᵀ(x − x*) ≤ ½ ε ||x − x*||². Letting ε → 0⁺, we obtain the desired result. ▪
Separating Hyperplane Theorem

Separating Hyperplane Theorem. Let X ⊆ ℝ^m be a nonempty closed convex set, and let y ∉ X. Then there exists a hyperplane H = {x ∈ ℝ^m : aᵀx = α}, where a ∈ ℝ^m, that separates y from X:
– aᵀx ≤ α for all x ∈ X
– aᵀy > α

Proof. Let x* be the closest point in X to y.
– By Lemma 1, (y − x*)ᵀ(x − x*) ≤ 0 for all x ∈ X.
Choose a = y − x* ≠ 0 and α = aᵀx*.
– aᵀy = aᵀ(a + x*) = ||a||² + α > α
– if x ∈ X, then aᵀ(x − x*) ≤ 0 ⇒ aᵀx ≤ aᵀx* = α ▪
Fundamental Theorem of Linear Inequalities

Farkas' Theorem (Farkas 1894, Minkowski 1896). For A ∈ ℝ^(m×n), b ∈ ℝ^m, exactly one of the following two systems holds:

(I)   ∃ x ∈ ℝ^n:  Ax = b, x ≥ 0
(II)  ∃ y ∈ ℝ^m:  yᵀA ≤ 0, yᵀb > 0

Proof (not both). Suppose x satisfies (I) and y satisfies (II). Then 0 < yᵀb = yᵀAx ≤ 0, a contradiction.

Proof (at least one). Suppose (I) is infeasible. We will show (II) is feasible. Consider S = {Ax : x ≥ 0}, so that S is closed and convex, and b ∉ S.
– there exists a hyperplane (y ∈ ℝ^m, α) separating b from S: yᵀb > α, and yᵀs ≤ α for all s ∈ S.
0 ∈ S ⇒ α ≥ 0 ⇒ yᵀb > 0.
yᵀAx ≤ α for all x ≥ 0 ⇒ yᵀA ≤ 0, since x can be arbitrarily large. ▪
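A Farkas certificate is easy to check mechanically. As a toy illustration (the matrix is my own, not from the slides), the system below asks for x ≥ 0 with x = 1 and −x = 1 simultaneously, which is impossible, and y = (1, 1) certifies this:

```python
# System (I): Ax = b, x >= 0 with A a 2x1 matrix; clearly infeasible,
# since it demands x = 1 and -x = 1 at once.
A = [[1],
     [-1]]
b = [1, 1]

# Farkas certificate for (II): y^T A <= 0 and y^T b > 0.
y = [1, 1]

m, n = len(A), len(A[0])
yTA = [sum(y[i] * A[i][j] for i in range(m)) for j in range(n)]
yTb = sum(y[i] * b[i] for i in range(m))

assert all(v <= 0 for v in yTA)   # y^T A <= 0
assert yTb > 0                    # y^T b > 0
# Consequently no x >= 0 satisfies Ax = b: it would give
# 0 < y^T b = y^T A x <= 0, the contradiction from the proof above.
```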
Another Theorem of the Alternative

Corollary. For A ∈ ℝ^(m×n), b ∈ ℝ^m, exactly one of the following two systems holds:

(I)   ∃ x ∈ ℝ^n:  Ax ≤ b, x ≥ 0
(II)  ∃ y ∈ ℝ^m:  yᵀA ≥ 0, yᵀb < 0, y ≥ 0

Proof. Define A' = [A | I] ∈ ℝ^(m×(m+n)) and x' = [x | s], where s ∈ ℝ^m. Apply Farkas' Theorem to A', b: exactly one of (I') and (II') is feasible.

(I')   Ax + Is = b, x ≥ 0, s ≥ 0
(II')  yᵀA ≤ 0, yᵀI ≤ 0, yᵀb > 0

(I') is equivalent to (I), and (II') is equivalent to (II) (replace y by −y). ▪
LP Strong Duality

LP Duality Theorem. For A ∈ ℝ^(m×n), b ∈ ℝ^m, c ∈ ℝ^n, if (P) and (D) are nonempty then max = min.

Proof (max ≤ min). Weak LP duality.

Proof (min ≤ max). Suppose max < α. We show min < α.

By the definition of α, system (I) below is infeasible, so (II) is feasible by the Farkas Corollary:

(I)   ∃ x ∈ ℝ^n:  Ax ≤ b, −cᵀx ≤ −α, x ≥ 0
(II)  ∃ y ∈ ℝ^m, z ∈ ℝ:  yᵀA − z cᵀ ≥ 0, yᵀb − zα < 0, y ≥ 0, z ≥ 0
LP Strong Duality

Let (ŷ, ẑ) be a solution to (II).

Case 1 (ẑ = 0). Then {y ∈ ℝ^m : yᵀA ≥ 0, yᵀb < 0, y ≥ 0} is feasible. By the Farkas Corollary, {x ∈ ℝ^n : Ax ≤ b, x ≥ 0} is infeasible, a contradiction, since by assumption (P) is nonempty.

Case 2 (ẑ > 0). Scale (ŷ, ẑ) so that ŷ satisfies (II) and ẑ = 1. The resulting ŷ is feasible for (D) and ŷᵀb < α. ▪