Journal of Graph Algorithms and Applications
http://www.cs.brown.edu/publications/jgaa/
vol. 5, no. 4, pp. 1–27 (2001)
Computing an optimal orientation of a balanced
decomposition tree for linear arrangement problems
Reuven Bar-Yehuda
Computer Science Dept., Technion
Haifa, Israel
http://www.cs.technion.ac.il/~reuven/
Guy Even
Dept. of Electrical Engineering-Systems, Tel-Aviv University
Tel-Aviv, Israel
http://www.eng.tau.ac.il/~guy/
Jon Feldman
Laboratory for Computer Science, Massachusetts Institute of Technology
Cambridge, MA, USA
http://theory.lcs.mit.edu/~jonfeld/
Joseph (Seffi) Naor
Computer Science Dept., Technion
Haifa, Israel
http://www.cs.technion.ac.il/users/wwwb/cgi-bin/facultynew.cgi?Naor.Joseph
Abstract
Divide-and-conquer approximation algorithms for vertex ordering problems partition the vertex set of graphs, compute recursively an ordering of each part, and “glue” the orderings of the parts together. The computed ordering is specified by a decomposition tree that describes the recursive partitioning of the subproblems. At each internal node of the decomposition tree, there is a degree of freedom regarding the order in which the parts are glued together.
Approximation algorithms that use this technique ignore these degrees of freedom, and prove that the cost of every ordering that agrees with the computed decomposition tree is within the range specified by the approximation factor. We address the question of whether an optimal ordering can be efficiently computed among the exponentially many orderings induced by a binary decomposition tree.
We present a polynomial time algorithm for computing an optimal ordering induced by a binary balanced decomposition tree with respect to two problems: Minimum Linear Arrangement (minla) and Minimum Cutwidth (mincw). For 1/3-balanced decomposition trees of bounded degree graphs, the time complexity of our algorithm is O(n^2.2), where n denotes the number of vertices.
Additionally, we present experimental evidence that computing an optimal orientation of a decomposition tree is useful in practice. It is shown, through an implementation for minla, that optimal orientations of decomposition trees can produce arrangements of roughly the same quality as those produced by the best known heuristic, at a fraction of the running time.
Communicated by T. Warnow; submitted July 1998; revised September 2000 and June 2001.
Guy Even was supported in part by Intel Israel LTD and Intel Corp. under a grant awarded in 2000. Jon Feldman did part of this work while visiting Tel-Aviv University.
Bar-Yehuda et al., Orientation of Decomposition Trees, JGAA, 5(4) 1–27 (2001)
1 Introduction
The typical setting in vertex ordering problems in graphs is to find a linear ordering of the vertices of a graph that minimizes a certain objective function [GJ79, A1.3, pp. 199-201]. These vertex ordering problems arise in diverse areas such as: VLSI design [HL99], computational biology [K93], scheduling and constraint satisfaction problems [FD], and linear algebra [R70]. Finding an optimal ordering is usually NP-hard and therefore one resorts to polynomial time approximation algorithms.
Divide-and-conquer is a common approach underlying many approximation algorithms for vertex ordering problems [BL84, LR99, Ha89, RAK91, ENRS00, RR98]. Such approximation algorithms partition the vertex set into two or more sets, compute recursively an ordering of each part, and “glue” the orderings of the parts together. The computed ordering is specified by a decomposition tree that describes the recursive partitioning of the subproblems. At each internal node of the decomposition tree, there is a degree of freedom regarding the order in which the parts are glued together. We refer to determining the order of the parts as assigning an orientation to an internal decomposition tree node since, in the binary case, this is equivalent to deciding which child is the left child and which child is the right child. We refer to orderings that can be obtained by assigning orientations to the internal nodes of the decomposition trees as orderings that agree with the decomposition tree.
Approximation algorithms that use this technique ignore these degrees of freedom, and prove that the cost of every ordering that agrees with the computed decomposition tree is within the range specified by the approximation factor. The questions that we address are whether an optimal ordering can be efficiently computed among the orderings that agree with the decomposition tree computed by the divide-and-conquer approximation algorithms, and whether computing this optimal ordering is a useful technique in practice.
Contribution. We present a polynomial time algorithm for computing an optimal orientation of a balanced binary decomposition tree with respect to two problems: Minimum Linear Arrangement (minla) and Minimum Cutwidth (mincw). Loosely speaking, in these problems, the vertex ordering determines the position of the graph vertices along a straight line with fixed distances between adjacent vertices. In minla, the objective is to minimize the sum of the edge lengths, and in mincw, the objective is to minimize the maximum cut between a prefix and a suffix of the vertices.
The corresponding decision problems for these optimization problems are NP-Complete [GJ79, problems GT42, GT44]. For binary 1/3-balanced decomposition trees of bounded degree graphs, the time complexity of our algorithm is O(n^2.2), and the space complexity is O(n), where n denotes the number of vertices.
This algorithm also lends itself to a simple improvement heuristic for both minla and mincw: Take some ordering π, build a balanced decomposition tree from scratch that agrees with π, and find its optimal orientation. In the context of heuristics, this search can be viewed as generalizing local searches in which only swapping of pairs of vertices is allowed [P97]. Our search space allows swapping of blocks defined by the global hierarchical decomposition of the vertices. Many local searches lack quality guarantees, whereas our algorithm finds the best ordering in an exponential search space.
The complexity of our algorithm is exponential in the depth of the decomposition tree, and therefore, we phrase our results in terms of balanced decomposition trees. The requirement that the binary decomposition tree be balanced does not incur a significant setback for the following reason. The analysis of divide-and-conquer algorithms, which construct a decomposition tree, attaches a cost to the decomposition tree which serves as an upper bound on the cost of all orderings that agree with the decomposition tree. Even et al. [ENRS00] presented a technique for balancing binary decomposition trees. When this balancing technique is applied to minla, the cost of the balanced decomposition tree is at most three times the cost of the unbalanced decomposition tree. In the case of the cutwidth problem, this balancing technique can be implemented so that there is an ordering that agrees with both the unbalanced and the balanced decomposition trees.
Interestingly, our algorithm can be modified to find a worst solution that agrees with a decomposition tree. We were not able to prove a gap between the best and worst orientations of a decomposition tree, and therefore the approximation factors for these vertex ordering problems have not been improved. However, we were able to give experimental evidence that for a particular set of benchmark graphs the gap between the worst and best orientations is roughly a factor of two.
Techniques. Our algorithm can be interpreted as a dynamic programming algorithm. The “table” used by the algorithm has entries 〈t, α〉, where t is a binary decomposition tree node, and α is a binary string of length depth(t), representing the assignments of orientations to the ancestors of t in the decomposition tree. Note that the size of this table is exponential in the
depth of the decomposition tree. If the decomposition tree has logarithmic depth (i.e. the tree is balanced), then the size of the table is polynomial.
The contents of a table entry 〈t, α〉 are as follows. Let M denote the set of leaves of the subtree rooted at t. The vertices in M constitute a contiguous block in every ordering that agrees with the decomposition tree. Assigning orientations to the ancestors of t implies that we can determine the set L of vertices that are placed to the left of M and the set R of vertices that are placed to the right of M. The table entry 〈t, α〉 holds the minimum local cost associated with the block M subject to the orientations α of the ancestors of t. This local cost deals only with edges incident to M and only with the cost that these edges incur within the block M. When our algorithm terminates, the local cost of the root will be the total cost of an optimal orientation, since M contains every leaf of the tree.
Our ability to apply dynamic programming relies on a locality property that enables us to compute a table entry 〈t, α〉 based on four other table entries. Let t1 and t2 be the children of t in the decomposition tree, and let σ ∈ {0, 1} be a possible orientation of t. We show that it is possible to easily compute 〈t, α〉 from the four table entries 〈ti, α · σ〉, where i ∈ {1, 2}, σ ∈ {0, 1}.
The table described above can be viewed as a structure called an orientations tree of a decomposition tree. Each internal node t = 〈t, α〉 of this orientations tree corresponds to a node t of the decomposition tree, and a string α of the orientations of the ancestors of t in the decomposition tree. The children of t are the four children described above, so the value of each node of the orientations tree is locally computable from the values of its four children. Thus, we perform a depth-first search of this tree, and to reduce the space complexity, we do not store the entire tree in memory at once.
Relation to previous work. For minla, Hansen [Ha89] proved that decomposition trees obtained by recursive α-approximate separators yield an O(α · log n) approximation algorithm. Since Leighton and Rao presented an O(log n)-approximate separator algorithm, an O(log² n) approximation follows. Even et al. [ENRS00] gave an approximation algorithm that achieves an approximation factor of O(log n log log n). Rao and Richa [RR98] improved the approximation factor to O(log n). Both algorithms rely on computing a spreading metric by solving a linear program with an exponential number of constraints. For the cutwidth problem, Leighton and Rao [LR99] achieved an approximation factor of O(log² n) by recursive separation. The approximation algorithms of [LR99, Ha89, ENRS00] compute binary 1/3-balanced decomposition trees so as to achieve the approximation factors.

The algorithm of Rao and Richa [RR98] computes a non-binary non-balanced decomposition tree. Siblings in the decomposition tree computed by the algorithm of Rao and Richa are given a linear order which may be reversed (i.e. only two permutations are allowed for siblings). This means that the set of permutations that agree with such decomposition trees is obtained by determining which “sibling orderings” are reversed and which are not. When the depth of the decomposition tree computed by the algorithm of Rao and Richa is super-logarithmic and the tree is non-binary, we cannot apply the orientation algorithm since the balancing technique of [ENRS00] will create dependencies between the orientations that are assigned to roots of disjoint subtrees.
Empirical Results. Our empirical results build on the work of Petit [P97]. Petit collected a set of benchmark graphs and experimented with several heuristics. The heuristic that achieved the best results was Simulated Annealing. We conducted four experiments as follows. First, we computed decomposition trees for the benchmark graphs using the HMETIS graph partitioning package [GK98]. Aside from the random graphs in this benchmark set, we showed a gap of roughly a factor of two between worst and best orientations of the computed decomposition trees. This suggests that finding optimal orientations is practically useful. Second, we computed several decomposition trees for each graph by applying HMETIS several times. Since HMETIS is a randomized algorithm, it computes a different decomposition each time it is invoked. Optimal orientations were computed for each decomposition tree. This experiment showed that our algorithm could be used in conjunction with a partitioning algorithm to compute somewhat costlier solutions than Simulated Annealing at a fraction of the running time. Third, we experimented with the heuristic improvement algorithm suggested by us. We generated a random decomposition tree based on the ordering computed in the second experiment, then found the optimal orientation of this tree. Repeating this process yielded improved results that were comparable with the results of Petit. The running time was still less than that of Simulated Annealing. Finally, we used the best ordering computed as an initial solution for Petit’s Simulated Annealing algorithm. As expected, this produced slightly better results than Petit’s results but required more time due to the platform we used.
Organization. In Section 2, we define the problems of minla and mincw as well as decomposition trees and orientations of decomposition trees. In Section 3, we present the orientation algorithm for minla and mincw. In Section 4 we propose a design of the algorithm with linear space complexity and analyze the time complexity of the algorithm. In Section 5 we describe our experimental work.
2 Preliminaries
2.1 The Problems
Consider a graph G(V, E) with non-negative edge capacities c(e). Let n = |V| and m = |E|. Let [i..j] denote the set {i, i + 1, . . . , j}. A one-to-one function π : V → [1..n] is called an ordering of the vertex set V. We denote the cut between the first i nodes and the rest of the nodes by cut_π(i). Formally,
cut_π(i) = {(u, v) ∈ E : π(u) ≤ i and π(v) > i}.

The capacity of cut_π(i) is denoted by c(cut_π(i)). The cutwidth of an ordering π is defined by

cw(G, π) = max_{i ∈ [1..n−1]} c(cut_π(i)).
The goal in the Minimum Cutwidth Problem (mincw) is to find an orderingπ with minimum cutwidth.
The goal in the Minimum Linear Arrangement Problem (minla) is to find an ordering that minimizes the weighted sum of the edge lengths. Formally, the weighted sum of the edge lengths with respect to an ordering π is defined by:
la(G, π) = Σ_{(u,v) ∈ E} c(u, v) · |π(u) − π(v)|.
The weighted sum of edge lengths can be equivalently defined as:
la(G, π) = Σ_{1 ≤ i < n} c(cut_π(i)).
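To make these definitions concrete, the following sketch (the graph, ordering, and helper names are illustrative, not from the paper) computes cw(G, π) and la(G, π), and checks that the two expressions for la agree:

```python
# Illustrative helpers for the definitions above. Edges are (u, v, capacity)
# triples; pi maps each vertex to a position in 1..n.

def cut_capacity(edges, pi, i):
    """c(cut_pi(i)): total capacity of edges with one endpoint in the first
    i positions and the other endpoint in the remaining positions."""
    return sum(c for (u, v, c) in edges
               if min(pi[u], pi[v]) <= i < max(pi[u], pi[v]))

def cutwidth(edges, pi, n):
    """cw(G, pi): the maximum cut over i in [1..n-1]."""
    return max(cut_capacity(edges, pi, i) for i in range(1, n))

def linear_arrangement(edges, pi):
    """la(G, pi): the weighted sum of edge lengths."""
    return sum(c * abs(pi[u] - pi[v]) for (u, v, c) in edges)

# Example: a 4-cycle with unit capacities, ordered a, b, c, d.
edges = [("a", "b", 1), ("b", "c", 1), ("c", "d", 1), ("d", "a", 1)]
pi = {"a": 1, "b": 2, "c": 3, "d": 4}
la_by_lengths = linear_arrangement(edges, pi)
la_by_cuts = sum(cut_capacity(edges, pi, i) for i in range(1, 4))
assert la_by_lengths == la_by_cuts == 6   # the two definitions coincide
assert cutwidth(edges, pi, 4) == 2
```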
2.2 Decomposition Trees
A decomposition tree of a graph G(V, E) is a rooted binary tree with a mapping of the tree nodes to subsets of vertices as follows. The root is mapped to V, the subsets mapped to every two siblings constitute a partitioning of
the subset mapped to their parent, and leaves are mapped to subsets containing a single vertex. We denote the subset of vertices mapped to a tree node t by V(t). For every tree node t, let T(t) denote the subtree of T, the root of which is t. The inner cut of an internal tree node t is the set of edges in the cut (V(t1), V(t2)), where t1 and t2 denote the children of t. We denote the inner cut of t by in_cut(t).
Every DFS traversal of a decomposition tree induces an ordering of V according to the order in which the leaves are visited. Since in each internal node there are two possible orders in which the children can be visited, it follows that 2^{n−1} orderings are induced by DFS traversals. Each such ordering is specified by determining for every internal node which child is visited first. In “graphic” terms, if the first child is always drawn as the left child, then the induced ordering is the order of the leaves from left to right.
2.3 Orientations and Optimal Orientations
Consider an internal tree node t of a decomposition tree. An orientation of t is a bit that determines which of the two children of t is considered as its left child. Our convention is that when a DFS is performed, the left child is always visited first. Therefore, a decomposition tree, all the internal nodes of which are assigned orientations, induces a unique ordering. We refer to an assignment of orientations to all the internal nodes as an orientation of the decomposition tree.
Consider a decomposition tree T of a graph G(V, E). An orientation of T is optimal with respect to an ordering problem if the cost associated with the ordering induced by the orientation is minimum among all the orderings induced by T.
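For small trees, the set of induced orderings can be enumerated directly. A sketch, under the assumption that a decomposition tree is encoded as nested tuples with vertex names at the leaves (our encoding, not the paper's):

```python
# Enumerating the 2^(n-1) orderings induced by all orientations of the
# internal nodes of a binary decomposition tree.
from itertools import product

def leaves(t):
    """Leaf names of a nested-tuple tree, left to right."""
    return [t] if isinstance(t, str) else leaves(t[0]) + leaves(t[1])

def orderings(t):
    """Yield every leaf order obtainable by orienting the internal nodes."""
    if isinstance(t, str):
        yield [t]
        return
    for left, right in product(orderings(t[0]), orderings(t[1])):
        yield left + right    # orientation 0: first child on the left
        yield right + left    # orientation 1: children swapped

tree = (("a", "b"), ("c", "d"))   # n = 4 leaves, 3 internal nodes
all_orders = {tuple(o) for o in orderings(tree)}
assert len(all_orders) == 8       # 2^(n-1) = 2^3 distinct orderings
assert ("a", "b", "c", "d") in all_orders
```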
3 Computing An Optimal Orientation
In this section we present a dynamic programming algorithm for computing an optimal orientation of a decomposition tree with respect to minla and mincw. We first present an algorithm for minla, and then describe the modifications needed for mincw.
Let T denote a decomposition tree of G(V, E). We describe a recursive algorithm orient(t, α) for computing an optimal orientation of T for minla. We say that a decomposition tree is oriented if all its internal nodes are assigned orientations. The algorithm returns an oriented decomposition tree isomorphic to T. The parameters of the algorithm are a tree node t and an assignment of orientations to the ancestors of t. The orientations of the
ancestors of t are specified by a binary string α whose length equals depth(t). The ith bit in α signifies the orientation of the ith node along the path from the root of T to t.
The vertices of V(t) constitute a contiguous block in every ordering that is induced by the decomposition tree T. The sets of vertices that appear to the left and right of V(t) are determined by the orientations of the ancestors of t (which are specified by α). Given the orientations of the ancestors of t, let L and R denote the sets of vertices that appear to the left and right of V(t), respectively. We call the partition (L, V(t), R) an ordered partition of V.
Consider an ordered partition (L, V(t), R) of the vertex set V and an ordering π of the vertices of V(t). Algorithm orient(t, α) is based on the local cost of edge (u, v) with respect to (L, V(t), R) and π. The local cost applies only to edges incident to V(t) and it measures the length of the projection of the edge on V(t). Formally, the local cost is defined by
local_cost_{L,V(t),R,π}(u, v) =
  c(u, v) · |π(u) − π(v)|     if u, v ∈ V(t)
  c(u, v) · π(u)              if u ∈ V(t) and v ∈ L
  c(u, v) · (|V(t)| − π(u))   if u ∈ V(t) and v ∈ R
  0                           otherwise.
Note that the contribution to the cut corresponding to L ∪ V(t) is not included.
Algorithm orient(t, α) proceeds as follows:
1. If t is a leaf, then return T (t) (a leaf is not assigned an orientation).
2. Otherwise (t is not a leaf), let t1 and t2 denote the children of t. Compute optimal oriented trees for T(t1) and T(t2) for both orientations of t. Specifically,
(a) T^0(t1) = orient(t1, α · 0) and T^0(t2) = orient(t2, α · 0).
(b) T^1(t1) = orient(t1, α · 1) and T^1(t2) = orient(t2, α · 1).
3. Let π^0 denote the ordering of V(t) obtained by concatenating the ordering induced by T^0(t1) and the ordering induced by T^0(t2). Let cost^0 = Σ_{e∈E} local_cost_{L,V(t),R,π^0}(e).
4. Let π^1 denote the ordering of V(t) obtained by concatenating the ordering induced by T^1(t2) and the ordering induced by T^1(t1). Let cost^1 = Σ_{e∈E} local_cost_{L,V(t),R,π^1}(e). (Note that here the vertices of V(t2) are placed first.)
5. If cost^0 < cost^1, the orientation of t is 0. Return the oriented tree T′ the left child of which is the root of T^0(t1) and the right child of which is the root of T^0(t2).
6. Otherwise (cost^0 ≥ cost^1), the orientation of t is 1. Return the oriented tree T′ the left child of which is the root of T^1(t2) and the right child of which is the root of T^1(t1).
The correctness of the orient algorithm is summarized in the following claim, which can be proved by induction.
Claim 1: Suppose that the orientations of the ancestors of t are fixed as specified by the string α. Then, Algorithm orient(t, α) computes an optimal orientation of T(t). When t is the root of the decomposition tree and α is an empty string, Algorithm orient(t, α) computes an optimal orientation of T.
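The steps above can be sketched as a brute-force implementation (not the paper's efficient design, which is the subject of Section 4). Trees are again nested tuples with vertex names at the leaves, and instead of the bit string α we pass the sets L and R it induces, since α is only used to determine them; all helper names are ours:

```python
# Brute-force sketch of Algorithm orient for minla, following steps 1-6.

def leaves(t):
    return [t] if isinstance(t, str) else leaves(t[0]) + leaves(t[1])

def total_local_cost(edges, pi, L, R):
    """Sum of local costs of all edges w.r.t. (L, V(t), R) and pi."""
    n, cost = len(pi), 0
    for u, v, c in edges:
        if v in pi and u not in pi:
            u, v = v, u                      # make u the inside endpoint
        if u not in pi:
            continue                         # edge disjoint from V(t)
        if v in pi:
            cost += c * abs(pi[u] - pi[v])   # both endpoints inside V(t)
        elif v in L:
            cost += c * pi[u]                # projection toward the left
        elif v in R:
            cost += c * (n - pi[u])          # projection toward the right
    return cost

def orient(t, L, R, edges):
    """Optimal ordering of V(t), given vertex sets L / R on its sides."""
    if isinstance(t, str):                   # step 1: a leaf
        return [t]
    t1, t2 = t
    V1, V2 = set(leaves(t1)), set(leaves(t2))
    # step 2: recurse under both orientations of t
    o0 = orient(t1, L, R | V2, edges) + orient(t2, L | V1, R, edges)
    o1 = orient(t2, L, R | V1, edges) + orient(t1, L | V2, R, edges)
    # steps 3-4: local costs of the two concatenations
    costs = [total_local_cost(edges, {v: i + 1 for i, v in enumerate(o)},
                              L, R)
             for o in (o0, o1)]
    # steps 5-6: keep the orientation with the smaller local cost
    return o0 if costs[0] < costs[1] else o1

# Usage: a path a-b-c-d whose tree groups {a,b} and {d,c}.
edges = [("a", "b", 1), ("b", "c", 1), ("c", "d", 1)]
tree = (("a", "b"), ("d", "c"))
order = orient(tree, set(), set(), edges)
la = sum(c * abs(order.index(u) - order.index(v)) for u, v, c in edges)
assert la == 3   # the path laid out in path order: the optimum
```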
An optimal orientation for mincw can be computed by modifying the local cost function. Consider an ordered partition (L, V(t), R) and an ordering π of V(t). Let V_i(t) denote the set {v ∈ V(t) : π(v) ≤ i}. Let E(t) denote the set of edges with at least one endpoint in V(t). Let i ∈ [0..|V(t)|]. The ith local cut with respect to (L, V(t), R) and an ordering π of V(t) is the set of edges defined by,
local_cut_{(L,V(t),R),π}(i) = {(u, v) ∈ E(t) : u ∈ L ∪ V_i(t) and v ∈ (V(t) − V_i(t)) ∪ R}.
The local cutwidth is defined by,

local_cw((L, V(t), R), π) = max_{i ∈ [0..|V(t)|]} Σ_{(u,v) ∈ local_cut_{(L,V(t),R),π}(i)} c(u, v).
The algorithm for computing an optimal orientation of a given decomposition tree with respect to mincw is obtained by computing cost^σ = local_cw((L, V(t), R), π^σ) in steps 3 and 4 for σ = 0, 1.
4 Designing The Algorithm
In this section we propose a design of the algorithm that has linear space complexity. The time complexity is O(n^2.2) if the graph has bounded degree and the decomposition tree is 1/3-balanced.
4.1 Space Complexity: The Orientation Tree
We define a tree, called an orientation tree, that corresponds to the recursion tree of Algorithm orient(t, α). Under the interpretation of this algorithm as a dynamic program, this orientation tree represents the table. The orientation tree corresponding to a decomposition tree T is a “quadary” tree. Every orientation tree node t = 〈t, α〉 corresponds to a decomposition tree node t and an assignment α of orientations to the ancestors of t. Therefore, every decomposition tree node t has 2^{depth(t)} “images” in the orientations tree. Let t1 and t2 be the children of t in the decomposition tree, and let σ ∈ {0, 1} be a possible orientation of t. The four children of 〈t, α〉 are 〈ti, α · σ〉, where i ∈ {1, 2}, σ ∈ {0, 1}. Figure 1 depicts a decomposition tree and the corresponding orientation tree.
The time complexity of the algorithm presented in Section 3 is clearly at least proportional to the size of the orientations tree. The number of nodes in the orientations tree is proportional to Σ_{v∈V} 2^{depth(v)}, where depth(v) is the depth of v in the decomposition tree. This implies the running time is at least quadratic if the tree is perfectly balanced. If we store the entire orientations tree in memory at once, our space requirement is also at least quadratic. However, if we are a bit smarter about how we use space, and never store the entire orientations tree in memory at once, we can reduce the space requirement to linear.
The recursion tree of Algorithm orient(root(T), φ) is isomorphic to the orientation tree. In fact, Algorithm orient(root(T), α) assigns local costs to orientation tree nodes in DFS order. We suggest the following linear space implementation. Consider an orientation tree node t = 〈t, α〉. Assume that when orient(t, α) is called, it is handed a “workspace” tree isomorphic to T(t) which it uses for workspace as well as for returning the optimal orientation. Algorithm orient(t, α) allocates an “extra” copy of T(t), and is called recursively for each of its four children. Each call for a child is given a separate “workspace” subtree within the two isomorphic copies of T(t) (i.e. within the workspace and extra trees). Upon completion of these four calls, the two copies of T(t) are oriented; one corresponding to a zero orientation of t and one corresponding to an orientation value of 1 for t. The best of these trees is copied into the “workspace” tree, if needed, and the “extra” tree is freed. Assuming that one copy of the decomposition tree is used throughout the algorithm, the additional space complexity of this implementation satisfies the following recurrence:
space(t) = sizeof(T(t)) + max_{t′ child of t} space(t′).
Figure 1: A decomposition tree and the corresponding orientation tree. The four node decomposition tree is depicted as a gray shadow. The corresponding orientation tree is depicted in the foreground. The direction of each non-root node in the orientation tree corresponds to the orientation of its parent (i.e. a node depicted as a fish swimming to the left signifies that its parent’s orientation is left). The label of an edge entering a node in the orientation tree signifies whether the node is a left child or a right child and the orientation of its parent.
Since sizeof(T(t)) ≤ 2^{depth(T(t))}, it follows that

space(t) ≤ 2 · 2^{depth(T(t))}.
The space complexity of the proposed implementation of orient(T, α) is summarized in the following claim.
Claim 2: The space complexity of the proposed implementation is O(2^{depth(T)}). If the decomposition tree is balanced, then space(root(T)) = O(n).
4.2 Time Complexity
Viewing Algorithm orient(t, α) as a DFS traversal of the orientation tree corresponding to T(t) also helps in designing the algorithm so that each visit of an internal orientation tree node requires only constant time. Suppose that each child of t is assigned a local cost (i.e. cost(left(σ)), cost(right(σ)), for σ = 0, 1). Let t1 and t2 denote the children of t. Let (L, V(t), R) denote the ordered partition of V induced by the orientations of the ancestors of t specified by α. The following equations define cost^0 and cost^1:
cost^0 = cost(left(0)) + cost(right(0)) + |V(t2)| · c(V(t1), R) + |V(t1)| · c(L, V(t2))     (1)
cost^1 = cost(left(1)) + cost(right(1)) + |V(t1)| · c(V(t2), R) + |V(t2)| · c(L, V(t1))
We now describe how Equation 1 can be computed in constant time. Let in_cut(t) denote the cut (V(t1), V(t2)). The capacity of in_cut(t) can be pre-computed by scanning the list of edges. For every edge (u, v), update in_cut(lca(u, v)) by adding c(u, v) to it.
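A simple, unoptimized sketch of this precomputation (names and encoding are ours; each LCA query here walks the nested-tuple tree directly, rather than in the O(m · depth(T)) time achieved in Section 4.2):

```python
# Accumulating inner-cut capacities at least common ancestors.

def contains(t, x):
    """Is leaf x in the nested-tuple tree t?"""
    return t == x if isinstance(t, str) else (contains(t[0], x)
                                              or contains(t[1], x))

def lca(t, u, v):
    """Least common ancestor (as a subtree) of leaves u and v in t."""
    if isinstance(t, str):
        return t
    for child in t:
        if contains(child, u) and contains(child, v):
            return lca(child, u, v)       # both endpoints in one child
    return t                              # endpoints split: t is the LCA

def inner_cuts(tree, edges):
    """Capacity of in_cut at each internal node, keyed by the subtree."""
    cuts = {}
    for u, v, c in edges:
        node = lca(tree, u, v)
        cuts[node] = cuts.get(node, 0) + c
    return cuts

# Usage: path a-b-c-d under the tree ((a,b),(c,d)).
tree = (("a", "b"), ("c", "d"))
edges = [("a", "b", 1), ("b", "c", 2), ("c", "d", 3)]
cuts = inner_cuts(tree, edges)
assert cuts[tree] == 2          # only edge (b, c) crosses the root's cut
```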
We now define outer cuts. Consider the ordered partition (L, V(t), R) corresponding to an orientation tree node t. The left outer cut and the right outer cut of t are the cuts (L, V(t)) and (V(t), R), respectively. We denote the left outer cut by left_cut(t) and the right outer cut by right_cut(t).
We describe how outer cuts are computed for leaves and for interior nodes. The capacities of the outer cuts of a leaf t are computed by considering the edges incident to t (since t is a leaf we identify it with a vertex in V). For every edge (t, u), the orientation of the orientation tree node along the path from the root to t that corresponds to lca(t, u) determines whether the edge belongs to the left outer cut or to the right outer cut. Since the least common ancestors in the decomposition tree of the endpoints of every edge
are precomputed when the inner cuts are computed, we can compute the outer cuts of a leaf t in O(deg(t)) time.
The outer cuts of a non-leaf t can be computed from the outer cuts of its children as follows:
c(left_cut(t)) = c(left_cut(left(0))) + c(left_cut(right(0))) − c(in_cut(t))
c(right_cut(t)) = c(right_cut(left(0))) + c(right_cut(right(0))) − c(in_cut(t))
Hence, the capacities of the outer cuts can be computed while the orientation tree is traversed. The following reformulation of Equation 1 shows that cost^0 and cost^1 can be computed in constant time.
cost^0 = cost(left(0)) + cost(right(0)) + |V(t2)| · c(right_cut(t1)) + |V(t1)| · c(left_cut(t2))     (2)
cost^1 = cost(left(1)) + cost(right(1)) + |V(t1)| · c(right_cut(t2)) + |V(t2)| · c(left_cut(t1))
Minimum Cutwidth. An adaptation of Equation 1 for mincw is givenbelow:
cost0 = max {local cw(L, V (t1), V (T2) ∪ R) + c(L, V (t2)), (3)local cw(L ∪ V (t1), V (T2), R) + c(V (t1), R)}
cost1 = max {local cw(L, V (t2), V (T1) ∪ R) + c(L, V (t1)),local cw(L ∪ V (t2), V (T1), R) + c(V (t2), R)}.
These costs can be computed in constant time using the same technique described for minla.
Now we analyze the time complexity of the proposed implementation of Algorithm orient(t, α). We split the time spent by the algorithm into two parts: (1) the pre-computation of the inner cuts and the least common ancestors of the edges, (2) the time spent traversing the orientation tree (interior nodes as well as leaves).
Precomputing the inner cuts and least common ancestors of the edges requires O(m · depth(T)) time, where depth(T) is the maximum depth of the decomposition tree T. Assume that the least common ancestors are stored and need not be recomputed.
The time spent traversing the orientation tree is analyzed as follows. We consider two cases: Leaves - the amount of time spent in a leaf t is linear in
the degree of t. Interior Nodes - the amount of time spent in an interior node t is constant. This implies that the complexity of the algorithm is linear in

|T − leaves(T)| + Σ_{t ∈ leaves(T)} deg(t).
Every node t ∈ T has 2^{depth(t)} “images” in the orientation tree. Therefore, the complexity in terms of the decomposition tree T equals

Σ_{t ∈ T − leaves(T)} 2^{depth(t)} + Σ_{t ∈ leaves(T)} 2^{depth(t)} · deg(t).
If the degrees of the vertices are bounded by a constant, then the complexity is linear in

Σ_{t ∈ T} 2^{depth(t)}.
This quantity is quadratic if T is perfectly balanced, i.e. the vertex subsets are bisected in every internal tree node. We quantify the balance of a decomposition tree as follows:
Definition 1: A binary tree T is ρ-balanced if for every internal node t ∈ T and for every child t′ of t,

ρ · |V(t)| ≤ |V(t′)| ≤ (1 − ρ) · |V(t)|.
The following claim summarizes the time complexity of the algorithm.
Claim 3: If the decomposition tree T is ρ-balanced, then

Σ_{t ∈ T} 2^{depth(t)} ≤ n^β,

where β is the solution to the equation

1/2 = ρ^β + (1 − ρ)^β     (4)
Observe that if ρ ∈ (0, 1/2], then β ≥ 2. Moreover, β increases as ρ increases.

Proof: The quantity which we want to bound is the size of the orientation tree. This quantity satisfies the following recurrence:
f(T(t)) =
  1                          if t is a leaf
  2f(T(t1)) + 2f(T(t2))      otherwise.
Define the function f*_ρ(n) as follows:

f*_ρ(n) = max{f(T) : T is ρ-balanced and has n leaves}.

The function f*_ρ(n) satisfies the following recurrence:

f*_ρ(n) =
  1                                                           if n = 1
  max{2f*_ρ(n′) + 2f*_ρ(n − n′) : n′ ∈ [⌈ρn⌉..(n − ⌈ρn⌉)]}    otherwise.
We prove that f*_ρ(n) ≤ n^β by induction on n. The induction basis, for n = 1, is trivial. The induction step is proven as follows:

f*_ρ(n) = max{2f*_ρ(n′) + 2f*_ρ(n − n′) : n′ ∈ [⌈ρn⌉..(n − ⌈ρn⌉)]}
        ≤ max{2(n′)^β + 2(n − n′)^β : n′ ∈ [⌈ρn⌉..(n − ⌈ρn⌉)]}
        ≤ max{2x^β + 2(n − x)^β : x ∈ [ρn, n − ρn]}
        = 2(ρn)^β + 2(n − ρn)^β
        = 2n^β · (ρ^β + (1 − ρ)^β)
        = n^β

The first line is simply the recurrence that f*_ρ(n) satisfies; the second line follows from the induction hypothesis; in the third line we relax the range over which the maximum is taken; the fourth line is justified by the convexity of x^β + (n − x)^β over the range x ∈ [ρn, n − ρn] when β ≥ 2; in the fifth line we rearrange the terms; and the last line follows from the definition of β. □
We conclude by bounding the time complexity of the orientation algorithm for 1/3-balanced decomposition trees.
Corollary 4: If T is a 1/3-balanced decomposition tree of a bounded-degree graph, then the orientation tree of T has at most n^2.2 leaves.
Proof: The solution of Equation (4) with ρ = 1/3 is β < 2.2. □
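Equation (4) has no closed form for general ρ, but since ρ^β + (1 − ρ)^β is strictly decreasing in β for ρ ∈ (0, 1/2], its root is easy to obtain numerically. The following bisection sketch is ours (the function name is hypothetical):

```python
def solve_beta(rho, tol=1e-12):
    """Solve 1/2 = rho**b + (1 - rho)**b for b (Equation (4)).

    For rho in (0, 1/2] both bases lie in (0, 1], so the left-hand side
    g(b) = rho**b + (1 - rho)**b is non-increasing in b, with g(2) >= 1/2;
    the root therefore lies in [2, inf) and bisection applies."""
    lo, hi = 1.0, 64.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if rho ** mid + (1.0 - rho) ** mid > 0.5:
            lo = mid  # g(mid) still too large: root lies to the right
        else:
            hi = mid
    return (lo + hi) / 2.0
```

For instance, solve_beta(0.5) returns 2.0 and solve_beta(1/3) returns roughly 2.196 < 2.2, consistent with Corollary 4; solve_beta(0.25) is larger still, illustrating that β grows as the trees become less balanced.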
For graphs with unbounded degree, this algorithm runs in time O(m · 2^{depth(T)}), which is O(m · n^{1/log₂(1/(1−ρ))}). Alternatively, a slightly different algorithm achieves a running time of n^β in general (not just for constant degree), but its space requirement is O(n log n). It is based on computing the outer cuts of a leaf of the orientation tree on the way down the recursion. We omit the details.
5 Experiments and Heuristics
Since we are not improving the theoretical approximation ratios for the minla or mincw problems, it is natural to ask whether finding an optimal orientation of a decomposition tree is a useful thing to do in practice. The most comprehensive experimentation on either problem was performed for the minla problem by Petit [P97]. Petit collected a set of benchmark graphs and ran several different algorithms on them, comparing their quality. He found that Simulated Annealing yielded the best results. The benchmark consists of 5 random graphs, 3 "regular" graphs (a hypercube, a mesh, and a binary tree), 3 graphs from finite element discretizations, 5 graphs from VLSI designs, and 5 graphs from graph drawing competitions.
5.1 Gaps between orientations
The first experiment was performed to check whether there is a significant gap between the costs of different orientations of decomposition trees. Conveniently, our algorithm can easily be modified to find the worst possible orientation; every time we compare two local costs to decide on an orientation, we simply take the orientation that achieves the worst local cost. In this experiment, we constructed decomposition trees for all of Petit's benchmark graphs using the balanced graph partitioning program HMETIS [GK98]. HMETIS is a fast heuristic that searches for balanced cuts that are as small as possible. For each decomposition tree, we computed four orientations:
1. Naive orientation - all the decomposition tree nodes are assigned a zero orientation. This is the ordering one would expect from a recursive bisection algorithm that ignores orientations.
2. Random orientation - the orientations of the decomposition tree nodes are chosen randomly to be either zero or one, with equal probability.
3. Best orientation - computed by our orientation algorithm.
4. Worst orientation - computed by a simple modification of our algorithm.
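All four variants are compared by the same objective, the minla cost of the induced ordering, i.e. the sum over edges (u, v) of |π(u) − π(v)|. A minimal evaluator (ours, for illustration only) is:

```python
def minla_cost(order, edges):
    """minla cost of an ordering: the sum, over all edges, of the distance
    between the positions of the two endpoints. `order` lists the vertices
    from left to right; `edges` is a list of (u, v) pairs."""
    pos = {v: i for i, v in enumerate(order)}
    return sum(abs(pos[u] - pos[v]) for u, v in edges)
```

On the path 0-1-2-3, for example, the identity ordering costs 3, while the ordering (0, 2, 1, 3) costs 5.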
The results are summarized in Table 1. On all of the "real-life" graphs, the best orientation had a cost of about half of the worst orientation. Furthermore, the costs on all the graphs were almost as low as those obtained by Petit (Table 5), who performed very thorough and computationally intensive experiments.
Notice that the costs of the "naive" orientations do not compete well with those of Petit. This shows that in this case, using the extra degrees of freedom afforded by the decomposition tree was essential to achieving a good quality solution. These results also motivated further experimentation, to see whether we could achieve comparable or better results using more iterations and the improvement heuristic alluded to earlier.
5.2 Experiment Design
We ran the following experiments on the benchmark graphs:
1. Decompose & Orient. Repeat the following two steps k times, and keep the best ordering found during these k iterations.
(a) Compute a decomposition tree T of the graph. The decomposition is computed by calling the HMETIS graph partitioning program recursively [GK98]. HMETIS accepts a balance parameter ρ. HMETIS is a randomized algorithm, so different executions may output different decomposition trees.
(b) Compute an optimal orientation of T . Let π denote the orderinginduced by the orientation of T .
2. Improvement Heuristic. Repeat the following steps k′ times (or until no improvement is found during 10 consecutive iterations):
(a) Let π0 denote the best ordering π found during the Decompose & Orient step.
(b) Set i = 0.
(c) Compute a random ρ-balanced decomposition tree T′ based on πi. The decomposition tree T′ is obtained by recursively partitioning the blocks in the ordering πi into two contiguous sub-blocks. The partitioning is a random partition that is ρ-balanced.
(d) Compute an optimal orientation of T′. Let πi+1 denote the ordering induced by the orientation of T′. Note that the cost of πi+1 is not greater than the cost of the ordering πi. Increment i and return to Step 2c, unless i = k′ − 1 or no improvement in the cost has occurred for the last 10 iterations.
3. Simulated Annealing. We used the best ordering found by the Heuristic Improvement stage as an input to the Simulated Annealing program of Petit [P97].
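The random ρ-balanced decomposition of Step 2c can be sketched as follows. This illustration is ours, not the paper's code: each contiguous block is split at a position drawn uniformly among the ρ-balanced positions, and very small blocks (where no such position exists) are split in the middle:

```python
import math
import random

def random_balanced_tree(block, rho, rng=random):
    """Recursively split a contiguous block of an ordering into two
    contiguous sub-blocks at a random rho-balanced position; the nested
    (left, right) tuples stand in for the decomposition tree T'."""
    n = len(block)
    if n == 1:
        return block[0]
    # Split sizes in [lo, n - lo] keep both sides rho-balanced; clamp lo
    # to n // 2 so tiny blocks (where ceil(rho * n) > n / 2) still split.
    lo = min(max(1, math.ceil(rho * n)), n // 2)
    k = rng.randint(lo, n - lo)
    return (random_balanced_tree(block[:k], rho, rng),
            random_balanced_tree(block[k:], rho, rng))
```

Flattening the leaves of the resulting tree from left to right recovers the original block, so every orientation of T′ glues these sub-blocks back into a permutation of the same vertices.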
Preliminary experiments were run in order to choose the balance parameter ρ. For graphs with fewer than 1000 nodes, we simply chose the balance parameter that gave the best ordering. For bigger graphs, we also restricted the balance parameter so that the orientation tree would not be too big. The parameters used for the Simulated Annealing program were also chosen based on a few preliminary experiments.
5.3 Experimental Environment
The programs were written in C and compiled with the gcc compiler. They were executed on a dual-processor Intel Pentium III 600 MHz computer running Red Hat Linux. The programs were compiled for a single processor, and therefore ran on only one of the two processors.
5.4 Experimental Results
The following results were obtained:
1. Decompose & Orient. The results of the Decompose & Orient stage are summarized in Table 2. The columns of Table 2 have the following meaning: "UB Factor" - the balance parameter used when HMETIS was invoked; the relation between the balance parameter ρ and the UB Factor is UB Factor = (1/2 − ρ) · 100. "Avg. OT Size" - the average size of the orientation tree corresponding to the decomposition tree computed by HMETIS; this quantity is a good measure of the running time without overheads such as reading and writing of data. "Avg. L.A. Cost" - the average cost of the ordering induced by an optimal orientation of the decomposition tree computed by HMETIS. "Avg. HMETIS Time (sec)" - the average time in seconds required for computing a decomposition tree. "Avg. Orienting Time (sec)" - the average time in seconds required to compute an optimal orientation. "Min L.A. Cost" - the minimum cost of an ordering, among the computed orderings.
2. Heuristic Improvement. The results of the Heuristic Improvement stage are summarized in Table 3. The columns of Table 3 have the following meaning: "Initial Solution" - the cost of the ordering computed by the Decompose & Orient stage; "10 Iterations" - the cost of the ordering obtained after 10 iterations of Heuristic Improvement; "Total Iterations" - the number of iterations that were run until there was no improvement for 10 consecutive iterations; "Avg. Time per Iteration" - the average time for each iteration (random decomposition and orientation); and "Final Cost" - the cost of the ordering computed by the Heuristic Improvement stage.
The same balance parameter was used in the Decompose & Orient stage and in the Heuristic Improvement stage. Since the partition in the Heuristic Improvement stage is chosen randomly such that the balance is never worse than ρ, the decomposition trees obtained were shallower. This fact is reflected in the shorter running times required for computing an optimal orientation.
3. Simulated Annealing. The results of the Simulated Annealing programare summarized in Table 4.
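The "UB Factor" column in Table 2 converts to the balance parameter ρ via the relation UB Factor = (1/2 − ρ) · 100 quoted above; a one-line helper (ours, with a hypothetical name) makes the correspondence explicit:

```python
def ub_factor_to_rho(ub_factor):
    """Invert UB Factor = (1/2 - rho) * 100; e.g. UB Factor 10 -> rho = 0.4."""
    return 0.5 - ub_factor / 100.0
```

So the UB Factors 10, 15, and 20 used in the experiments correspond to ρ = 0.40, 0.35, and 0.30 respectively.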
5.5 Conclusions
Table 5 summarizes our results and compares them with the results of Petit [P97]. The running times in the table refer to a single iteration (the running time of the Decompose & Orient stage refers only to orientation, not including decomposition using HMETIS). Petit ran 10-100 iterations on an SGI Origin 2000 computer with 32 MIPS R10000 processors.
Our main experimental conclusion is that running our Decompose & Orient algorithm usually yields results within 5-10% of Simulated Annealing, and is significantly faster. Using our Improvement Heuristic, the results improved to costs comparable with those of Simulated Annealing. Slightly better results were obtained by using our computed ordering as an initial solution for Simulated Annealing.
5.6 Availability of programs and results
Programs and output files can be downloaded from http://www.eng.tau.ac.il/~guy/Projects/Minla/index.html.
Graph     | Naive    | Random   | Worst    | Best     | Best/Worst
----------+----------+----------+----------+----------+-----------
randomA1  | 1012523  | 1010880  | 1042034  | 980631   | 0.94
randomA2  | 6889176  | 6908714  | 6972740  | 6842173  | 0.98
randomA3  | 14780970 | 14767044 | 14826303 | 14709068 | 0.99
randomA4  | 1913623  | 1907549  | 1963952  | 1866510  | 0.95
randomG4  | 210817   | 220669   | 282280   | 157716   | 0.56
hc10      | 523776   | 523776   | 523776   | 523776   | 1.00
mesh33x33 | 55727    | 55084    | 62720    | 35777    | 0.57
bintree10 | 4850     | 5037     | 9021     | 3742     | 0.41
3elt      | 720174   | 729782   | 887231   | 419128   | 0.47
airfoil1  | 521014   | 524201   | 677191   | 337237   | 0.50
whitaker3 | 2173667  | 2423313  | 3233350  | 1524974  | 0.47
c1y       | 91135    | 86362    | 141034   | 64365    | 0.46
c2y       | 122838   | 108936   | 172398   | 81952    | 0.48
c3y       | 152993   | 160114   | 262374   | 131612   | 0.50
c4y       | 146178   | 166764   | 246104   | 120799   | 0.49
c5y       | 121191   | 137291   | 203594   | 103888   | 0.51
gd95c     | 733      | 702      | 953      | 555      | 0.58
gd96a     | 132342   | 136973   | 177945   | 121140   | 0.68
gd96b     | 2538     | 2475     | 2899     | 1533     | 0.53
gd96c     | 695      | 719      | 868      | 543      | 0.63
gd96d     | 3477     | 3663     | 4682     | 2614     | 0.56

Table 1: A comparison of arrangement costs for different orientations of a single decomposition tree for each graph.
Graph     | Nodes | Edges | UB Factor | Avg. OT Size | Avg. L.A. Cost | Avg. HMETIS Time (sec) | Avg. Orienting Time (sec) | Min L.A. Cost
----------+-------+-------+-----------+--------------+----------------+------------------------+---------------------------+--------------
randomA1  | 1000  | 4974  | 15        | 1689096      | 976788         | 15.08                  | 5.32                      | 956837
randomA2  | 1000  | 24738 | 10        | 1430837      | 6829113        | 31.46                  | 17.89                     | 6791813
randomA3  | 1000  | 49820 | 15        | 2682997      | 14677190       | 51.71                  | 70.89                     | 14621489
randomA4  | 1000  | 8177  | 15        | 1868623      | 1870265        | 17.41                  | 8.22                      | 1852192
randomG4  | 1000  | 8173  | 15        | 1952092      | 157157         | 18.07                  | 8.28                      | 156447
hc10      | 1024  | 5120  | 10        | 699051       | 523776         | 13.72                  | 1.98                      | 523776
mesh33x33 | 1089  | 2112  | 10        | 836267       | 35880          | 10.45                  | 1.15                      | 35728
bintree10 | 1023  | 1022  | 10        | 913599       | 3741           | 7.52                   | 0.87                      | 3740
3elt      | 4720  | 13722 | 15        | 34285107     | 423225         | 54.93                  | 60.93                     | 414051
airfoil1  | 4253  | 12289 | 16        | 30715196     | 328936         | 48.45                  | 53.30                     | 325635
whitaker3 | 9800  | 28989 | 10        | 110394027    | 1462858        | 114.50                 | 192.50                    | 1441887
c1y       | 828   | 1749  | 10        | 709463       | 68458          | 7.51                   | 1.51                      | 63803
c2y       | 980   | 2102  | 15        | 1456839      | 82615          | 9.40                   | 5.63                      | 81731
c3y       | 1327  | 2844  | 10        | 1902661      | 133027         | 12.46                  | 4.24                      | 130213
c4y       | 1366  | 2915  | 10        | 1954874      | 120899         | 13.24                  | 3.97                      | 119016
c5y       | 1202  | 2557  | 10        | 1528117      | 100990         | 12.39                  | 3.59                      | 99772
gd95c     | 62    | 144   | 20        | 7322         | 520            | 0.51                   | 0                         | 512
gd96a     | 1096  | 1676  | 10        | 1151024      | 117431         | 9.36                   | 2.08                      | 112551
gd96b     | 111   | 193   | 20        | 38860        | 1502           | 0.77                   | 0.14                      | 1457
gd96c     | 65    | 125   | 20        | 4755         | 544            | 0.49                   | 0                         | 533
gd96d     | 180   | 228   | 15        | 48344        | 2661           | 1.16                   | 0.11                      | 2521

Table 2: Results of the Decompose & Orient stage. 100 iterations were executed for all the graphs, except for 3elt and airfoil1 (60 iterations) and whitaker3 (25 iterations).
Graph     | Initial Solution | 10 Iterations | 20 Iterations | 30 Iterations | Total Iterations | Avg. Time/Iteration (sec) | Final Cost
----------+------------------+---------------+---------------+---------------+------------------+---------------------------+-----------
randomA1  | 956837           | 947483        | 941002        | 935014        | 1048             | 2.63                      | 897910
randomA2  | 6791813          | 6759449       | 6736868       | 6717108       | 500              | 9.84                      | 6604738
randomA3  | 14621489         | 14576923      | 14537348      | 14506843      | 200              | 23.73                     | 14365385
randomA4  | 1852192          | 1833589       | 1821592       | 1812344       | 557              | 3.73                      | 1761666
randomG4  | 156447           | 150477        | 149456        | 149243        | 76               | 3.53                      | 149185
hc10      | 523776           | 523776        | -             | -             | 10               | 2.00                      | 523776
mesh33x33 | 35728            | 35492         | 35325         | 35212         | 152              | 1.28                      | 34845
bintree10 | 3740             | 3724          | 3720          | 3718          | 47               | 0.75                      | 3714
3elt      | 414051           | 400469        | 395594        | 392474        | 724              | 46.49                     | 372107
airfoil1  | 325635           | 313975        | 309998        | 307383        | 630              | 36.94                     | 292761
whitaker3 | 1441887          | 1408337       | 1393045       | 1381345       | 35               | 149.43                    | 1376511
c1y       | 63803            | 63283         | 63085         | 62973         | 80               | 0.73                      | 62903
c2y       | 81731            | 81094         | 80877         | 80679         | 171              | 1.19                      | 80109
c3y       | 130213           | 129255        | 128911        | 128650        | 267              | 1.91                      | 127729
c4y       | 119016           | 118109        | 117690        | 117438       | 160              | 2.14                      | 116621
c5y       | 99772            | 99051         | 98813         | 98661         | 182              | 1.72                      | 98004
gd95c     | 512              | 506           | -             | -             | 14               | 0.00                      | 506
gd96a     | 112551           | 110744        | 109605        | 108887        | 977              | 1.07                      | 102294
gd96b     | 1457             | 1427          | 1424          | 1417          | 38               | 0.02                      | 1417
gd96c     | 533              | 520           | 520           | -             | 20               | 0.00                      | 520
gd96d     | 2521             | 2463          | 2454          | 2449          | 62               | 0.03                      | 2436

Table 3: Results of the Heuristic Improvement stage. Balance parameters equal the UB factors in Table 2.
Graph     | Initial Solution | Total Running Time (sec) | Final Cost
----------+------------------+--------------------------+-----------
randomA1  | 897910           | 2328.14                  | 884261
randomA2  | 6604738          | 191645                   | 6576912
randomA3  | 14365385         | 58771.8                  | 14289214
randomA4  | 1761666          | 10895.6                  | 1747143
randomG4  | 149185           | 8375.88                  | 146996
hc10      | 523776           | 2545.16                  | 523776
mesh33x33 | 34845            | 278.037                  | 33531
bintree10 | 3714             | 59.782                   | 3762
3elt      | 372107           | 5756.52                  | 363204
airfoil1  | 292761           | 4552.14                  | 289217
whitaker3 | 1376511          | 29936.8                  | 1200374
c1y       | 62903            | 204.849                  | 62333
c2y       | 80109            | 298.844                  | 79571
c3y       | 127729           | 553.064                  | 127065
c4y       | 116621           | 579.751                  | 115222
c5y       | 98004            | 445.604                  | 96956
gd95c     | 506              | 1.909                    | 506
gd96a     | 102294           | 268.454                  | 99944
gd96b     | 1417             | 2.923                    | 1422
gd96c     | 520              | 1.175                    | 519
gd96d     | 2436             | 4.163                    | 2409

Table 4: Results of the Simulated Annealing program. The parameters were t0=4, tl=0.1, alpha=0.99, except for whitaker3, airfoil1 and 3elt, for which alpha=0.975, and randomA3, for which tl=1 and alpha=0.8.
          | Petit's Results [P97] | Decompose & Orient stage   | Heuristic Improvement stage | Simulated Annealing stage
Graph     | cost      time (sec)  | cost      %diff  time (sec)| cost      %diff   time (sec)| cost      %diff   time (sec)
----------+-----------------------+----------------------------+-----------------------------+----------------------------
randomA1  | 900992    317         | 956837    6.20%   5        | 897910    -0.34%   3        | 884261    -1.86%   2328
randomA2  | 6584658   481         | 6791813   3.15%   18       | 6604738   0.30%    10       | 6576912   -0.12%   191645
randomA3  | 14310861  682         | 14621489  2.17%   71       | 14365385  0.38%    24       | 14289214  -0.15%   58771
randomA4  | 1753265   346         | 1852192   5.64%   8        | 1761666   0.48%    4        | 1747143   -0.35%   10896
randomG4  | 150490    346         | 156447    3.96%   8        | 149185    -0.87%   4        | 146996    -2.32%   8376
hc10      | 548352    335         | 523776    -4.48%  2        | 523776    -4.48%   2        | 523776    -4.48%   2545
mesh33x33 | 34515     336         | 35728     3.51%   1        | 34845     0.96%    1        | 33531     -2.85%   278
bintree10 | 4069      291         | 3740      -8.09%  1        | 3714      -8.72%   1        | 3762      -7.54%   60
3elt      | 375387    6660        | 414051    10.30%  61       | 372107    -0.87%   46       | 363204    -3.25%   5757
airfoil1  | 288977    5364        | 325635    12.69%  53       | 292761    1.31%    37       | 289217    0.08%    4552
whitaker3 | 1199777   3197        | 1441887   20.18%  193      | 1376511   14.73%   149      | 1200374   0.05%    29937
c1y       | 63854     196         | 63803     -0.08%  2        | 62903     -1.49%   1        | 62333     -2.38%   205
c2y       | 79500     277         | 81731     2.81%   6        | 80109     0.77%    1        | 79571     0.09%    299
c3y       | 124708    509         | 130213    4.41%   4        | 127729    2.42%    2        | 127065    1.89%    553
c4y       | 117254    535         | 119016    1.50%   4        | 116621    -0.54%   2        | 115222    -1.73%   580
c5y       | 102769    416         | 99772     -2.92%  4        | 98004     -4.64%   2        | 96956     -5.66%   446
gd95c     | 509       1           | 512       0.59%   0        | 506       -0.59%   0        | 506       -0.59%   2
gd96a     | 104698    341         | 112551    7.50%   2        | 102294    -2.30%   1        | 99944     -4.54%   268
gd96b     | 1416      3           | 1457      2.90%   0        | 1417      0.07%    0        | 1422      0.42%    3
gd96c     | 519       1           | 533       2.70%   0        | 520       0.19%    0        | 519       0.00%    1
gd96d     | 2393      8           | 2521      5.35%   0        | 2436      1.80%    0        | 2409      0.67%    4

Table 5: Comparison of results and running times. Note: (a) times of Decompose & Orient and Heuristic Improvement are given per iteration; (b) Decompose & Orient time does not include time for recursive decomposition.
6 Acknowledgments
We would like to express our gratitude to Zvika Brakerski for his help with using HMETIS, running the experiments, generating the tables, and writing scripts.
References
[BL84] S.N. Bhatt and F.T. Leighton, "A framework for solving VLSI graph layout problems", JCSS, Vol. 28, pp. 300-343, 1984.

[ENRS00] G. Even, J. Naor, S. Rao, and B. Schieber, "Divide-and-Conquer Approximation Algorithms via Spreading Metrics", Journal of the ACM, Vol. 47, No. 4, pp. 585-616, Jul. 2000.

[ENRS99] G. Even, J. Naor, S. Rao, and B. Schieber, "Fast approximate graph partitioning algorithms", SIAM Journal on Computing, Vol. 28, No. 6, pp. 2187-2214, 1999.

[FD] D. Frost and R. Dechter, "Maintenance scheduling problems as benchmarks for constraint algorithms", to appear in Annals of Mathematics and Artificial Intelligence.

[GJ79] M.R. Garey and D.S. Johnson, "Computers and Intractability: A Guide to the Theory of NP-Completeness", W. H. Freeman, San Francisco, California, 1979.

[GK98] G. Karypis and V. Kumar, "hMeTiS - a hypergraph partitioning package", http://www-users.cs.umn.edu/∼karypis/metis/hmetis/files/manual.ps.

[Ha89] M. Hansen, "Approximation algorithms for geometric embeddings in the plane with applications to parallel processing problems", 30th FOCS, pp. 604-609, 1989.

[HL99] Sung-Woo Hur and John Lillis, "Relaxation and clustering in a local search framework: application to linear placement", Proceedings of the 36th ACM/IEEE Design Automation Conference, pp. 360-366, 1999.
[K93] Richard M. Karp, "Mapping the genome: some combinatorial problems arising in molecular biology", Proceedings of the Twenty-Fifth Annual ACM Symposium on Theory of Computing, pp. 278-285, 1993.
[LR99] F.T. Leighton and S. Rao, "Multicommodity max-flow min-cut theorems and their use in designing approximation algorithms", Journal of the ACM, Vol. 46, No. 6, pp. 787-832, Nov. 1999.

[P97] J. Petit, "Approximation Heuristics and Benchmarkings for the MinLA Problem", Tech. Report LSI-97-41-R, Departament de Llenguatges i Sistemes Informatics, Universitat Politecnica de Catalunya, http://www.lsi.upc.es/dept/techreps/html/R97-41.html, 1997.

[RR98] S. Rao and A. W. Richa, "New approximation techniques for some ordering problems", Proceedings of the 9th Annual ACM-SIAM Symposium on Discrete Algorithms, pp. 211-218, 1998.

[RAK91] R. Ravi, A. Agrawal and P. Klein, "Ordering problems approximated: single processor scheduling and interval graph completion", 18th ICALP, pp. 751-762, 1991.

[R70] D. J. Rose, "Triangulated graphs and the elimination process", J. Math. Anal. Appl., Vol. 32, pp. 597-609, 1970.