CSE 326: Data Structures Lecture #17 Priority Queues Alon Halevy Spring Quarter 2001.
CSE 326: Data Structures
Lecture #17: Priority Queues
Alon Halevy
Spring Quarter 2001
Binary Heap Priority Q Data Structure

[Figure: example binary heap; array form: 2 4 5 7 6 10 8 11 9 12 14 20]
• Heap-order property
  – parent's key is less than children's keys
  – result: minimum is always at the top
• Structure property
  – complete tree with fringe nodes packed to the left
  – result: depth is always O(log n); next open location always known
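The structure property is what lets a binary heap live in a plain array with no pointers. A minimal Python sketch (not the course code) of the 1-based index arithmetic, using the example heap above:

```python
# 1-based indexing, as in the lecture's Heap[1..size]; index 0 unused.
# parent(i) = i // 2, left(i) = 2*i, right(i) = 2*i + 1
heap = [None, 2, 4, 5, 7, 6, 10, 8, 11, 9, 12, 14, 20]

def parent(i): return i // 2
def left(i):   return 2 * i
def right(i):  return 2 * i + 1

# Heap-order check: every node's key is >= its parent's key.
size = len(heap) - 1
assert all(heap[parent(i)] <= heap[i] for i in range(2, size + 1))
```

Because the tree is complete, the "next open location" is simply index size + 1 in the array.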
DeleteMin

[Figures: the example heap before and after removing the root 2, leaving a hole (?) at the top]

pqueue.deleteMin()
Percolate Down

[Figures: the hole percolates down from the root, swapping with the smaller child at each level, until the last element can be dropped into it]
Finally…

[Figure: the finished heap after deleteMin, with 4 at the root]
DeleteMin Code

Object deleteMin() {
  assert(!isEmpty());
  returnVal = Heap[1];
  size--;
  newPos = percolateDown(1, Heap[size+1]);
  Heap[newPos] = Heap[size+1];
  return returnVal;
}

int percolateDown(int hole, Object val) {
  while (2*hole <= size) {
    left = 2*hole;
    right = left + 1;
    if (right <= size && Heap[right] < Heap[left])
      target = right;
    else
      target = left;
    if (Heap[target] < val) {
      Heap[hole] = Heap[target];
      hole = target;
    } else
      break;
  }
  return hole;
}
runtime: O(log n)
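As a cross-check, the same deleteMin / percolateDown logic as a runnable Python sketch (variable names follow the slide; 1-based array with index 0 unused):

```python
def percolate_down(heap, size, hole, val):
    while 2 * hole <= size:
        left = 2 * hole
        right = left + 1
        # pick the smaller child as the target
        if right <= size and heap[right] < heap[left]:
            target = right
        else:
            target = left
        if heap[target] < val:
            heap[hole] = heap[target]   # move child up into the hole
            hole = target
        else:
            break
    return hole

def delete_min(heap):
    assert len(heap) > 1, "empty heap"
    return_val = heap[1]
    last = heap.pop()                   # Heap[size]; size--
    if len(heap) > 1:
        new_pos = percolate_down(heap, len(heap) - 1, 1, last)
        heap[new_pos] = last
    return return_val

h = [None, 2, 4, 5, 7, 6, 10, 8, 11, 9, 12, 14, 20]
assert delete_min(h) == 2
assert h[1] == 4                        # new minimum at the root
```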
Insert

[Figure: the example heap; a hole (?) opens at the next free array position]

pqueue.insert(3)
Percolate Up

[Figures: the hole percolates up from the last position, pulling 10 and then 5 down, until 3 can be dropped in below its parent 2]
Insert Code

void insert(Object o) {
  assert(!isFull());
  size++;
  newPos = percolateUp(size, o);
  Heap[newPos] = o;
}

int percolateUp(int hole, Object val) {
  while (hole > 1 && val < Heap[hole/2]) {
    Heap[hole] = Heap[hole/2];
    hole /= 2;
  }
  return hole;
}
runtime: O(log n) worst case; O(1) on average (an insert percolates up about 1.6 levels)
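The insert / percolateUp pseudocode, rendered as a runnable Python sketch (same 1-based layout as before):

```python
def percolate_up(heap, hole, val):
    # the slide's loop, with the braces that got lost in transcription
    while hole > 1 and val < heap[hole // 2]:
        heap[hole] = heap[hole // 2]    # pull parent down into the hole
        hole //= 2
    return hole

def insert(heap, val):
    heap.append(val)                    # size++; open the next free slot
    new_pos = percolate_up(heap, len(heap) - 1, val)
    heap[new_pos] = val

h = [None, 2, 4, 5, 7, 6, 10, 8, 11, 9, 12, 14, 20]
insert(h, 3)                            # the slide's pqueue.insert(3)
assert h[1] == 2                        # minimum unchanged
assert h[3] == 3                        # 3 percolated past 10 and 5
```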
Performance of Binary Heap

• In practice: binary heaps are much simpler to code, with lower constant-factor overhead

            Binary heap   Binary heap              AVL tree    AVL tree
            worst case    avg case                 worst case  avg case
Insert      O(log n)      O(1)                     O(log n)    O(log n)
                          (percolates 1.6 levels)
Delete Min  O(log n)      O(log n)                 O(log n)    O(log n)
Changing Priorities

• In many applications, the priority of an object in a priority queue may change over time
  – if a job has been sitting in the printer queue for a long time, increase its priority
  – unix "renice"
  – Alon's story
• Must have some (separate) way of finding the position in the queue of the object to change (e.g., a hash table)
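One way to realize that "separate way of finding the position" is a dictionary from item to heap index, updated on every swap. A small sketch (the `IndexedHeap` class is a hypothetical helper, not from the slides; only insert is shown):

```python
class IndexedHeap:
    def __init__(self):
        self.heap = [None]              # 1-based array, index 0 unused
        self.pos = {}                   # item -> current index in self.heap

    def insert(self, item):
        self.heap.append(item)
        hole = len(self.heap) - 1
        # percolate up, keeping the position map in sync with each swap
        while hole > 1 and item < self.heap[hole // 2]:
            parent = self.heap[hole // 2]
            self.heap[hole] = parent
            self.pos[parent] = hole
            hole //= 2
        self.heap[hole] = item
        self.pos[item] = hole

pq = IndexedHeap()
for job in [7, 3, 9]:
    pq.insert(job)
assert pq.pos[3] == 1                   # minimum sits at the root
```

With this map, decreaseKey and friends can look up an object's position in O(1) instead of searching the whole heap.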
Other Priority Queue Operations

• decreaseKey
  – given the position of an object in the queue, reduce its priority value
• increaseKey
  – given the position of an object in the queue, increase its priority value
• remove
  – given the position of an object in the queue, remove it
• buildHeap
  – given a set of items, build a heap
DecreaseKey, IncreaseKey, and Remove

void decreaseKey(int pos, int delta) {
  temp = Heap[pos] - delta;
  newPos = percolateUp(pos, temp);
  Heap[newPos] = temp;
}

void increaseKey(int pos, int delta) {
  temp = Heap[pos] + delta;
  newPos = percolateDown(pos, temp);
  Heap[newPos] = temp;
}

void remove(int pos) {
  percolateUp(pos, NEG_INF_VAL);
  deleteMin();
}
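These three operations, translated to a runnable Python sketch for integer keys (NEG_INF_VAL becomes `float('-inf')`; the percolate helpers are restated so the block is self-contained):

```python
NEG_INF = float('-inf')

def percolate_up(heap, hole, val):
    while hole > 1 and val < heap[hole // 2]:
        heap[hole] = heap[hole // 2]
        hole //= 2
    return hole

def percolate_down(heap, hole, val):
    size = len(heap) - 1
    while 2 * hole <= size:
        left, right = 2 * hole, 2 * hole + 1
        target = right if right <= size and heap[right] < heap[left] else left
        if heap[target] < val:
            heap[hole] = heap[target]
            hole = target
        else:
            break
    return hole

def delete_min(heap):
    val = heap[1]
    last = heap.pop()
    if len(heap) > 1:
        heap[percolate_down(heap, 1, last)] = last
    return val

def decrease_key(heap, pos, delta):
    temp = heap[pos] - delta
    heap[percolate_up(heap, pos, temp)] = temp

def increase_key(heap, pos, delta):
    temp = heap[pos] + delta
    heap[percolate_down(heap, pos, temp)] = temp

def remove(heap, pos):
    # float the item to the root with a -infinity key, then deleteMin
    heap[percolate_up(heap, pos, NEG_INF)] = NEG_INF
    delete_min(heap)

h = [None, 2, 4, 5, 7, 6]
decrease_key(h, 4, 6)      # 7 - 6 = 1 percolates to the root
assert h[1] == 1
remove(h, 2)               # drop whatever now sits at position 2
assert sorted(h[1:]) == [1, 4, 5, 6]
```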
BuildHeap: Floyd's Method
Thank you, Floyd.

12 5 11 3 10 6 9 4 8 1 7 2

pretend it's a heap and fix the heap-order property!

[Figure: the array drawn as a complete tree, with 12 at the root]

Easy worst case bound: O(n log n)
Easy average case bound: O(n)
Build(this)Heap

[Figures: percolating down each internal node in turn, from position n/2 back to the root, fixes the heap-order property step by step]

Finally…

[Figure: the finished heap, with 1 at the root]

runtime?
Complexity of Build Heap

• Note: the size of a perfect binary tree doubles (+1) with each additional layer
• At most n/4 nodes percolate down 1 level,
  at most n/8 percolate down 2 levels,
  at most n/16 percolate down 3 levels, …

  n/4 · 1 + n/8 · 2 + n/16 · 3 + … = (n/2) · Σ_{i=1}^{log n} i/2^i ≤ (n/2) · Σ_{i≥1} i/2^i = (n/2) · 2 = n

O(n)
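Floyd's method itself is only a few lines. A Python sketch, run on the slide's input array (as reconstructed above):

```python
def build_heap(heap):
    # treat the array as a tree; percolate down every non-leaf node,
    # from position n//2 back to the root (1-based, heap[0] unused)
    size = len(heap) - 1
    for hole in range(size // 2, 0, -1):
        val = heap[hole]
        while 2 * hole <= size:
            left, right = 2 * hole, 2 * hole + 1
            target = right if right <= size and heap[right] < heap[left] else left
            if heap[target] < val:
                heap[hole] = heap[target]
                hole = target
            else:
                break
        heap[hole] = val

h = [None, 12, 5, 11, 3, 10, 6, 9, 4, 8, 1, 7, 2]
build_heap(h)
assert h[1] == 1                                   # minimum at the root
assert all(h[i // 2] <= h[i] for i in range(2, len(h)))
```

Most of the n/2 percolations start near the leaves and travel only a level or two, which is exactly why the sum above collapses to O(n).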
Thinking about Heaps

• Observations
  – finding a child/parent index is a multiply/divide by two
  – operations jump widely through the heap
  – each operation looks at only two new nodes
  – inserts are at least as common as deleteMins
• Realities
  – division and multiplication by powers of two are fast
  – looking at only one new piece of data per cache line is terrible
  – with huge data sets, disk accesses dominate

[Figure: the example heap again, its nodes scattered across memory]
Solution: d-Heaps

• Each node has d children
• Still representable by array
• Good choices for d:
  – optimize performance based on # of inserts/removes
  – choose a power of two for efficiency
  – fit one set of children on a memory page/disk block

[Figure: an example d-heap (d = 3) stored in an array]

What do d-heaps remind us of???
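The array layout generalizes directly. One consistent 1-based scheme (an assumption for illustration; the slides don't pin down the formulas):

```python
# With d children per node and 1-based indexing:
#   children of i: d*(i-1)+2 ... d*i+1,   parent of i: (i-2)//d + 1

def d_children(i, d):
    first = d * (i - 1) + 2
    return range(first, first + d)

def d_parent(i, d):
    return (i - 2) // d + 1

d = 3
assert list(d_children(1, d)) == [2, 3, 4]   # root's children
assert list(d_children(2, d)) == [5, 6, 7]
assert d_parent(5, d) == 2
```

When d is a power of two, the multiply and divide become shifts, which is part of why powers of two are a good choice.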
New Operation: Merge

Given two heaps, merge them into one heap
– first attempt: insert each element of the smaller heap into the larger.
  runtime: O(n log n)
– second attempt: concatenate the heaps' arrays and run buildHeap.
  runtime: O(n)

How about O(log n) time?
Idea: Hang a New Tree

[Figures: hang the two heaps below a new root slot (?)]

Now, just percolate down!

Note: we just gave up the nice structural property of binary heaps!

Problem?

Need some other kind of balance condition…
[Figure: repeatedly merging single nodes this way builds a long, chain-like unbalanced tree]
Leftist Heaps

• Idea: make it so that all the work you have to do in maintaining a heap is in one small part
• Leftist heap:
  – almost all nodes are on the left
  – all the merging work is on the right

Random Definition: Null Path Length

the null path length (npl) of a node is the number of nodes between it and a null in the tree
• npl(null) = -1
• npl(leaf) = 0
• npl(single-child node) = 0

[Figure: a tree with each node labeled by its npl]
Leftist Heap Properties

• Heap-order property
  – parent's priority value is ≤ children's priority values
  – result: minimum element is at the root
• Leftist property
  – null path length of left subtree is ≥ npl of right subtree
  – result: tree is at least as "heavy" on the left as the right

Are leftist trees complete? Balanced?
Leftist tree examples

[Figures: several trees labeled with npl values; the leftist ones satisfy npl(left) ≥ npl(right) at every node, the NOT-leftist one has a node whose right subtree has the larger npl]

every subtree of a leftist tree is leftist, comrade!
Right Path in a Leftist Tree is Short

• If the right path has length at least r, the tree has at least 2^r − 1 nodes
• Proof by induction
  Basis: r = 1. Tree has at least one node: 2^1 − 1 = 1
  Inductive step: assume true for r' < r. The right subtree has a right path of at least r − 1 nodes, so it has at least 2^(r−1) − 1 nodes. The left subtree must also have a right path of length at least r − 1 (otherwise it would contain a null path shorter than the right subtree's, violating the leftist property), so it too has at least 2^(r−1) − 1 nodes. All told, there are at least:
    (2^(r−1) − 1) + (2^(r−1) − 1) + 1 = 2^r − 1 nodes
• So, a leftist tree with n nodes has a right path of at most log n nodes

[Figure: a leftist tree with npl labels illustrating the bound]
Merging Two Leftist Heaps

• merge(T1, T2) returns one leftist heap containing all elements of the two (distinct) leftist heaps T1 and T2

[Figure: T1 has root a with subtrees L1 and R1; T2 has root b with subtrees L2 and R2; suppose a < b. Then a stays at the root, keeps L1, and we recursively merge R1 with all of T2]

Merge Continued

R' = Merge(R1, T2)

[Figure: a's right subtree becomes R'; if npl(R') > npl(L1), swap a's two children to restore the leftist property]

constant work at each merge; recursively traverse the right path of each tree; total work = O(log n)
Operations on Leftist Heaps

• merge with two trees of total size n: O(log n)
• insert with heap size n: O(log n)
  – pretend the node is a size-1 leftist heap
  – insert by merging the original heap with the one-node heap
• deleteMin with heap size n: O(log n)
  – remove and return the root
  – merge the left and right subtrees
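The merge-based operations above fit in a short Python sketch (pointer-based nodes storing key, children, and npl; insert and deleteMin are just calls to merge, as the slide says):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right
        self.npl = 0

def npl(t):
    return -1 if t is None else t.npl

def merge(t1, t2):
    if t1 is None: return t2
    if t2 is None: return t1
    if t2.key < t1.key:
        t1, t2 = t2, t1                  # t1 now has the smaller root
    t1.right = merge(t1.right, t2)       # recursive merge down the right path
    if npl(t1.right) > npl(t1.left):     # restore the leftist property
        t1.left, t1.right = t1.right, t1.left
    t1.npl = npl(t1.right) + 1
    return t1

def insert(root, key):
    return merge(root, Node(key))        # one-node heap, then merge

def delete_min(root):
    return root.key, merge(root.left, root.right)

root = None
for k in [5, 1, 7, 3]:
    root = insert(root, k)
out = []
while root is not None:
    k, root = delete_min(root)
    out.append(k)
assert out == [1, 3, 5, 7]               # comes out in sorted order
```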
Example

[Figures: step-by-step merge of two leftist heaps, with roots 3 and 5; at each step the smaller root survives and the merge recurses on its right subtree, with npl labels shown at each node]
Sewing Up the Example

[Figures: the recursive merges return and the npl values are filled in; on the way back up, children are swapped wherever npl(right) > npl(left)]

Done?
Finally…

[Figure: the final merged leftist heap, after the last child swap at the root restores the leftist property]
Skew Heaps

• Problems with leftist heaps
  – extra storage for npl
  – two-pass merge (with stack!)
  – extra complexity/logic to maintain and check npl
• Solution: skew heaps
  – blind-adjusting version of leftist heaps
  – amortized time for merge, insert, and deleteMin is O(log n)
  – worst case time for all three is O(n)
  – merge always switches children when fixing the right path
  – iterative method has only one pass

What do skew heaps remind us of?
Merging Two Skew Heaps

[Figure: as in the leftist merge, with a < b the root a survives and we merge T2 into a's right subtree R1; but here a's children are ALWAYS swapped afterward, with no npl check]

Notice the old switcheroo!
Example

[Figures: step-by-step skew heap merge of the heaps rooted at 3 and 5; children are swapped at every merged node]
Skew Heap Code

void merge(heap1, heap2) {
  case {
    heap1 == NULL: return heap2;
    heap2 == NULL: return heap1;
    heap1.findMin() < heap2.findMin():
      temp = heap1.right;
      heap1.right = heap1.left;
      heap1.left = merge(heap2, temp);
      return heap1;
    otherwise:
      return merge(heap2, heap1);
  }
}
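The same pseudocode as a runnable Python sketch, with deleteMin built on merge so the behavior can be checked end to end:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def merge(h1, h2):
    if h1 is None: return h2
    if h2 is None: return h1
    if h2.key < h1.key:
        h1, h2 = h2, h1                  # h1 has the smaller root
    # the old switcheroo: new left = merge(h2, old right), new right = old left
    h1.left, h1.right = merge(h2, h1.right), h1.left
    return h1

def delete_min(root):
    return root.key, merge(root.left, root.right)

root = None
for k in [3, 7, 2, 5]:
    root = merge(root, Node(k))          # insert = merge with a one-node heap
assert root.key == 2

out = []
while root is not None:
    k, root = delete_min(root)
    out.append(k)
assert out == [2, 3, 5, 7]
```

No npl field and only one recursive pass; the unconditional swap is what makes the O(log n) cost amortized rather than worst-case.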
Comparing Heaps
• Binary Heaps
• d-Heaps
• Binomial Queues
• Leftist Heaps
• Skew Heaps