CSE 326: Heaps and the Priority Queue ADT
David Kaplan
Dept of Computer Science & Engineering, Autumn 2001
Heaps, CSE 326, Autumn 2001
Back to Queues

Some applications:
- ordering CPU jobs
- simulating events
- picking the next search site

Problems?
- short jobs should go first
- earliest (simulated-time) events should go first
- most promising sites should be searched first
Priority Queue ADT

Priority Queue operations:
- create
- destroy
- insert
- deleteMin
- is_empty

Priority Queue property: for two elements in the queue, x and y, if x has a lower priority value than y, x will be deleted before y.

[Diagram: a queue holding F(7), E(5), D(100), A(4), B(6); insert adds G(9); deleteMin removes C(3), the element with the lowest priority value.]
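The ADT above can be sketched with Python's heapq module (an illustrative wrapper; the class and method names are my own, not from the slides):

```python
import heapq

class PriorityQueue:
    """Minimal priority-queue wrapper over Python's heapq (illustrative)."""
    def __init__(self):
        self._heap = []                       # create

    def insert(self, priority, item):
        heapq.heappush(self._heap, (priority, item))

    def delete_min(self):
        return heapq.heappop(self._heap)      # lowest priority value first

    def is_empty(self):
        return not self._heap

pq = PriorityQueue()
for name, pri in [("F", 7), ("E", 5), ("D", 100), ("A", 4), ("B", 6), ("C", 3)]:
    pq.insert(pri, name)
pq.insert(9, "G")                             # insert G(9)
assert pq.delete_min() == (3, "C")            # deleteMin returns C(3)
```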
Applications of the Priority Q
- Hold jobs for a printer in order of length
- Store packets on network routers in order of urgency
- Simulate events
- Select symbols for compression
- Sort numbers
- Anything greedy
Naïve Priority Q Data Structures

Unsorted list:
- insert: O(1)
- deleteMin: O(n)

Sorted list:
- insert: O(n)
- deleteMin: O(1)
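Both naïve structures can be checked with toy list-based implementations (a sketch, not production code; class names are my own):

```python
import bisect

class UnsortedListPQ:
    """Unsorted list: O(1) insert, O(n) deleteMin."""
    def __init__(self):
        self.data = []
    def insert(self, x):
        self.data.append(x)                  # O(1): append at the end
    def delete_min(self):
        i = self.data.index(min(self.data))  # O(n): scan for the minimum
        return self.data.pop(i)

class SortedListPQ:
    """Sorted list: O(n) insert, O(1) deleteMin (conceptually)."""
    def __init__(self):
        self.data = []
    def insert(self, x):
        bisect.insort(self.data, x)          # O(n): shift to keep order
    def delete_min(self):
        return self.data.pop(0)              # minimum sits at the front

for PQ in (UnsortedListPQ, SortedListPQ):
    pq = PQ()
    for x in [7, 5, 100, 4, 6, 3]:
        pq.insert(x)
    assert pq.delete_min() == 3
```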
Binary Search Tree Priority Q Data Structure (aka BST PQD :-)

[Diagram: a BST containing 2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14.]

- insert: O(log n) on average
- deleteMin: O(log n) (the minimum is the leftmost node)
Binary Heap Priority Q Data Structure

[Diagram: a binary heap with 2 at the root; level 2: 4, 5; level 3: 7, 6, 10, 8; level 4: 11, 9, 15, 14, 13.]

Heap-order property:
- parent's key is less than children's keys
- result: minimum is always at the top

Structure property:
- complete tree: fringe nodes packed to the left
- result: depth is always O(log n); next open location always known

How do we find the minimum?
Nifty Storage Trick

[Diagram: the same heap stored in an array.]

index: 1  2  3  4  5  6   7  8   9  10  11  12
value: 2  4  5  7  6  10  8  11  9  15  14  13

Calculations (index 0 unused):
- children of i: 2i and 2i + 1
- parent of i: i / 2 (integer division)
- root: index 1
- next free: index size + 1 (here, 13)

Note: Walking the array in index order gives us level-order traversal!!!
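The index calculations above can be sketched directly (a Python sketch using the slide's heap; helper names are my own):

```python
# Array-based heap indexing, root at index 1 (index 0 unused).
def left(i):   return 2 * i
def right(i):  return 2 * i + 1
def parent(i): return i // 2

heap = [None, 2, 4, 5, 7, 6, 10, 8, 11, 9, 15, 14, 13]  # the slide's heap
size = 12

assert heap[left(2)] == 7 and heap[right(2)] == 6   # children of 4
assert heap[parent(12)] == 10                       # parent of 13
# next free slot is index size + 1 == 13
# walking heap[1:] in index order is a level-order traversal
```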
BinaryHeap::DeleteMin

pqueue.deleteMin()

[Diagram sequence: deleteMin removes the root 2, leaving a hole at the top; the last element, 13, must be re-inserted. Percolate Down: the hole moves down, pulling up the smaller child at each step (4, then 6); 13 settles where both children are larger. Final heap: 4 at the root; level 2: 6, 5; level 3: 7, 13, 10, 8; level 4: 11, 9, 15, 14.]

Done!
DeleteMin Code

Object deleteMin() {
    assert(!isEmpty());
    returnVal = Heap[1];
    size--;
    newPos = percolateDown(1, Heap[size+1]);
    Heap[newPos] = Heap[size+1];
    return returnVal;
}

int percolateDown(int hole, Object val) {
    while (2*hole <= size) {
        left = 2*hole;
        right = left + 1;
        if (right <= size && Heap[right] < Heap[left])
            target = right;
        else
            target = left;
        if (Heap[target] < val) {
            Heap[hole] = Heap[target];
            hole = target;
        } else
            break;
    }
    return hole;
}

runtime: O(log n)
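A direct Python translation of the deleteMin code above (a sketch, same 1-based array layout; I pass size explicitly rather than keeping it as a member):

```python
def percolate_down(heap, size, hole, val):
    while 2 * hole <= size:
        left = 2 * hole
        right = left + 1
        target = right if right <= size and heap[right] < heap[left] else left
        if heap[target] < val:
            heap[hole] = heap[target]        # bubble the smaller child up
            hole = target
        else:
            break
    return hole

def delete_min(heap, size):
    assert size >= 1
    return_val = heap[1]
    size -= 1
    new_pos = percolate_down(heap, size, 1, heap[size + 1])
    heap[new_pos] = heap[size + 1]           # last element fills the final hole
    return return_val, size

heap = [None, 2, 4, 5, 7, 6, 10, 8, 11, 9, 15, 14, 13]
val, size = delete_min(heap, 12)
assert val == 2
assert heap[1:size + 1] == [4, 6, 5, 7, 13, 10, 8, 11, 9, 15, 14]
```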
BinaryHeap::Insert

pqueue.insert(3)

[Diagram sequence: heap with 2 at the root; level 2: 4, 5; level 3: 7, 6, 10, 8; level 4: 11, 9, 12, 14, 20. A hole opens at the next free position (a child of 10). Percolate Up: 3 is smaller than 10, so 10 drops into the hole; 3 is smaller than 5, so 5 drops as well; 3 is not smaller than 2, so 3 settles as a child of the root.]
Insert Code

void insert(Object o) {
    assert(!isFull());
    size++;
    newPos = percolateUp(size, o);
    Heap[newPos] = o;
}

int percolateUp(int hole, Object val) {
    while (hole > 1 && val < Heap[hole/2]) {
        Heap[hole] = Heap[hole/2];
        hole /= 2;
    }
    return hole;
}

runtime: O(log n) worst case; O(1) on average (percolates ~1.6 levels)
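The insert code translates the same way (a Python sketch, size passed explicitly; it reproduces the slide's insert of 3):

```python
def percolate_up(heap, hole, val):
    while hole > 1 and val < heap[hole // 2]:
        heap[hole] = heap[hole // 2]    # pull the parent down
        hole //= 2
    return hole

def insert(heap, size, val):
    size += 1
    if size == len(heap):
        heap.append(None)               # grow the array if needed
    new_pos = percolate_up(heap, size, val)
    heap[new_pos] = val
    return size

heap = [None, 2, 4, 5, 7, 6, 10, 8, 11, 9, 12, 14, 20]
size = insert(heap, 12, 3)
assert size == 13
assert heap[1:14] == [2, 4, 3, 7, 6, 5, 8, 11, 9, 12, 14, 20, 10]
```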
Performance of Binary Heap

            Binary heap   Binary heap          AVL tree     AVL tree
            worst case    average case         worst case   average case
Insert      O(log n)      O(1) (~1.6 levels)   O(log n)     O(log n)
DeleteMin   O(log n)      O(log n)             O(log n)     O(log n)

In practice: binary heaps are much simpler to code, with lower constant-factor overhead.
Changing Priorities

In many applications the priority of an object in a priority queue may change over time:
- if a job has been sitting in the printer queue for a long time, increase its priority
- unix "renice": a sysadmin may raise the priority of a critical task

Must have some (separate) way to find the position in the queue of the object to change (e.g. a hash table).

No O(log n) find (as with BSTs) – why not? (Heap order says nothing about the relative order of siblings, so searching by key degenerates to a linear scan.)
Other Priority Queue Operations

- decreaseKey: given a pointer to an object in the queue, reduce its priority value
- increaseKey: given a pointer to an object in the queue, increase its priority value
- remove: given a pointer to an object in the queue, remove it
- buildHeap: given a set of items, build a heap
DecreaseKey, IncreaseKey, Remove

void decreaseKey(int obj) {
    assert(size >= obj);
    temp = Heap[obj];
    newPos = percolateUp(obj, temp);
    Heap[newPos] = temp;
}

void increaseKey(int obj) {
    assert(size >= obj);
    temp = Heap[obj];
    newPos = percolateDown(obj, temp);
    Heap[newPos] = temp;
}

void remove(int obj) {
    assert(size >= obj);
    newPos = percolateUp(obj, NEG_INF_VAL);
    Heap[newPos] = NEG_INF_VAL;
    deleteMin();
}

Note: the changeKey functions assume that the key value has already been changed!
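decreaseKey can be sketched in Python the same way (an illustrative version; per the slide's note, the caller has already lowered the key before calling):

```python
def percolate_up(heap, hole, val):
    while hole > 1 and val < heap[hole // 2]:
        heap[hole] = heap[hole // 2]
        hole //= 2
    return hole

def decrease_key(heap, size, pos):
    """Restore heap order after heap[pos] has already been lowered."""
    assert size >= pos
    val = heap[pos]
    heap[percolate_up(heap, pos, val)] = val

heap = [None, 2, 4, 5, 7, 6, 10]
heap[5] = 1                  # the key 6 is lowered to 1 ...
decrease_key(heap, 6, 5)     # ... then heap order is restored
assert heap[1:7] == [1, 2, 5, 7, 4, 10]
```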
BuildHeap (Floyd's Method)

Input array: 12 5 11 3 10 6 9 4 8 1 7 2

Pretend it's a heap and fix the heap-order property!

Thank you, Floyd!

Build(this)Heap

[Diagram sequence: working backward from the last internal node, each internal node is percolated down into its subtree: 6 swaps with 2, 10 swaps with 1, 11 percolates past 2 and then 6, and 5 swaps with 1.]
Finish Build(ing)(this)Heap

[Diagram: the final heap after percolating the root 12 down: 1 at the root; level 2: 3, 2; level 3: 4, 5, 6, 9; level 4: 12, 8, 10, 7, 11.]

Runtime?
Complexity of Build Heap

Note: the size of a perfect binary tree doubles (+1) with each additional layer.

At most n/4 nodes percolate down 1 level, at most n/8 percolate down 2 levels, at most n/16 percolate down 3 levels, ...

Total work ≤ (n/4)·1 + (n/8)·2 + (n/16)·3 + ...
           = Σ_{i=1}^{log n} i · n / 2^{i+1}
           ≤ n · Σ_{i=1}^{∞} i / 2^{i+1} = n    (since Σ_{i≥1} i/2^i = 2)

So buildHeap is O(n).
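Floyd's method follows directly from percolate-down: start at the last internal node (size // 2) and work back to the root (a Python sketch on the slide's input array):

```python
def build_heap(heap, size):
    """Floyd's O(n) buildHeap: fix heap order bottom-up (1-based array)."""
    for i in range(size // 2, 0, -1):
        val = heap[i]
        hole = i
        while 2 * hole <= size:                  # percolate node i down
            left, right = 2 * hole, 2 * hole + 1
            target = right if right <= size and heap[right] < heap[left] else left
            if heap[target] < val:
                heap[hole] = heap[target]
                hole = target
            else:
                break
        heap[hole] = val

heap = [None, 12, 5, 11, 3, 10, 6, 9, 4, 8, 1, 7, 2]   # the slide's input
build_heap(heap, 12)
assert heap[1] == 1                                     # minimum at the root
assert all(heap[i // 2] <= heap[i] for i in range(2, 13))  # heap order holds
```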
Thinking about Heaps

Observations:
- finding a child/parent index is a multiply/divide by two
- operations jump widely through the heap
- each operation looks at only two new nodes
- inserts are at least as common as deleteMins

Realities:
- division and multiplication by powers of two are fast
- looking at only one new piece of data sucks in a whole cache line
- with huge data sets, disk accesses dominate
Solution: d-Heaps

- Each node has d children
- Still representable by an array
- Good choices for d:
  - optimize performance based on # of inserts/removes
  - d = 2^k for efficiency (array index calcs)
  - fit one set of children in a cache line
  - fit one set of children on a memory page/disk block

[Diagram: a 3-heap and its array representation.]

What do d-heaps remind us of???
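The index arithmetic generalizes from the binary case (a sketch with root at index 1, index 0 unused; the formulas are standard, the helper names are my own):

```python
# d-heap index arithmetic: node i's children occupy d consecutive slots.
def children(i, d):
    return range(d * (i - 1) + 2, d * i + 2)   # d children of node i

def parent(i, d):
    return (i - 2) // d + 1

# d = 2 recovers the binary-heap formulas 2i, 2i+1 and i // 2:
assert list(children(2, 2)) == [4, 5]
assert parent(5, 2) == 2
# a 3-heap packs three children per node:
assert list(children(1, 3)) == [2, 3, 4]
assert parent(5, 3) == 2
```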
Merging Heaps

Given two heaps, merge them into one heap.

- first attempt: insert each element of the smaller heap into the larger.
  runtime: O(n log n)
- second attempt: concatenate the heaps' arrays and run buildHeap.
  runtime: O(n)

How about O(log n) time?
Solution: Leftist Heaps

Idea: localize all maintenance work in one small part of the heap.

Leftist heap:
- almost all nodes are on the left
- all the merging work is on the right
Null Path Length

The null path length (NPL) of a node is the number of nodes between it and a null in the tree:
- npl(null) = -1
- npl(leaf) = 0
- npl(single-child node) = 0

[Diagram: a tree with each node labeled by its NPL; leaves are 0, their parents 1, the root 2.]

Another way of looking at it: NPL is the height of the complete subtree rooted at this node.
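The definition can be checked with a few lines of code (an illustrative sketch; the Node class is my own):

```python
class Node:
    def __init__(self, left=None, right=None):
        self.left, self.right = left, right

def npl(node):
    """Null path length: npl(None) == -1; leaves and single-child nodes == 0."""
    if node is None:
        return -1
    return 1 + min(npl(node.left), npl(node.right))

leaf = Node()
assert npl(None) == -1
assert npl(leaf) == 0
assert npl(Node(leaf, None)) == 0               # single-child node
assert npl(Node(Node(leaf, leaf), leaf)) == 1   # shortest path to a null
```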
Leftist Heap Properties

Heap-order property:
- parent's priority value ≤ children's priority values
- result: minimum element is at the root

Leftist property:
- for all nodes, NPL(left subtree) ≥ NPL(right subtree)
- result: the tree is at least as "heavy" on the left as the right

Are leftist trees complete? Balanced? Socialist?
Leftist Tree Examples

[Diagrams: three trees labeled with NPL values. Two satisfy NPL(left) ≥ NPL(right) at every node and are leftist; one has a node whose right subtree has the larger NPL and is NOT leftist.]

every subtree of a leftist tree is leftist, comrade!
Right Path in a Leftist Tree is Short

Theorem: if the right path has length at least r, the tree has at least 2^r - 1 nodes.

Proof by induction:
Basis: r = 1. The tree has at least one node: 2^1 - 1 = 1.
Inductive step: assume the claim holds for all r' < r. The right subtree has a right path of at least r - 1 nodes, so it has at least 2^(r-1) - 1 nodes. The left subtree must also have a right path of at least r - 1 (otherwise its NPL would be smaller than the right subtree's, violating the leftist property), so it too has at least 2^(r-1) - 1 nodes. All told, there are at least:
2^(r-1) - 1 + 2^(r-1) - 1 + 1 = 2^r - 1 nodes.

Corollary: a leftist tree with n nodes has a right path of at most log n nodes.
Merging Two Leftist Heaps

Merge(T1, T2) returns one leftist heap containing all elements of the two (distinct) leftist heaps T1 and T2.

[Diagram: T1 has root a with subtrees L1 and R1; T2 has root b with subtrees L2 and R2. If a < b, the result keeps a as the root with left subtree L1, and recursively merges R1 with T2 to form the new right subtree.]
Merge Continued

R' = Merge(R1, T2)

[Diagram: a with left subtree L1 and right subtree R'. If npl(R') > npl(L1), swap the two children so the leftist property holds.]

runtime: O(log n)
Operations on Leftist Heaps

- merge two trees of total size n: O(log n)
- insert into a heap of size n: O(log n)
  - pretend the node is a size-1 leftist heap; insert by merging the original heap with the one-node heap
- deleteMin with heap of size n: O(log n)
  - remove and return the root; merge the left and right subtrees
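The recursive merge, and the insert and deleteMin built on it, can be sketched in Python (an illustrative min-heap version; class and function names are my own):

```python
class LNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right
        self.npl = 0

def npl(n):
    return n.npl if n else -1

def merge(h1, h2):
    """Recursive leftist-heap merge; all work happens on right paths."""
    if h1 is None: return h2
    if h2 is None: return h1
    if h2.key < h1.key:
        h1, h2 = h2, h1                           # h1 keeps the smaller root
    h1.right = merge(h1.right, h2)
    if npl(h1.right) > npl(h1.left):
        h1.left, h1.right = h1.right, h1.left     # restore leftist property
    h1.npl = npl(h1.right) + 1
    return h1

def insert(h, key):
    return merge(h, LNode(key))                   # one-node heap, then merge

def delete_min(h):
    return h.key, merge(h.left, h.right)          # root out, merge children

h = None
for k in [5, 7, 3, 10, 8, 12, 14]:
    h = insert(h, k)
out = []
while h:
    k, h = delete_min(h)
    out.append(k)
assert out == [3, 5, 7, 8, 10, 12, 14]            # comes out sorted
```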
Merge Example

[Diagram sequence: merging two leftist heaps, one rooted at 3 (with 7, 14, and 8 below it) and one rooted at 5 (with 10 and 12 below it). Since 3 < 5, 3 remains the root and merge recurses on 3's right subtree and the heap rooted at 5, continuing down the right paths until a null is reached.]
Sewing Up the Example

[Diagram sequence: as the recursion unwinds, each node on the merge path gets its new right child attached and its NPL recomputed; wherever npl(right) > npl(left), the children are swapped.]

Done?
Finally

[Diagram: the finished leftist heap, with children swapped where needed so that NPL(left) ≥ NPL(right) at every node.]
Iterative Leftist Merge

Downward pass: merge the right paths of the two heaps.

[Diagram sequence: the two heaps' right paths are merged top-down into a single right path.]

What do we need to do leftist merge iteratively?

Iterative Leftist Merge (part deux)

Upward pass: walk back up the merged right path, fixing up the leftist heap property (swapping children wherever NPL(left) < NPL(right)).

[Diagram sequence: children are swapped bottom-up along the merged right path.]
(One More) Amortized Time

am·or·tize: to write off an expenditure for (office equipment, for example) by prorating it over a certain period.

time: a nonspatial continuum in which events occur in apparently irreversible succession from the past through the present to the future.

am·or·tized time: a running-time limit resulting from writing off expensive runs of an algorithm against multiple cheap runs, usually resulting in a lower overall running time than the worst possible case would indicate.

If M operations take total O(M log N) time, the amortized time per operation is O(log N).
Skew Heaps

Problems with leftist heaps:
- extra storage for NPL
- two-pass merge (with stack!)
- extra complexity/logic to maintain and check NPL

Solution: skew heaps
- blind adjusting version of leftist heaps
- amortized time for merge, insert, and deleteMin is O(log n)
- worst-case time for all three is O(n)
- merge always switches children when fixing the right path
- iterative method has only one pass
Merging Two Skew Heaps

[Diagram: as in the leftist merge, if a < b then a stays at the root and merge recurses on R1 and T2; but a's children are always swapped afterward, with no NPL check.]
Skew Heap Example

[Diagram sequence: merging the same two heaps as in the leftist example (one containing 3, 7, 8, 14; the other containing 5, 10, 12); at each step of the merge the children are swapped unconditionally.]
Skew Heap Code

void merge(heap1, heap2) {
    case {
        heap1 == NULL: return heap2;
        heap2 == NULL: return heap1;
        heap1.findMin() < heap2.findMin():
            temp = heap1.right;
            heap1.right = heap1.left;
            heap1.left = merge(heap2, temp);
            return heap1;
        otherwise:
            return merge(heap2, heap1);
    }
}
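A runnable translation of the pseudocode above (a Python sketch; the node class name is my own, and note there is no NPL field and the swap is unconditional):

```python
class SNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def merge(h1, h2):
    """Skew-heap merge: like leftist merge, but always swap children."""
    if h1 is None: return h2
    if h2 is None: return h1
    if h2.key < h1.key:
        h1, h2 = h2, h1                 # h1 keeps the smaller root
    # blind adjustment: merge into the old right, then swap it to the left
    h1.left, h1.right = merge(h2, h1.right), h1.left
    return h1

h = None
for k in [5, 10, 12, 3, 7, 8, 14]:
    h = merge(h, SNode(k))              # insert = merge with one-node heap
out = []
while h:
    out.append(h.key)
    h = merge(h.left, h.right)          # deleteMin = merge the children
assert out == [3, 5, 7, 8, 10, 12, 14]
```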
Heaps o’ HeapsPros Cons
Binary Heaps Simple, fastBuildHeap
No easy merge
d-Heaps Large data sets More complex
Leftist Heaps Merge More complexExtra storage
Skew Heaps MergeSimple codeGood amortized speed
Binomial Queues
? ? (check out Weiss 6.8)