Theory of Algorithms: Fundamentals of the Analysis of Algorithm Efficiency
The Efficiency of Algorithms
Transcript of The Efficiency of Algorithms
Chapter 7
Chapter Contents
- Motivation
- Measuring an Algorithm's Efficiency
- Big Oh Notation
- Formalities
- Picturing Efficiency
- The Efficiency of Implementations of the ADT List
  - The Array-Based Implementation
  - The Linked Implementation
  - Comparing Implementations
Measuring Algorithm Efficiency
- An algorithm has both time and space requirements, together called its complexity.
- Types of complexity: space complexity and time complexity.
- Analysis of algorithms: measuring the time or space complexity of an algorithm.
- We usually measure time complexity, since it is the more important of the two.
- We cannot compute the actual running time of an algorithm. Instead, we give a function of the problem size that is directly proportional to the time requirement: the growth-rate function.
- This function measures how the time requirement grows as the problem size grows. We usually measure worst-case time.
Measuring Algorithm Efficiency
Three algorithms for computing 1 + 2 + … + n for an integer n > 0
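The slide's figure with the three algorithms is not reproduced in this transcript. As a sketch of what such algorithms typically look like (the class and method names are illustrative, not the textbook's), in Java:

```java
/** Three ways to compute 1 + 2 + ... + n; a sketch of the kind of
 *  algorithms the slide's figure compares (names are illustrative). */
public class SumAlgorithms {
    // Algorithm A: one loop over 1..n, so O(n) additions.
    static long sumA(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            sum += i;
        }
        return sum;
    }

    // Algorithm B: for each i, add 1 a total of i times; the nested
    // loops perform 1 + 2 + ... + n additions, so O(n^2).
    static long sumB(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= i; j++) {
                sum += 1;
            }
        }
        return sum;
    }

    // Algorithm C: the closed form n(n+1)/2; a constant number of
    // operations regardless of n, so O(1).
    static long sumC(int n) {
        return (long) n * (n + 1) / 2;
    }

    public static void main(String[] args) {
        System.out.println(sumA(100));  // 5050
        System.out.println(sumB(100));  // 5050
        System.out.println(sumC(100));  // 5050
    }
}
```

All three return the same value; only the number of operations they perform differs as n grows.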
Measuring Algorithm Efficiency
The number of operations required by the algorithms
Measuring Algorithm Efficiency
The number of operations required by the algorithms as a function of n
Big Oh Notation
Computer scientists use a notation to represent an algorithm's complexity.
To say "Algorithm A has a worst-case time requirement proportional to n," we say A is O(n), read "Big Oh of n" or "order at most n."
For the other two algorithms: Algorithm B is O(n^2), and Algorithm C is O(1).
Big Oh Notation
The table (an image in the source) tabulates the magnitudes of typical growth-rate functions evaluated at increasing values of n; each function grows in magnitude from left to right.
When analyzing the time efficiency of an algorithm, consider large problems. For small problems, the difference between algorithms' execution times is usually insignificant.
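Since the table itself is only an image in the source, its entries are easy to recompute. A small Java sketch (the column choices are illustrative) that prints typical growth-rate magnitudes:

```java
/** Prints magnitudes of typical growth-rate functions at increasing n.
 *  Column choices are illustrative of the slide's (unreproduced) table. */
public class GrowthRates {
    static double log2(double x) {
        return Math.log(x) / Math.log(2);
    }

    public static void main(String[] args) {
        System.out.printf("%10s %10s %14s %16s%n",
                "n", "log2 n", "n*log2 n", "n^2");
        for (long n = 10; n <= 1_000_000; n *= 10) {
            System.out.printf("%10d %10.2f %14.0f %16d%n",
                    n, log2(n), n * log2(n), n * n);
        }
    }
}
```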
Formalities
Formal mathematical definition of Big Oh:
An algorithm's time requirement f(n) is of order at most g(n), written f(n) = O(g(n)), if a positive real number c and a positive integer N exist such that
f(n) ≤ c·g(n) for all n ≥ N.
Big Oh provides an upper bound on a function's growth rate: c·g(n) is an upper bound on f(n) when n is sufficiently large.
Formalities
An illustration of the definition of Big Oh
Example: show that f(n) = 5n + 3 = O(n).
Take g(n) = n, c = 6, and N = 3; then f(n) ≤ 6·g(n) for all n ≥ 3.
Why not let g(n) = n^2? With g(n) = n^2, c = 8, and N = 1, the definition is also satisfied.
Although that conclusion is correct, it is not as tight as possible.
You want the upper bound to be as small as possible, and you want it to involve simple functions.
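The witness constants in this example can be checked mechanically over a finite range. A minimal sketch (the class and method names are hypothetical):

```java
/** Checks the Big Oh witness constants for f(n) = 5n + 3 with g(n) = n. */
public class BigOhCheck {
    // f(n) = 5n + 3
    static long f(long n) {
        return 5 * n + 3;
    }

    // Returns true if f(n) <= c * n for every n in [start, limit].
    static boolean boundedBy(long c, long start, long limit) {
        for (long n = start; n <= limit; n++) {
            if (f(n) > c * n) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // c = 6, N = 3 works: 5n + 3 <= 6n whenever n >= 3.
        System.out.println(boundedBy(6, 3, 1_000_000));  // true
        // c = 6, N = 1 fails: at n = 1, f(1) = 8 > 6.
        System.out.println(boundedBy(6, 1, 10));         // false
    }
}
```

A loop cannot prove the bound for all n, but 5n + 3 ≤ 6n is equivalent to n ≥ 3, so the check agrees with the algebra.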
Formalities
The following identities hold for Big Oh notation:
- O(k·f(n)) = O(f(n)) for a constant k
- O(f(n)) + O(g(n)) = O(f(n) + g(n))
- O(f(n)) · O(g(n)) = O(f(n) · g(n))
By using these identities and ignoring the smaller terms in a growth-rate function, you can determine the order of complexity with little effort. For example:
O(4n^2 + 50n - 10) = O(4n^2) = O(n^2)
Picturing Efficiency
The body of the loop requires a constant amount of time, O(1); repeating it n times gives an O(n) algorithm.
Picturing Efficiency
An O(n^2) algorithm.
Picturing Efficiency
Another O(n^2) algorithm.
Question?
```java
for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= 5; j++) {
        sum = sum + 1;
    }
}
```
Using Big Oh notation, what is the order of the computation time?
Get a Feel for Growth-rate Functions
The effect of doubling the problem size on an algorithm's time requirement.
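The table is an image in the source, but the doubling effect can be computed directly: for each growth-rate function f, the ratio f(2n)/f(n) tells how the time requirement scales when the problem size doubles. A sketch (class and method names are illustrative):

```java
import java.util.function.DoubleUnaryOperator;

/** Shows how doubling the problem size scales various growth-rate functions. */
public class DoublingEffect {
    // Ratio f(2n) / f(n): the factor by which the time requirement grows
    // when the problem size doubles.
    static double ratio2n(DoubleUnaryOperator f, double n) {
        return f.applyAsDouble(2 * n) / f.applyAsDouble(n);
    }

    public static void main(String[] args) {
        System.out.println(ratio2n(x -> x, 1000));              // 2.0   (linear)
        System.out.println(ratio2n(x -> x * x, 1000));          // 4.0   (quadratic)
        System.out.println(ratio2n(x -> x * x * x, 1000));      // 8.0   (cubic)
        // Exponential: doubling n squares the time, 2^(2n) = (2^n)^2.
        System.out.println(ratio2n(x -> Math.pow(2, x), 20));   // 1048576.0
    }
}
```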
Get a Feel for Growth-rate Functions
The time to process a problem of size one million by algorithms of various orders, at a rate of one million operations per second.
A programmer can use an O(n^2), O(n^3), or O(2^n) algorithm as long as the problem size is small.
Efficiency of Implementations of ADT List
For the array-based implementation:
- Add to end of list: O(1)
- Add to list at a given position: O(n)
For the linked implementation:
- Add to end of list: O(n), or O(1) if a tail reference is kept
- Add to list at a given position: O(n)
- Retrieving an entry: O(n)
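A minimal sketch (not the textbook's actual implementation) of why the array-based operations have these costs: adding at the end fills one free slot, while adding at an interior position must shift later entries right.

```java
import java.util.Arrays;

/** Minimal array-based list sketch illustrating O(1) add-at-end
 *  versus O(n) add-at-position (names are illustrative). */
public class ArrayListSketch {
    private Object[] entries = new Object[10];
    private int size = 0;

    // Adding at the end writes one slot: O(1), amortized over
    // occasional resizing.
    public void addLast(Object entry) {
        ensureCapacity();
        entries[size++] = entry;
    }

    // Adding at position p shifts entries p..size-1 one slot to the
    // right: up to n moves, hence O(n).
    public void add(int position, Object entry) {
        ensureCapacity();
        for (int i = size; i > position; i--) {
            entries[i] = entries[i - 1];  // shift right
        }
        entries[position] = entry;
        size++;
    }

    public Object get(int position) { return entries[position]; }

    public int size() { return size; }

    private void ensureCapacity() {
        if (size == entries.length) {
            entries = Arrays.copyOf(entries, 2 * entries.length);
        }
    }
}
```

In the linked implementation the situation is reversed: no entries move, but reaching a given position requires traversing the chain of nodes, which is O(n).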
Comparing Implementations
The time efficiencies of the ADT list operations for two implementations, expressed in Big Oh notation
Choose an Implementation for an ADT
Consider the operations that your application requires. If it uses a particular operation frequently, that operation's implementation has to be efficient. Conversely, if it rarely uses an operation, you can afford one that has an inefficient implementation.
Typical Growth-rate Functions
- 1: a problem whose time requirement is constant and, therefore, independent of the problem size n.
- log2 n: the time requirement for a logarithmic algorithm increases slowly as the problem size increases. If you square the problem size, you only double its time requirement.
- n: the time requirement for a linear algorithm increases directly with the size of the problem. If you square the problem size, you also square its time requirement.
- n·log2 n: increases more rapidly than a linear algorithm. Such algorithms usually divide a problem into smaller problems that are each solved separately.
- n^2: the time requirement for a quadratic algorithm increases rapidly with the size of the problem. Algorithms that use two nested loops are often quadratic. Such algorithms are practical only for small problems.
- n^3: the time requirement for a cubic algorithm increases more rapidly than that of a quadratic algorithm. Algorithms that use three nested loops are often cubic.
- 2^n: the time requirement for an exponential algorithm usually increases too rapidly to be practical.
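The claims above about logarithmic and linear growth under squaring can be checked numerically. A small sketch (names are illustrative):

```java
/** Numerically checks how squaring the problem size affects
 *  logarithmic and linear time requirements. */
public class GrowthBehavior {
    static double log2(double x) {
        return Math.log(x) / Math.log(2);
    }

    public static void main(String[] args) {
        double n = 1000;
        // Logarithmic: squaring the problem size only doubles the time,
        // since log2(n^2) = 2 * log2(n).
        System.out.println(log2(n * n) / log2(n));  // approximately 2.0
        // Linear: squaring the problem size squares the time,
        // since (n^2) / n = n, a factor equal to the problem size itself.
        System.out.println((n * n) / n);            // 1000.0
    }
}
```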
Exercises
Using Big Oh notation, indicate the time requirement of each of the following tasks in the worst case. Describe any assumptions that you make.
a. After arriving at a party, you shake hands with each person there.
b. Each person in a room shakes hands with everyone else in the room.
c. You climb a flight of stairs.
d. You slide down the banister.
e. After entering an elevator, you press a button to choose a floor.
f. You ride the elevator from the ground floor up to the nth floor.
g. You read a book twice.
Exercises
Suppose that your implementation of a particular algorithm appears in Java as follows:

```java
for (int pass = 1; pass <= n; pass++) {
    for (int index = 0; index < n; index++) {
        for (int count = 1; count < 10; count++) {
            . . .
        } // end for
    } // end for
} // end for
```

The algorithm involves an array of n items. The previous code shows the only repetition in the algorithm, but it does not show the computations that occur within the loops. These computations, however, are independent of n. What is the order of the algorithm?
Exercises
What order is an algorithm that has a growth-rate function of:
a. 8*n^3 - 9*n
b. 7*log2 n + 20
c. 9*log2 n + n
d. n*log2 n + n^2
e. log2(log2 n) + 3*log2 n + 4