Growth Rate of Functions
OK, so this post exists mainly because some people ‘begged’ me to write it. It follows on from the examples you may have – hopefully! – seen in a previous post; if you haven’t, here it is: http://computersciencesource.wordpress.com/2009/09/09/comp10042-computation-time-space-complexity/ .
You saw in those examples a number of different expressions representing the time complexity of different program fragments in terms of the problem size, usually ‘n’. In the table below, we list the values of various integer-valued expressions (division here is integer division, rounding down) for problem sizes ‘n’ in the range 1 to 5:
Problem size n | Alg 1: (n/3) + 3 | Alg 2: 3n + 2 | Alg 3: n²/2 | Alg 4: 2n² - 1 | Alg 5: (n³/3) + 1
--- | --- | --- | --- | --- | ---
n=1 | 3 | 5 | 0 | 1 | 1
n=2 | 3 | 8 | 2 | 7 | 3
n=3 | 4 | 11 | 4 | 17 | 10
n=4 | 4 | 14 | 8 | 31 | 22
n=5 | 4 | 17 | 12 | 49 | 42
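If you want to play with these numbers yourself, the table above can be reproduced with a few lines of Python (a small sketch; the dictionary keys and use of floor division are my own choices, which is why n²/2 gives 0 at n = 1):

```python
# Reproduce the table above. Each expression uses integer (floor) division,
# matching the integer-valued entries in the table: n**2 // 2 is 0 for n = 1.
algs = {
    "(n/3) + 3":   lambda n: n // 3 + 3,
    "3n + 2":      lambda n: 3 * n + 2,
    "n^2/2":       lambda n: n ** 2 // 2,
    "2n^2 - 1":    lambda n: 2 * n ** 2 - 1,
    "(n^3/3) + 1": lambda n: n ** 3 // 3 + 1,
}

for n in range(1, 6):
    row = [f(n) for f in algs.values()]
    print(f"n={n}:", row)
```

Running it prints one row per problem size, matching the table exactly.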
Let’s assume that these expressions are the pseudo-time performance expressions for different algorithms solving a given problem. From the values above, you might conclude that the algorithm with expression (n/3) + 3 is the one to choose, no matter how big the problem. But is this really the case?
It could simply be that we later discover the algorithm is flawed and doesn’t do the right thing – it gives the wrong answers. We would then have to fix it, and perhaps once fixed, we find its new pseudo-time performance is not so great.
Another reason might be that, whilst the algorithm is correct, it is rather greedy in terms of space, and we run out of memory when running problems of larger sizes.
Now, let’s consider the next four expressions. Looking at the algorithm with pseudo-time performance n²/2, you might think it superior to the linear ones such as 3n + 2: for every problem size in the table so far it is cheaper, even though its performance is quadratic. But does that remain true for bigger problems? Let’s add 2 more values of ‘n’:
n | (n/3) + 3 | 3n + 2 | n²/2 | 2n² - 1 | (n³/3) + 1
--- | --- | --- | --- | --- | ---
… | … | … | … | … | …
6 | 5 | 20 | 18 | 71 | 73
7 | 5 | 23 | 24 | 97 | 115
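Notice that in the extended table the linear 3n + 2 becomes cheaper than the quadratic n²/2 at n = 7. A small Python sketch (the function names are my own, chosen to match the algorithm labels) locates that crossover point:

```python
# Find the smallest n at which the linear expression (Alg 2) becomes
# strictly cheaper than the quadratic one (Alg 3).
def alg2(n):
    return 3 * n + 2        # linear: 3n + 2

def alg3(n):
    return n ** 2 // 2      # quadratic: n^2/2, integer division as in the table

crossover = next(n for n in range(1, 100) if alg2(n) < alg3(n))
print(crossover)  # -> 7
```

From n = 7 onwards the quadratic algorithm only falls further behind, however small its values looked at the start.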
We would usually consider the performance of Alg 1 and Alg 2 to be essentially the same; indeed, we would say they are of the same order. Why is that the case? Well, as mentioned briefly above, the performance of Alg 1 grows at the same rate as the performance of Alg 2 as the problem size increases. Between Alg 1 and Alg 2 there is just a constant factor difference, i.e. Alg 2 is, effectively, about 9 times slower than Alg 1, no matter what problem size we consider. On the other hand, if we compare the values for Alg 2 and Alg 4, we can see that the performance difference grows as we increase the problem size. Alg 4 is just over 2 times slower than Alg 2 on a problem of size 4, but at problem size 8 it is around 5 times slower!
What about the performance of Alg 3 and Alg 4? Well, the performances of these two algorithms grow at the same rate: Alg 4 is roughly 4 times slower than Alg 3, whatever the problem size.
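These “same order” claims are easy to check numerically: the ratio Alg 2 / Alg 1 settles near the constant 9, and Alg 4 / Alg 3 near the constant 4, while Alg 4 / Alg 2 keeps growing with n. A quick sketch (helper names are my own; real-valued division is used here so the limits are easy to see):

```python
# Ratios between the expressions distinguish "same order" (ratio tends to a
# constant) from "different order" (ratio grows without bound).
def alg1(n): return n / 3 + 3       # linear
def alg2(n): return 3 * n + 2       # linear
def alg3(n): return n ** 2 / 2      # quadratic
def alg4(n): return 2 * n ** 2 - 1  # quadratic

for n in (10, 100, 1000):
    print(n,
          alg2(n) / alg1(n),   # approaches 9: same (linear) order
          alg4(n) / alg3(n),   # approaches 4: same (quadratic) order
          alg4(n) / alg2(n))   # keeps growing: different orders
```

The first two columns flatten out as n grows; the last one does not, which is exactly what “different order” means.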
Here’s a quick summary of what these tables show:
- The rates of growth of the values of the expressions (x/3) + 3 and 3x + 2 are of the same order (linear).
- The rates of growth of the values of the expressions x²/2 and 2x² - 1 are also of the same order, and grow faster than those mentioned above. The order of these is quadratic.
- The value of the expression (x³/3) + 1 grows at a faster rate than any of the other 4 expressions. The order of this is cubic.
A Small Interlude… :)
A function ‘f’ over one argument ‘x’ is called a polynomial function if it satisfies:
- f(x) = aₙxⁿ + aₙ₋₁xⁿ⁻¹ + ⋯ + a₁x + a₀
for all x, where n is a non-negative integer and aₙ, aₙ₋₁, …, a₁, a₀ are constant coefficients.
A polynomial function ‘f’ is of degree ‘n’ if ‘n’ is the largest integer such that aₙ ≠ 0.
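As a sketch of these definitions, a polynomial can be represented by its list of coefficients [a₀, a₁, …, aₙ]; the helpers below (my own names, not from any library) evaluate it using Horner’s rule and read off its degree:

```python
def poly_eval(coeffs, x):
    """Evaluate a0 + a1*x + ... + an*x^n using Horner's rule."""
    result = 0
    for a in reversed(coeffs):
        result = result * x + a
    return result

def degree(coeffs):
    """Largest n such that a_n != 0 (we return -1 for the zero polynomial)."""
    for n in range(len(coeffs) - 1, -1, -1):
        if coeffs[n] != 0:
            return n
    return -1

# 3x + 2 as a coefficient list [a0, a1] = [2, 3]:
print(poly_eval([2, 3], 5))  # -> 17
print(degree([2, 3, 0]))     # -> 1: a trailing zero coefficient doesn't raise the degree
```

Note how the degree is defined by the largest *non-zero* coefficient, exactly as in the definition above.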
Informally, we then have the following general result about rates of growth:
- All polynomials of degree ‘n’ are said to be of the same order.
- All polynomials of degree ‘n’ have faster rates of growth than all polynomials of degree n-1.
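The second point can be illustrated numerically: a degree-3 polynomial with a small leading coefficient still overtakes a degree-2 polynomial with large coefficients, and the ratio between them grows without bound. A hypothetical pair (both polynomials are made up for illustration):

```python
# A cubic with a tiny leading coefficient versus a quadratic with big ones:
# the cubic still wins eventually, because degree beats coefficients.
def cubic(x):     return x ** 3 / 3 + 1       # degree 3
def quadratic(x): return 100 * x ** 2 + 500   # degree 2

for x in (10, 100, 10_000):
    print(x, cubic(x) / quadratic(x))  # this ratio grows without bound
```

At x = 10 the quadratic is still far ahead; by x = 10,000 the cubic is over 30 times bigger, and the gap only widens.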