Monday 11 January 2010

Algorithms - Complexity

Complexity


Note: You really need to read chapters 1 to 3 of the algorithms textbook as well as reading these notes!

Second note: This will probably be quite short as I'm cutting out all the exposition from the notes and just including the important information.

Analysing Algorithms


For any algorithm, we analyse 2 main things:

  • Correctness

  • Complexity


Complexity is about the resources that the algorithm uses. There are 2 resources it might use:

  • Time (run time)

  • Space (how much memory it uses)


Run time is important because some things can't be delayed, e.g. the software for a fighter jet.

Run time depends heavily on the input, but not just its size. Normally a larger input means a longer run time.

The thing that matters most is the rate of growth of run time with input size.
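
Here's a tiny Python sketch (my own illustration, not from the lecture) that times a simple linear scan on bigger and bigger lists, so you can actually watch the run time grow with the input size n:

```python
# Sketch: time a simple linear scan on inputs of increasing size
# to see run time grow with n.
import time

def linear_scan(items):
    """Touch every element once - run time grows roughly linearly with len(items)."""
    total = 0
    for x in items:
        total += x
    return total

for n in (10_000, 100_000, 1_000_000):
    data = list(range(n))
    start = time.perf_counter()
    linear_scan(data)
    elapsed = time.perf_counter() - start
    print(f"n = {n:>9}: {elapsed:.6f} s")
```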

Functions and Big O


We often describe an algorithm's run time as a function of its input size n. For example, we might say that algorithm A has run time f(n) = 2n.

Now let's say we have 2 functions:

  • f(n) = 10n

  • g(n) = 20n


In Big O notation these are the same: they have the same order (the same power of n), and constant factors are ignored. So we can write both functions in Big O notation as O(n).

If we had h(n) = 2n², in Big O notation that would be O(n²).

If you plot different functions on a graph you'll be able to see which algorithms are fastest for large inputs by looking at how quickly the functions grow.

For example, f(n) = 10n with an input of 50 returns 500, while putting 50 into h(n) = 2n² returns 5000.
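
To make that concrete, here's a quick Python sketch (my own, not from the notes) that evaluates f, g and h at a few values of n; the constant factor between f and g stays fixed, but h pulls away as n grows:

```python
# Sketch: the constant factor (10 vs 20) stays a fixed ratio,
# while the n^2 term pulls away as n grows.
def f(n): return 10 * n      # O(n)
def g(n): return 20 * n      # O(n)
def h(n): return 2 * n ** 2  # O(n^2)

for n in (10, 50, 100, 1000):
    print(f"n = {n:>5}: f = {f(n):>8}, g = {g(n):>8}, h = {h(n):>8}")
```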

Growth Rates in Order



  • log n (slowest growth rate)

  • √n

  • n

  • n²

  • n³

  • 2^n

  • n! (highest growth rate)


The last 2, 2^n and n!, are said to be intractable, meaning that they normally have extremely large run times. As in 'longer-than-the-age-of-the-universe' run times.
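
Here's a rough Python sketch (my own illustration) that tabulates the growth rates above for a few values of n, which makes it obvious why 2^n and n! blow up so badly:

```python
# Sketch: tabulate the growth rates listed above to show why
# 2^n and n! are called intractable - they explode even for small n.
import math

functions = [
    ("log n", lambda n: math.log2(n)),
    ("sqrt n", lambda n: math.sqrt(n)),
    ("n", lambda n: n),
    ("n^2", lambda n: n ** 2),
    ("n^3", lambda n: n ** 3),
    ("2^n", lambda n: 2 ** n),
    ("n!", lambda n: math.factorial(n)),
]

for n in (5, 10, 20, 30):
    parts = [f"{name}={value(n):.3g}" for name, value in functions]
    print(f"n = {n:>2}:  " + "   ".join(parts))
```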

Simplifying Big O


The rule is: find the fastest-growing term and write it in its simplest form (dropping constant factors).

So, O(5n³ + log n) becomes O(n³).
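
As a quick sanity check (my own sketch, not from the notes), here's a bit of Python showing that for large n the log n term contributes almost nothing next to 5n³, and the whole thing is just a constant multiple of n³ — exactly the constant that Big O also throws away:

```python
# Sketch: for large n the log n term is negligible next to 5n^3,
# and the ratio to n^3 settles at the constant 5 that Big O drops.
import math

for n in (10, 100, 1000, 10_000):
    full = 5 * n ** 3 + math.log2(n)
    ratio = full / n ** 3
    print(f"n = {n:>6}: (5n^3 + log n) / n^3 = {ratio:.6f}")
```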

That's all for the first set of notes. I think there are about 6 in total.
