COMPUTATIONAL COMPLEXITY

Computational complexity is the study of how much of a given resource a program uses. The resource in question is usually either space (how much memory) or time (how many basic operations).

Complexity can be looked at from many angles:

Space vs. Time Complexity

In general, the amount of space used is no more than the amount of time used, because space can be reused while time cannot. For most applications we focus on time complexity, since memory tends to be cheap compared to time (these days). Sometimes space is still an issue, particularly when the problem size is very large but the solution has relatively low time complexity. (An example would be sorting very large data sets, on the order of a billion elements.)

A simple space example: adding two arrays of integers. Assume the array size is N and that A and B are the input arrays.

  1. The following code requires N+N+N=3N space:
    int C[N];
    for (int i=0; i < N; i++)
      C[i] = A[i] + B[i];
          
  2. Suppose we no longer need B - we can reuse it. The following code requires only N+N=2N space:
    for (int i=0; i < N; i++)
      B[i] = A[i] + B[i];
          

Running-time analysis

Operations required by Bag members
(n is the number of items currently in the Bag)

  member function                        approximate number of operations
  size_t size()                          1
  void insert(int entry)                 2
  size_t occurrences(int target)         4n + 3
  void grab()   // slow version          3n + 5
  void grab()   // fast version          5
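
As a concrete illustration, here is a minimal sketch of the kind of code behind the occurrences entry. It assumes the Bag keeps its items in a member array named data, with used holding the current item count; these member names are illustrative, not necessarily the ones used in the actual class.

    #include <cstddef>

    // A stripped-down Bag of ints (sketch only; no capacity checking).
    class bag
    {
    public:
        static const std::size_t CAPACITY = 30;
        bag() : used(0) { }
        std::size_t size() const { return used; }          // 1 operation
        void insert(int entry) { data[used++] = entry; }   // about 2 operations

        // Count how many items equal target.
        std::size_t occurrences(int target) const
        {
            std::size_t answer = 0;                 // 1 assignment
            for (std::size_t i = 0; i < used; ++i)  // 1 init, n+1 tests, n increments
                if (data[i] == target)              // n comparisons
                    ++answer;                       // at most n increments
            return answer;                          // 1 return
        }                                           // total: roughly 4n plus a small constant

    private:
        int data[CAPACITY];
        std::size_t used;
    };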


Big-O Notation

Definition: a function f(n) is O(g(n)) if there exist constants k and N such that for all n >= N: f(n) <= k * g(n).

(The notation can be confusing: f = O(g) is read "f is big-oh of g"; the "=" here is not ordinary equality.)

As a simple example, let f(n) = 2n and g(n) = n. Is f = O(g)? Yes: we can pick k = 2 and N = 1, and the definition is easily satisfied since 2n <= 2 * n for all n >= 1. Although simple, this example illustrates a key property of big-oh: it ignores constants.
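
For a slightly bigger worked example (not from the table above), f(n) = 3n + 10 is O(n): pick k = 4 and N = 10; then for every n >= 10 we have 3n + 10 <= 3n + n = 4n = k * n.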

Generally, when we see a statement of the form f(n) = O(g(n)), f(n) is the exact (and often messy) count of operations, while g(n) is a much simpler function; the statement says that, for large enough n, f(n) grows no faster than a constant multiple of g(n).

The table below shows how quickly some functions that commonly appear in big-oh estimates grow:
Growth Rates of Functions

  f(n)       n=1    n=2    n=4    n=10         n=100       n=1,000
  7          7      7      7      7            7           7
  lg(n)      0      1      2      3.3          6.6         9.9
  sqrt(n)    1      1.4    2      3.2          10          32
  n          1      2      4      10           100         1,000
  n^2        1      4      16     100          10,000      1,000,000
  n^3        1      8      64     1,000        1,000,000   10^9
  2^n        2      4      16     1,024        10^30       10^300
  n!         1      2      24     3,628,800    10^158      10^2568
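
As a rough sanity check, the slower-growing rows of the table can be reproduced with a few lines of C++ (a sketch only; the 2^n and n! rows are left out because n! already exceeds double-precision range at n = 1,000):

    #include <cstdio>
    #include <cmath>

    // Prints the slower-growing rows of the growth-rate table for the same values of n.
    int main()
    {
        const double ns[] = {1, 2, 4, 10, 100, 1000};
        std::printf("%8s %8s %9s %14s %16s\n", "n", "lg(n)", "sqrt(n)", "n^2", "n^3");
        for (double n : ns)
            std::printf("%8.0f %8.1f %9.1f %14.0f %16.0f\n",
                        n, std::log2(n), std::sqrt(n), n * n, n * n * n);
        return 0;
    }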

Why `big-oh'?