
6.6 Asymptotic Bounds

When discussing the execution time of an algorithm, we are often sloppy and state that, for problem size n, the algorithm takes not more than n^2 steps, while it may actually run in 7n^2+3n+19 machine instructions. However, in some sense the sequence [n^2]_n already bounds the growth rate of [7n^2+3n+19]_n; the factor 7, the addend 19, and even the term 3n do not contribute to this rate any more. This concept of "asymptotic function bounds" is generally formalized as follows.

Definition 96 (Big O Quantor) For every variable n and terms S and T, the phrase
S = O_n(T)
is a proposition with bound variable n which is read as "S is big O of T" or "T asymptotically dominates (dominiert asymptotisch) S".

Its meaning is equivalent to the proposition

exists c in R, m in N: forall n >= m: |S| <= c*|T|.

Usually the subscript n is dropped and the bound variable has to be deduced from the context.

The proposition S = O_n(T) states that, from a certain point m on and scaled by some factor c, the absolute value of T is at least as large as the absolute value of S. The function [T]_n thus grows at least as fast as [S]_n.

For instance, instead of saying that the execution time T_A of some algorithm A in dependence on the input size n is T_A(n) = 7n^2+3n+19, we may say

T_A(n) = O(n^2),

i.e., that the execution time is asymptotically dominated by the square function, ignoring all constant factors and lower-order terms.
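The definition can be checked numerically for this example. One possible pair of witnesses (our own choice, not from the text) is c = 29 and m = 1: for n >= 1 we have 3n <= 3n^2 and 19 <= 19n^2, hence 7n^2+3n+19 <= 29n^2. A small Python sketch:

```python
# Numerical check of the big O definition for S = 7n^2+3n+19, T = n^2.
# Witness values c = 29, m = 1 are our own choice: for n >= 1,
# 3n <= 3n^2 and 19 <= 19n^2, hence 7n^2 + 3n + 19 <= 29n^2.

def S(n):
    return 7 * n * n + 3 * n + 19

def T(n):
    return n * n

c, m = 29, 1

# The definition only requires the bound from m on; we check a finite prefix.
assert all(abs(S(n)) <= c * abs(T(n)) for n in range(m, 100000))
print("7n^2+3n+19 = O(n^2) with witnesses c=29, m=1 (checked up to n=10^5)")
```

Of course, a finite check is no proof; here the inequality also follows term by term as noted in the comment.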

Sometimes we also see propositions like

S_0 = S_1 + O(T)

which are to be understood as S_0 - S_1 = O(T). For instance, by saying

f(n) = 5n^3 + 2n^2 + O(n)

we state that f(n) differs from 5n^3 + 2n^2 by not more than some linear term.


It is easy to derive a number of laws for the manipulation of O terms.

Proposition 110 (O Manipulation) For every variable n, terms S and T, c in R, and d in R with d >= c, we can replace the following terms on the left hand side by the corresponding terms on the right hand side:
S -> O_n(S),
c + O_n(S) -> O_n(S),
c * O_n(S) -> O_n(S),
O_n(S) + O_n(S) -> O_n(S),
O_n(O_n(S)) -> O_n(S),
O_n(S) + O_n(T) -> O_n(|S| + |T|),
O_n(S) * O_n(T) -> O_n(S * T),
S * O_n(T) -> O_n(S * T),
O_n(S^c) -> O_n(S^d).

Thus we can replace the statement

f(n) = 2n^2 - 7n + 5

via the transformations

2n^2 - 7n + 5 -> 2*O(n^2) - 7*O(n) + 5*O(1) ->
   O(n^2) + O(n) + O(1) -> O(n^2) + O(n^2) + O(n^2) -> O(n^2)

by the statement

f(n) = O(n^2).

This example illustrates that a polynomial sequence is asymptotically dominated by the polynomial term with the highest degree.

Proposition 111 (Asymptotics of Polynomial Sequences) Every polynomial sequence of degree n is asymptotically dominated by [x^n]_x:
forall n in N, a: N_n -> R:
   (sum_{0 <= i <= n} a_i * x^i) = O_x(x^n).
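The standard witness for this proposition is c = sum of the |a_i|: for x >= 1, every power x^i with i <= n is at most x^n, so |sum a_i x^i| <= (sum |a_i|) * x^n. A Python sketch of this check, with an arbitrarily chosen coefficient vector:

```python
# Check that a degree-n polynomial is dominated by x^n, with witnesses
# c = sum of |coefficients| and m = 1 (for x >= 1 we have x^i <= x^n).

a = [5, -7, 0, 2]             # example coefficients a_0..a_3 (degree n = 3)
n = len(a) - 1
c = sum(abs(ai) for ai in a)  # c = 14 for this example

def poly(x):
    return sum(ai * x**i for i, ai in enumerate(a))

assert all(abs(poly(x)) <= c * x**n for x in range(1, 10000))
print(f"|5 - 7x + 2x^3| <= {c} * x^3 for all checked x >= 1")
```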

For comparing the asymptotic behavior of classes of real functions, the following notation comes in handy.

Definition 97 (Strictly Dominated) For every variable n and terms S and T, the phrase
S <_n T
is a proposition with bound variable n which is read as "S is strictly dominated by T" and which is equivalent to
S = O_n(T) /\ T != O_n(S).

We then have the following chain of function classes:

Proposition 112 (Asymptotic Classes) For every k in N with k >= 2, c in R with c > 1, and a in R with a > 1, we have:
1 <_n log_a(n) <_n n <_n n*log_a(n) <_n n^k <_n c^n <_n n!

The corresponding function graphs are visualized below.
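The chain can also be illustrated by tabulating the functions at a few sample points; the following Python sketch does this for the (arbitrarily chosen) parameters a = c = k = 2:

```python
import math

# Tabulate the asymptotic classes of Proposition 112 for a = c = k = 2.
funcs = [
    ("1",          lambda n: 1),
    ("log2(n)",    lambda n: math.log2(n)),
    ("n",          lambda n: n),
    ("n*log2(n)",  lambda n: n * math.log2(n)),
    ("n^2",        lambda n: n ** 2),
    ("2^n",        lambda n: 2 ** n),
    ("n!",         lambda n: math.factorial(n)),
]

for n in (2, 10, 20, 50):
    row = "  ".join(f"{name}={f(n):.3g}" for name, f in funcs)
    print(f"n={n}: {row}")

# From some point on, each class strictly exceeds its predecessor;
# at n = 50 the values already appear in strictly increasing order.
values = [f(50) for _, f in funcs]
assert values == sorted(values)
```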

To get some "feeling" for the growth rate of these functions, assume that we have a machine that performs 10^6 operations per second. We are given six algorithms whose number of execution steps in dependence on the problem size n is determined by one of the six functions denoted above. Then the maximum size of a problem that can be solved in a fixed amount of time is:

Execution Steps   1 second    1 minute      1 hour
log2(n)           2^(10^6)    2^(6*10^7)    2^(3.6*10^9)
n                 10^6        6*10^7        3.6*10^9
n*log2(n)         62746       2.8*10^6      1.3*10^8
n^2               1000        7746          60000
2^n               23          26            32
n!                9           11            12
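A table of this kind can be reproduced by a small search: for each complexity function, find the largest n whose step count still fits into the operation budget (10^6 operations per second times the allowed seconds). The exact entries depend on rounding conventions, so the results may deviate slightly from the table above. A Python sketch for two of the rows:

```python
import math

# Largest problem size n such that steps(n) <= budget, by linear search.
def max_size(steps, budget):
    n = 1
    while steps(n + 1) <= budget:
        n += 1
    return n

ops_per_sec = 10 ** 6                  # assumed machine speed
for label, f in [("n^2", lambda n: n * n),
                 ("n!",  math.factorial)]:
    sizes = [max_size(f, ops_per_sec * t) for t in (1, 60, 3600)]
    print(label, sizes)   # n^2: [1000, 7745, 60000]; n!: [9, 11, 12]
```

Note that the exact row n^2 / 1 minute comes out as 7745 here (7746^2 already exceeds 6*10^7); the table rounds to 7746.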

A sorting algorithm that takes, for some logarithmic base, O(n*log(n)) steps (such as Quicksort) is therefore much better than an n^2 algorithm (such as Insertion Sort); the difference in computation time may be insignificant for small inputs, but the larger the inputs become, the more the asymptotic complexity distinguishes the algorithms.

Also we can see that, for algorithms with exponential (or higher) complexity, only problems of extremely small size can be solved. Furthermore, it does not help much to allow, say, 1000 times more execution time (or to use a 1000 times faster machine, or to apply a parallel computer with 1000 processors); the manageable problem size does not increase significantly. Problems for which only exponential time algorithms exist are therefore called intractable (ungefügig). For instance, to find out whether a propositional formula with n variables is a tautology is such an intractable problem.
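The tautology problem shows where the exponential cost comes from: a naive decision procedure must examine all 2^n truth assignments to the n variables. A minimal Python sketch (the representation of a formula as a boolean function is our own choice for illustration):

```python
from itertools import product

# Naive tautology check: a formula with n variables is a tautology iff it
# evaluates to True under all 2^n truth assignments -- exponential in n.
def is_tautology(formula, n):
    return all(formula(*bits) for bits in product((False, True), repeat=n))

# p \/ ~p is a tautology (law of the excluded middle) ...
assert is_tautology(lambda p: p or not p, 1)
# ... while p \/ q is not (it fails for p = q = False).
assert not is_tautology(lambda p, q: p or q, 2)
```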

Author: Wolfgang Schreiner
Last Modification: October 4, 1999
