
`S` = O_{n}(`T`)

is a proposition with bound variable `n`.

Its meaning is equivalent to the proposition

**exists** `c` in **R**, `m` in **N**: **forall** `n` >= `m`: |`S`| <= `c`*|`T`|.

Usually the subscript `n` is dropped and the bound variable has to be
deduced from the context.

The proposition `S` =
O_{n}(`T`) states that, from a certain point `m` on
and scaled by some factor `c`, the absolute value of `T` is at
least as large as the absolute value of `S`. The function
[`T`]_{n} thus grows at least as fast as [`S`]_{n}.

For instance, instead of saying that the execution time `T` of some
algorithm `A` in dependence on input size `n` is
`T`_{A}(`n`) = 7`n`^{2}+3`n`+19, we may say

`T`_{A}(`n`) = O(`n`^{2}),

i.e., that the execution time is asymptotically dominated by the square function, ignoring all constant factors and minor terms.
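Such a claim can be spot-checked mechanically: the following Python sketch (the helper name and its search bounds are illustrative assumptions, not part of the text) searches for a witness pair (`c`, `m`) and verifies the inequality for finitely many `n`. Success is numeric evidence, not a proof.

```python
def find_witness(S, T, max_c=100, max_m=10, N=1000):
    """Search for c, m with |S(n)| <= c*|T(n)| for all n in [m, N].

    A bounded search: success is only evidence for n up to N, not a proof.
    """
    for c in range(1, max_c + 1):
        for m in range(max_m + 1):
            if all(abs(S(n)) <= c * abs(T(n)) for n in range(m, N + 1)):
                return c, m
    return None

# T_A(n) = 7n^2 + 3n + 19 is dominated by n^2:
print(find_witness(lambda n: 7*n*n + 3*n + 19, lambda n: n*n))  # → (8, 7)
```

The search returns the first pair found in its iteration order; any other pair satisfying the bound (such as `c` = 29, `m` = 1) would serve as a witness just as well.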

Sometimes, we also see propositions like

`S`_{0} = `S`_{1} + O(`T`)

which is to be understood as

`S`_{0} - `S`_{1} = O(`T`).

For instance, by

`f`(`n`) = 5`n`^{3} + 2`n`^{2} + O(`n`)

we state that `f`(`n`) - (5`n`^{3} + 2`n`^{2}) = O(`n`).

- We have 10`n`+100 = O(`n`): Let `c` := 110 and `m` := 1 and take
  arbitrary `n` >= `m`. We have to show

  |10`n`+100| <= `c`*|`n`|.

  We know

  |10`n`+100| = 10`n`+100 <= 110`n` = 110|`n`| = `c`|`n`|.

- We have *not* `n`^{2} = O(`n`): We suppose `n`^{2} = O(`n`) and show a
  contradiction. By the assumption, we have some `c` in **R** and `m` in
  **N** with

  **forall** `n` >= `m`: |`n`^{2}| <= `c`|`n`|.

  If `c` < 0, let `k` := max(1, `m`); then we have `k` >= `m` but

  |`k`^{2}| > 0 > `c`|`k`|

  and thus a contradiction. If `c` >= 0, let `k` := max(`m`, ceiling(`c`+2)).
  Then we have `k` >= `m` but

  |`k`^{2}| = `k`*`k` >= ceiling(`c`+2)*`k` >= (`c`+2)*`k` > `c`*`k` = `c`|`k`|

  and thus a contradiction.

- We have `n`^{2} = O(2^{n}): We assume *not* `n`^{2} = O(2^{n}) and show a
  contradiction. By assumption, we have

  ~**exists** `c` in **R**, `m` in **N**: **forall** `n` >= `m`: |`n`^{2}| <= `c`|2^{n}|,

  i.e.,

  **forall** `c` in **R**, `m` in **N**: ~**forall** `n` >= `m`: |`n`^{2}| <= `c`|2^{n}|.

  We prove

  **forall** `n` >= 0: `n`^{2} <= 2*2^{n}

  by induction on `n` and thus have a contradiction (take `c` := 2 and
  `m` := 0). If `n` = 0, we have `n`^{2} = 0 < 2 = 2*2^{0} = 2*2^{n}. We
  assume

  `n` >= 1 => `n`^{2} <= 2*2^{n}

  and show

  `n`+1 >= 1 => (`n`+1)^{2} <= 2*2^{n+1}.

  Assume `n`+1 >= 1. We show (`n`+1)^{2} <= 2*2^{n+1}. If `n` < 1, then
  `n` = 0 and (`n`+1)^{2} = (0+1)^{2} = 1 < 4 = 2*2^{0+1} = 2*2^{n+1}. If
  `n` >= 1, then, by the induction hypothesis, `n`^{2} <= 2*2^{n}. We then
  have

  (`n`+1)^{2} = `n`^{2} + 2`n` + 1 <= `n`^{2} + 2^{n} + 2^{n} (*) <= 2*2^{n} + 2^{n} + 2^{n} = 2^{n+1} + 2^{n+1} = 2*(2^{n+1}).

  (*) The fact 2`n` <= 2^{n} (together with 1 <= 2^{n}) has to be shown in
  a separate proof.
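The concrete inequalities used in the three proofs above can also be spot-checked numerically; the following sketch tests them over a finite range (evidence only — the proofs are what establish the claims for all `n`).

```python
# Example 1: c = 110 and m = 1 witness 10n + 100 = O(n).
assert all(abs(10*n + 100) <= 110 * abs(n) for n in range(1, 1001))

# Example 2: n^2 = O(n) fails; for any candidate c >= 0 the bound
# |n^2| <= c|n| is already violated at n = c + 1.
for c in (1, 10, 1000):
    assert (c + 1)**2 > c * (c + 1)

# Example 3: the induction claim n^2 <= 2*2^n, and the auxiliary
# fact 2n <= 2^n (n >= 1) used in step (*).
assert all(n**2 <= 2 * 2**n for n in range(0, 60))
assert all(2*n <= 2**n for n in range(1, 60))
print("all tested inequalities hold")
```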

It is easy to derive a number of laws for the manipulation of O terms.

- `S` -> O_{n}(`S`),
- `c` + O_{n}(`S`) -> O_{n}(`S`),
- `c` * O_{n}(`S`) -> O_{n}(`S`),
- O_{n}(`S`) + O_{n}(`S`) -> O_{n}(`S`),
- O_{n}(O_{n}(`S`)) -> O_{n}(`S`),
- O_{n}(`S`) + O_{n}(`T`) -> O_{n}(|`S`|+|`T`|),
- O_{n}(`S`) * O_{n}(`T`) -> O_{n}(`S`*`T`),
- `S` * O_{n}(`T`) -> O_{n}(`S`*`T`),
- O_{n}(`S`^{c}) -> O_{n}(`S`^{d}) (for `c` <= `d`).

Thus we can replace a statement

`f`(`n`) = 2`n`^{2} - 7`n` + 5

via the transformations

2`n`^{2} - 7`n` + 5 -> 2*O(`n`^{2}) - 7*O(`n`) + 5*O(1) -> O(`n`^{2}) + O(`n`) + O(1) -> O(`n`^{2}) + O(`n`^{2}) + O(`n`^{2}) -> O(`n`^{2})

by the statement

`f`(`n`) = O(`n`^{2}).

This example illustrates that a polynomial sequence is asymptotically dominated by its term of highest degree. In general, we have

**forall** `n` in **N**, `a`: **N**_{n} -> **R**: (**sum**_{0 <= `i` <= `n`} `a`_{i}*`x`^{i}) = O_{x}(`x`^{n}).
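A concrete witness for this proposition is `c` := |`a`_{0}| + ... + |`a`_{n}| and `m` := 1, since `x`^{i} <= `x`^{n} for `x` >= 1 and `i` <= `n`. A quick numeric check of this witness over a finite range (an illustration with arbitrarily chosen coefficient lists, not a proof):

```python
def poly(a, x):
    """Evaluate sum_{0 <= i <= n} a[i] * x**i, where n = len(a) - 1."""
    return sum(ai * x**i for i, ai in enumerate(a))

# Witness suggested by the triangle inequality: c = |a_0| + ... + |a_n|, m = 1.
for a in ([2, -7, 5], [1, 0, 0, -3], [19, 3, 7]):
    n = len(a) - 1
    c = sum(abs(ai) for ai in a)
    assert all(abs(poly(a, x)) <= c * x**n for x in range(1, 1001))
print("each polynomial is dominated by its highest-degree term")
```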

For comparing the asymptotic behavior of classes of real functions, the following notation comes in handy.

`S` <_{n} `T`

is a proposition with bound variable `n`; it is defined as

`S` = O_{n}(`T`) /\ `T` != O_{n}(`S`).

We then have the following chain of function classes:

1 <_{n} log_{a}(`n`) <_{n} `n` <_{n} `n`*log_{a}(`n`) <_{n} `n`^{k} <_{n} `c`^{n} <_{n} `n`!

(for `a` > 1, `k` >= 2, and `c` > 1).

The corresponding function graphs are visualized below.
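The strict ordering `S` <_{n} `T` implies that the quotient `T`/`S` grows without bound. The following sketch illustrates this for each adjacent pair in the chain, instantiated with `a` = 2, `k` = 2, `c` = 2 (a finite sample of `n`, so an illustration rather than a proof):

```python
import math

# Adjacent pairs (S, T) from the chain, with a = 2, k = 2, c = 2:
pairs = [
    (lambda n: 1.0,              lambda n: math.log2(n)),              # 1 < log(n)
    (lambda n: math.log2(n),     lambda n: float(n)),                  # log(n) < n
    (lambda n: float(n),         lambda n: n * math.log2(n)),          # n < n log(n)
    (lambda n: n * math.log2(n), lambda n: float(n**2)),               # n log(n) < n^2
    (lambda n: float(n**2),      lambda n: 2.0**n),                    # n^2 < 2^n
    (lambda n: 2.0**n,           lambda n: float(math.factorial(n))),  # 2^n < n!
]

for S, T in pairs:
    ratios = [T(n) / S(n) for n in (8, 16, 32, 64)]
    assert ratios == sorted(ratios)  # the quotient T/S keeps growing
print("each quotient T/S increases along n = 8, 16, 32, 64")
```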

To get some "feeling" for the growth rate of these functions, assume that we
have a machine that performs 10^{6} operations per second. We are given six
algorithms whose number of execution steps in dependence of the problem size
`n` is determined by one of the six functions denoted above. Then the
maximum size of a problem that can be solved in a fixed amount of time is

| Execution steps     | 1 second    | 1 minute    | 1 hour         |
|---------------------|-------------|-------------|----------------|
| log_{2}(`n`)        | 2^{10^{6}}  | 2^{6*10^{7}}| 2^{3.6*10^{9}} |
| `n`                 | 10^{6}      | 6*10^{7}    | 3.6*10^{9}     |
| `n`*log_{2}(`n`)    | 62746       | 2.8*10^{6}  | 1.3*10^{8}     |
| `n`^{2}             | 1000        | 7746        | 60000          |
| 2^{n}               | 23          | 26          | 32             |
| `n`!                | 9           | 11          | 12             |
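Entries of this kind can be recomputed by searching for the largest `n` whose step count fits into the time budget. A sketch (the helper is an illustration; note that a table entry may differ by one from the exact integer maximum due to rounding):

```python
import math

def max_size(steps, budget):
    """Largest n >= 1 with steps(n) <= budget (assumes steps(1) <= budget),
    found by doubling followed by binary search."""
    lo, hi = 1, 2
    while steps(hi) <= budget:   # double until the budget is exceeded
        hi *= 2
    while lo < hi:               # binary search for the last n within budget
        mid = (lo + hi + 1) // 2
        if steps(mid) <= budget:
            lo = mid
        else:
            hi = mid - 1
    return lo

second, minute, hour = 10**6, 6 * 10**7, 36 * 10**8
print([max_size(lambda n: n**2, b) for b in (second, minute, hour)])
# → [1000, 7745, 60000]   (the table rounds the middle entry to 7746)
print([max_size(math.factorial, b) for b in (second, minute, hour)])
# → [9, 11, 12]
```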

A sorting algorithm that takes, for some logarithmic base,
O(`n`*log(`n`)) steps (such as Quicksort)
is therefore much better than an `n`^{2} algorithm (such as Insertion
Sort); the difference in computation time may not be significant for small
inputs, but the larger the inputs become, the more the asymptotic complexity
distinguishes the algorithms.

We can also see that, for algorithms with exponential (or higher) complexity,
only problems of extremely small size can be solved. Furthermore, it does
not help much to allow, say, 1000 times more execution time (or to use a 1000
times faster machine or a parallel computer with 1000 processors); the
manageable problem size does not increase significantly. Problems for which
only exponential time algorithms exist are therefore called
*intractable (ungefügig)*. For instance, finding out whether a
propositional formula with `n` variables is a tautology is such an
intractable problem.
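To see where the exponential blow-up comes from, note that a brute-force tautology check must examine all 2^{n} truth assignments. A minimal sketch (the representation of formulas as Python predicates is an assumption for illustration):

```python
from itertools import product

def is_tautology(formula, nvars):
    """Decide whether a propositional formula (given as a predicate over a
    tuple of booleans) is true under all 2**nvars truth assignments."""
    return all(formula(bits) for bits in product((False, True), repeat=nvars))

# (p \/ ~p) is a tautology; (p /\ q) is not:
print(is_tautology(lambda v: v[0] or not v[0], 1))  # → True
print(is_tautology(lambda v: v[0] and v[1], 2))     # → False
```

The loop runs 2^{n} times, so already `n` = 23 variables exhaust the one-second budget of the table above; no known algorithm avoids this exponential behavior in the worst case.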

Author: Wolfgang Schreiner

Last Modification: October 4, 1999