Section 4.3 The Big “Oh” and Little “Oh” Notations

Let \(f:\posints\longrightarrow \reals\) and \(g:\posints\longrightarrow\reals\) be functions. We write \(f=O(g)\text{,}\) and say \(f\) is “Big Oh” of \(g\text{,}\) when there is a constant \(c>0\) and an integer \(n_0\) so that \(f(n)\le cg(n)\) whenever \(n>n_0\text{.}\) Although this notation has a long history, we can provide a quite modern justification. If \(f\) and \(g\) both describe the number of operations required by two algorithms given input size \(n\text{,}\) then the meaning of \(f=O(g)\) is that \(f\) is no harder than \(g\) when the problem size is large.
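A quick numeric check (not a proof) can make the definition concrete. Here we take a hypothetical pair \(f(n)=5n^2+3n+20\) and \(g(n)=n^2\), with witnesses \(c=6\) and \(n_0=7\) chosen for illustration, and verify the inequality over a range of values:

```python
# Numeric check (not a proof) that f = O(g) for a hypothetical pair:
# f(n) = 5n^2 + 3n + 20 and g(n) = n^2, with witnesses c = 6 and n0 = 7.

def f(n):
    return 5 * n * n + 3 * n + 20

def g(n):
    return n * n

c, n0 = 6, 7

# Verify f(n) <= c * g(n) for every n with n0 < n <= 10000.
ok = all(f(n) <= c * g(n) for n in range(n0 + 1, 10001))
print(ok)  # True: 5n^2 + 3n + 20 <= 6n^2 once n^2 >= 3n + 20
```

Of course, a full proof requires the algebra (here, \(n^2\ge 3n+20\) for all \(n>7\)); the check only confirms that the chosen witnesses are plausible.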

We are particularly interested in comparing functions against certain natural benchmarks, e.g., \(\log\log n\text{,}\) \(\log n\text{,}\) \(\sqrt{n}\text{,}\) \(n^\alpha\) where \(\alpha\lt 1\text{,}\) \(n\text{,}\) \(n^2\text{,}\) \(n^3\text{,}\) \(n^c\) where \(c>1\) is a constant, \(n^{\log n}\text{,}\) \(2^n\text{,}\) \(n!\text{,}\) \(2^{n^2}\text{,}\) etc.
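To get a feel for how far apart these benchmarks are, one can tabulate rough magnitudes at a moderate input size. The sketch below computes \(\log_{10}\) of each benchmark (so the later entries don't overflow) at \(n=100\); the names and the choice of \(n\) are just for illustration:

```python
# Rough magnitudes (base-10 logarithms) of the benchmark functions,
# showing how drastically the later entries outgrow the earlier ones.
import math

def log10_values(n):
    """Return (name, log10 of value) pairs for the benchmark scale at size n."""
    return [
        ("log log n", math.log10(math.log(math.log(n)))),
        ("log n",     math.log10(math.log(n))),
        ("sqrt(n)",   0.5 * math.log10(n)),
        ("n",         math.log10(n)),
        ("n^2",       2 * math.log10(n)),
        ("n^3",       3 * math.log10(n)),
        ("n^(log n)", math.log(n) * math.log10(n)),
        ("2^n",       n * math.log10(2)),
        ("n!",        math.lgamma(n + 1) / math.log(10)),   # lgamma(n+1) = ln(n!)
        ("2^(n^2)",   n * n * math.log10(2)),
    ]

for name, lg in log10_values(100):
    print(f"{name:>10}: about 10^{lg:.1f}")
```

At \(n=100\), for instance, \(2^n\) is already about \(10^{30}\) while \(2^{n^2}\) is about \(10^{3010}\), a gap no hardware improvement can bridge.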

For example, in Subsection 3.5.2 we learned that there are sorting algorithms with running time \(O(n\log n)\) where \(n\) is the number of integers to be sorted. As a second example, we will learn that we can find all shortest paths in an oriented graph on \(n\) vertices with non-negative weights on edges with an algorithm having running time \(O(n^2)\text{.}\) At the other extreme, no one knows whether there is a constant \(c\) and an algorithm for determining whether the chromatic number of a graph is at most three which has running time \(O(n^c)\text{.}\)
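The sorting claim can be watched in action with a comparison-counting merge sort, one standard \(O(n\log n)\) algorithm. The sketch below (a minimal implementation, not the one from Subsection 3.5.2) counts element comparisons and checks them against \(n\log_2 n\):

```python
# Count the comparisons made by merge sort and compare against n*log2(n).
import math
import random

def merge_sort(a, counter):
    """Sort list a, incrementing counter[0] once per element comparison."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], counter)
    right = merge_sort(a[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

n = 1024
data = random.sample(range(10 * n), n)
counter = [0]
assert merge_sort(data, counter) == sorted(data)
# Merge sort never exceeds n*log2(n) comparisons, witnessing O(n log n).
print(counter[0] <= n * math.log2(n))  # True
```

Note that the count stays below the bound for every input, not just this random one; that uniformity is what the Big Oh claim asserts.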

It is important to remember that when we write \(f=O(g)\text{,}\) we are implying in some sense that \(f\) is no bigger than \(g\text{,}\) but it may in fact be much smaller. By contrast, there will be times when we really know that one function dominates another, and we have a second kind of notation to capture this relationship.

Let \(f:\posints\longrightarrow \reals\) and \(g:\posints\longrightarrow\reals\) be functions with \(f(n)>0\) and \(g(n)>0\) for all \(n\text{.}\) We write \(f=o(g)\text{,}\) and say that \(f\) is “Little oh” of \(g\text{,}\) when \(\lim_{n\rightarrow\infty}f(n)/g(n)=0\text{.}\) For example \(\ln n=o(n^{0.2})\text{;}\) \(n^\alpha=o(n^{\beta})\) whenever \(0\lt \alpha\lt \beta\text{;}\) and \(n^{100}=o(c^n)\) for every \(c>1\text{.}\) In particular, we write \(f(n)=o(1)\) when \(\lim_{n\rightarrow\infty}f(n)=0\text{.}\)
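The first example can be checked numerically: the ratio \(\ln n / n^{0.2}\) eventually decreases toward \(0\), although (as the sample points below suggest) it does so quite slowly, which is why a limit rather than a finite table is needed:

```python
# Watch the ratio ln(n)/n^0.2 head toward 0, illustrating ln n = o(n^0.2).
import math

for k in range(3, 16, 3):            # n = 10^3, 10^6, ..., 10^15
    n = 10 ** k
    ratio = math.log(n) / n ** 0.2
    print(f"n = 10^{k:2d}: ln(n)/n^0.2 = {ratio:.4f}")
```

The ratio actually rises until about \(n=e^5\approx 148\) before declining, a reminder that Little oh statements say nothing about small inputs.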