# Master theorem

The master theorem provides asymptotic bounds on the computational complexity of divide-and-conquer recursive algorithms.

Letting $$T(n)$$ denote the total time for the algorithm to process an input of size $$n$$, and $$f(n)$$ the time spent at the top level of the recursion (dividing the problem and combining the subproblem results), the recurrence can be stated as $T(n) = a \cdot T \left(\frac{n}{b} \right) + f(n)$ where

• $$a \ge 1$$ is the number of subproblems in the recursion, and
• $$b > 1$$ is the factor by which the problem size is reduced per recursive step (see the numerical sketch below).
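
To make the roles of $$a$$, $$b$$, and $$f$$ concrete, here is a minimal sketch (not from the original notes) that evaluates the recurrence numerically; the parameters `A`, `B`, and the cost function `f` are illustrative choices, not fixed by the theorem.

```python
# A minimal sketch: numerically evaluating T(n) = a * T(n/b) + f(n)
# for illustrative parameter choices (A, B, and f are assumptions).
from functools import lru_cache

A, B = 2, 2          # two subproblems, each half the size
f = lambda n: n      # hypothetical top-level cost f(n) = n

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """Total work for input size n under T(n) = A*T(n//B) + f(n)."""
    if n <= 1:
        return 1     # constant work at the base case
    return A * T(n // B) + f(n)

# With a = b = 2 and f(n) = n, the master theorem predicts Theta(n log n):
for n in (2**10, 2**11, 2**12):
    print(n, T(n))   # grows roughly like n * log2(n)
```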

For example, traversing a balanced binary tree is modelled by $T(n) = 2 \cdot T\left(\frac{n}{2}\right) + O(1)$, while merge sort, which does linear work to merge the sorted halves, is modelled by $T(n) = 2 \cdot T\left(\frac{n}{2}\right) + O(n)$.
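
As a concrete instance, here is a sketch of merge sort in the standard top-down formulation (shown for illustration), with comments mapping each part onto the recurrence:

```python
# Merge sort, annotated with the terms of T(n) = 2*T(n/2) + O(n).
def merge_sort(xs: list) -> list:
    if len(xs) <= 1:
        return xs                  # base case: constant work
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])    # first of a = 2 subproblems of size n/b = n/2
    right = merge_sort(xs[mid:])   # second subproblem
    # f(n) = O(n): merging the two sorted halves takes linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```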

As a recursive algorithm may be modelled by a tree, whose height is $$\log_{b}n$$ and which has $$a^i$$ nodes at depth $$i$$, there are $$a^{\log_{b}n} = n^{\log_{b}a}$$ leaves. When the work done at the leaves dominates (each leaf taking constant time), this yields a runtime of $$\Theta(n^{\log_{b}a})$$.
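
The leaf-count identity can be sanity-checked numerically; the values of $$a$$, $$b$$, and $$n$$ below are hypothetical examples.

```python
# Numerical check that a**(log_b n) == n**(log_b a); a, b, n are
# hypothetical example values, not tied to any particular algorithm.
import math

a, b, n = 4, 2, 1024
height = math.log(n, b)            # height of the recursion tree, log_b n
leaves_1 = a ** height             # a^(log_b n)
leaves_2 = n ** math.log(a, b)     # n^(log_b a) = n^2 here
print(height, leaves_1, leaves_2)  # ~10.0 1048576.0 1048576.0
```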

Given this, the asymptotic bound on $$T(n)$$ may then be one of:

1. If $$f(n) = O(n^{\log_{b}a - \epsilon})$$ for some $$\epsilon > 0$$, then $$T(n) = \Theta(n^{\log_{b}a})$$; the binary tree traversal above falls into this case ($$f(n) = O(1)$$, $$\log_2 2 = 1$$), giving $$\Theta(n)$$.
2. If $$f(n) = \Theta(n^{\log_{b}a})$$, then $$T(n) = \Theta(n^{\log_{b}a}\log n)$$; merge sort falls into this case, giving $$\Theta(n \log n)$$.
3. If $$f(n) = \Omega(n^{\log_{b}a + \epsilon})$$ for some $$\epsilon > 0$$, and $$f$$ additionally satisfies the regularity condition $$a \cdot f(n/b) \le c \cdot f(n)$$ for some constant $$c < 1$$ and all sufficiently large $$n$$, then $$T(n) = \Theta(f(n))$$.
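
For driving terms of the form $$f(n) = \Theta(n^k)$$, the case analysis is mechanical. The sketch below is an illustration, not part of the theorem's statement; the function name and the exact float comparison are assumptions for the demo.

```python
# Classify the master theorem case for T(n) = a*T(n/b) + Theta(n**k).
# Illustrative sketch; exact float comparison is fine for these small cases.
import math

def master_theorem(a: float, b: float, k: float) -> str:
    """Return the asymptotic bound, assuming a >= 1 and b > 1."""
    crit = math.log(a, b)              # critical exponent log_b(a)
    if k < crit:                       # case 1: the leaves dominate
        return f"Theta(n^{crit:g})"
    if k == crit:                      # case 2: every level contributes equally
        return f"Theta(n^{crit:g} log n)"
    # case 3: the top level dominates; a polynomial f(n) = n^k with
    # k > log_b(a) always satisfies the regularity condition.
    return f"Theta(n^{k:g})"

print(master_theorem(2, 2, 0))  # binary tree traversal -> Theta(n^1)
print(master_theorem(2, 2, 1))  # merge sort            -> Theta(n^1 log n)
print(master_theorem(2, 2, 2))  # e.g. f(n) = n^2       -> Theta(n^2)
```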