By Stephanie Ellen, last updated March 13, 2012

Complexity theory studies how the running time of an algorithm grows with the size of its input, a question of central importance in computer science. A recursive program solves a problem by breaking it into smaller instances of the same problem, and its running time can be described by a recurrence relation: an equation in which the function T(..) appears on both sides of the equals sign. Solving the recurrence gives the time complexity of the recursive function.

- Count how many arithmetic operations the code performs at each stage. For example, the following code performs one multiplication and one subtraction at each stage:

  ```c
  int powerA(int x, int n) {
      if (n == 0)
          return 1;
      if (n == 1)
          return x;
      else
          return x * powerA(x, n - 1);
  }
  ```
- Count how many operations are required for the smallest value of n to obtain a base equation. In this example, when n = 1 the function returns x immediately without recursing, performing a constant amount of work counted as one operation. Therefore, the base equation is T(1) = 1.
- Count the number of stages in the code for an arbitrary n and write a general equation. In this sample code, each call reduces the problem size from n to n - 1, and you know from the first step that two arithmetic operations are performed at each stage. This gives the following general equation: T(n) = T(n - 1) + 2.
- Write down the recurrence relation, the combination of the base equation and the general equation. The following recurrence relation is true for the sample code:
  T(n) = T(n - 1) + 2
  T(1) = 1
- Solve the equations to express T(n) in terms of n. Expanding the recurrence step by step gives T(n) = T(n - 1) + 2 = T(n - 2) + 4 = ... = T(1) + 2(n - 1), and substituting T(1) = 1 yields T(n) = 2n - 1. This equation represents the time complexity for this particular recursive algorithm.
