Recurrence relation - when division strategy differs based on whether n is even or odd

The idea of divide and conquer is quite appealing: we divide the problem into subproblems that are smaller instances of the original problem, solve them recursively, and finally combine the solutions to the subproblems into the solution to the original problem.

When an algorithm contains a recursive call to itself, we can describe the running time by a recurrence equation.

Often, divide and conquer algorithms divide the problem into subproblems either by multiplying by a fraction (for example, $n \times \frac{1}{2}$, $n \times \frac{2}{3}$) or by subtracting a constant (for example, $n - 1$, $n - 2$). Here, we are going to look at an interesting case where an algorithm alters its strategy for breaking the problem into subproblems based on whether the size of the input is odd or even.

Pseudocode for this kind of algorithm may look like:

function f(n) {
  if (n == 0) return 1                  // base case
  if (n % 2 == 0) return f(n / 2) + 1   // n even: recurse on n / 2
  else return 2 * f(n - 1)              // n odd: recurse on n - 1
}

Here, we are not interested in the output of the above function but in its time complexity, and especially in the approach to solving the recurrence equation for algorithms that follow this kind of pattern.

The recurrence equation for the above function can be written as follows:

$$T(n) = \begin{cases} 1 & \text{when } n = 0 \\ T\left(\dfrac{n}{2}\right) + 1 & \text{when } n > 0 \text{ and } n \text{ is even} \\ T(n - 1) + 1 & \text{when } n \text{ is odd} \end{cases}$$

It means that whenever $n$ is greater than 0 and divisible by 2 (i.e., $n$ is even), the function calls itself recursively with half the value of $n$, and whenever $n$ is greater than 0 and not divisible by 2 (i.e., $n$ is odd), the function calls itself recursively with $n - 1$.
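To make the recurrence concrete, here is a minimal sketch in the same JavaScript style as the function above; countSteps is a hypothetical name, and it counts how many times the recurrence is unfolded rather than computing the output of f:

// Counts the number of times T(n) is unfolded, mirroring the recurrence above.
function countSteps(n) {
  if (n === 0) return 1                           // base case: T(0) = 1
  if (n % 2 === 0) return countSteps(n / 2) + 1   // even: T(n) = T(n/2) + 1
  return countSteps(n - 1) + 1                    // odd:  T(n) = T(n-1) + 1
}

// Example: countSteps(8) === 5, matching T(8) = T(4) + 1 = T(2) + 2 = T(1) + 3 = T(0) + 4 = 5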

$n$ can be any integer greater than or equal to zero.

  • When $n = 0$, the function performs a constant number of operations.
  • When $n$ is odd, the algorithm reduces $n$ by 1 and calls itself recursively with $n - 1$. Since $n$ is odd, $n - 1$ is guaranteed to be even.
  • When $n$ is even, the algorithm halves $n$ and calls itself recursively with $\frac{n}{2}$. Knowing only that $n$ is even, we cannot conclude whether $\frac{n}{2}$ is odd or even; it can be either, and we need more information about $n$ to decide. One convenient case in which we can decide is when $n$ is a power of 2.
  • When $n$ is a power of 2, i.e., $n = 2^k$, it is clearly even, and $\frac{n}{2}$ is also even for all $n > 2$:

$$n = 2^k \Rightarrow n \text{ is even for all } k > 0$$
$$\frac{n}{2} = \frac{2^k}{2} = 2^{k - 1} \Rightarrow \frac{n}{2} \text{ is even for all } k > 1$$

So, we can conclude that when $n$ is a power of 2 and greater than 2, i.e., $n = 2^k$ with $k > 1$, $\frac{n}{2}$ is even; otherwise, $\frac{n}{2}$ may be odd.
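A quick numerical check of this observation, using a few arbitrary sample values:

// n = 2^k with k > 1: n / 2 is still even
console.log([4, 8, 16, 32].map(n => (n / 2) % 2 === 0))   // [ true, true, true, true ]
// n even but not a power of 2: n / 2 may or may not be even
console.log([6, 10, 12].map(n => (n / 2) % 2 === 0))      // [ false, false, true ]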

So, there are four cases for $n$, illustrated by the small classifier sketch after the list:

  1. $n = 0$
  2. $n$ is even and a power of 2, i.e., $n = 2^k$ with $k > 1$
  3. $n$ is even but not a power of 2
  4. $n$ is odd
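For reference, here is a small helper sketch; the name classify and the bit trick it uses are not part of the original algorithm, only an illustration of the four cases:

// Reports which of the four cases n belongs to.
function classify(n) {
  if (n === 0) return 'case 1: n = 0'
  if (n % 2 !== 0) return 'case 4: n is odd'
  const isPowerOfTwo = (n & (n - 1)) === 0   // works for n > 0
  return isPowerOfTwo ? 'case 2: even and a power of 2' : 'case 3: even, not a power of 2'
}

// classify(0)  -> 'case 1: n = 0'
// classify(16) -> 'case 2: even and a power of 2'
// classify(12) -> 'case 3: even, not a power of 2'
// classify(7)  -> 'case 4: n is odd'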

Case 1 ($n = 0$) is simply the base case and costs a constant amount of work, so we focus on the remaining cases.

Case 2: $n$ is even and a power of 2

Let $n = 2^k$ where $k > 1$.

We have:

$$T(n) = T(2^k) = T\left(\dfrac{2^k}{2}\right) + 1 = T(2^{k - 1}) + 1$$

By applying this recursively:

$$T(2^{k - 1}) = T(2^{k - 2}) + 1$$

Continuing this way until we reach $T(2^0)$:

$$T(2^k) = T(2^{k - 1}) + 1 = T(2^{k - 2}) + 2 = \dots = T(2^0) + k = T(1) + k$$

Since 1 is odd, the recurrence gives $T(1) = T(0) + 1$, and the base case gives $T(0) = 1$, so:

$$T(1) = 1 + 1 = 2$$

Therefore:

$$T(n) = T(2^k) = 2 + k$$

Since $k = \log_2 n$, we have:

$$T(n) = 2 + \log_2 n$$

Thus, the time complexity for this case is $O(\log n)$.
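Using the same counting convention as above ($T(0) = 1$), the closed form can be checked numerically; this sketch reuses the hypothetical countSteps counter:

// countSteps(n): number of unfoldings of T(n), as sketched earlier
const countSteps = n => n === 0 ? 1 : n % 2 === 0 ? countSteps(n / 2) + 1 : countSteps(n - 1) + 1

// For n = 2^k with k >= 1, the count matches the closed form k + 2
for (let k = 1; k <= 10; k++) {
  const n = 2 ** k
  console.log(n, countSteps(n) === k + 2)   // prints true for every k
}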

Cases 3 and 4: $n$ is even but not a power of 2, or $n$ is odd

When $n$ is odd:

$$T(n) = T(n - 1) + 1$$

Since $n - 1$ is even, we can then apply:

$$T(n - 1) = T\left(\dfrac{n - 1}{2}\right) + 1$$

Therefore:

$$T(n) = T\left(\dfrac{n - 1}{2}\right) + 1 + 1 = T\left(\dfrac{n - 1}{2}\right) + 2$$
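For a concrete example, take $n = 11$ (odd) and unfold the recurrence step by step, using $T(0) = 1$:

$$T(11) = T(10) + 1 = T(5) + 2 = T(4) + 3 = T(2) + 4 = T(1) + 5 = T(0) + 6 = 7$$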

When $n$ is even but not a power of 2:

We repeatedly apply the even step of the recurrence:

$$T(n) = T\left(\dfrac{n}{2}\right) + 1$$

However, $\frac{n}{2}$ may be odd or even, so we apply whichever step matches the current parity, reducing $n$ until it reaches either 1 or 0.
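To see how $n$ shrinks in this mixed case, here is a small iterative sketch; trace is a hypothetical helper that records the values the recursion visits:

// Records the values n takes on the way down to 0.
function trace(n) {
  const values = [n]
  while (n > 0) {
    n = n % 2 === 0 ? n / 2 : n - 1   // same reduction steps as the recurrence
    values.push(n)
  }
  return values
}

// trace(12) -> [ 12, 6, 3, 2, 1, 0 ]      (even but not a power of 2)
// trace(11) -> [ 11, 10, 5, 4, 2, 1, 0 ]  (odd)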

Generalization Using Binary Representation

An effective way to analyze the recurrence is by considering the binary representation of $n$.

  • Each time we halve $n$ (when $n$ is even), it corresponds to a right shift of its binary representation.
  • Each time we subtract 1 (when $n$ is odd), it flips the least significant bit from 1 to 0.

We can model the total time $T(n)$ as the sum of:

  • The number of times we halve $n$: at most the number of bits in $n$.
  • The number of times we subtract 1: equal to the number of ones in the binary representation of $n$.

Let:

  • $\text{BitLength}(n) = \lfloor \log_2 n \rfloor + 1$
  • $\text{HammingWeight}(n) =$ the number of ones in the binary representation of $n$

Then, the total time complexity is:

$$T(n) = c_1 \times \text{BitLength}(n) + c_2 \times \text{HammingWeight}(n)$$

for some constants $c_1$ and $c_2$.

Since both $\text{BitLength}(n)$ and $\text{HammingWeight}(n)$ are $O(\log n)$, we have:

$$T(n) = O(\log n)$$
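This relationship can be checked numerically; the sketch below assumes the hypothetical countSteps counter from earlier and defines bitLength and hammingWeight helpers purely for illustration:

// Number of unfoldings of T(n), as sketched earlier
const countSteps = n => n === 0 ? 1 : n % 2 === 0 ? countSteps(n / 2) + 1 : countSteps(n - 1) + 1

// BitLength(n) = floor(log2 n) + 1 and HammingWeight(n) = number of 1 bits
const bitLength = n => n.toString(2).length
const hammingWeight = n => n.toString(2).split('').filter(b => b === '1').length

// With this way of counting, the step count matches BitLength(n) + HammingWeight(n)
for (let n = 1; n <= 64; n++) {
  console.log(n, countSteps(n) === bitLength(n) + hammingWeight(n))   // prints true for every n
}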

Conclusion

The time complexity of the algorithm is $O(\log n)$.

Key Takeaways:

  • The recurrence alters its strategy based on whether $n$ is even or odd.
  • By analyzing the binary representation of $n$, we can determine the time complexity.
  • The total cost depends on both the length of $n$ in binary and the number of ones in its binary representation.
  • The algorithm runs in time logarithmic in $n$.

Final Result:

$$T(n) = O(\log n)$$