Time Complexity to Calculate GCD

Why is the time complexity of computing gcd(a, b) O(log(max(a, b)))? Can anyone elaborate on this for me?

You might want to look at the following analysis.

The computational efficiency of Euclid’s algorithm has been studied thoroughly. This efficiency can be described by the number of division steps the algorithm requires, multiplied by the computational expense of each step. The first known analysis of Euclid’s algorithm is due to A. A. L. Reynaud in 1811, who showed that the number of division steps on input (u, v) is bounded by v; later he improved this to v/2 + 2. Later, in 1841, P. J. E. Finck showed that the number of division steps is at most 2 log₂ v + 1, and hence Euclid’s algorithm runs in time polynomial in the size of the input.
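
To make "number of division steps" concrete, here is a minimal Python sketch (my own illustration, not from the quoted source; gcd_with_steps is just an illustrative name) that runs the iterative Euclidean algorithm, counts the division steps, and compares the count against the 2 log₂ v + 1 bound mentioned above, taking v to be the smaller input:

    import math

    def gcd_with_steps(a, b):
        """Iterative Euclidean algorithm; returns (gcd, number of division steps)."""
        steps = 0
        while b:
            a, b = b, a % b   # one division step
            steps += 1
        return a, steps

    # Compare the observed step count with the 2*log2(v) + 1 bound,
    # taking v to be the smaller of the two inputs.
    for a, b in [(1071, 462), (144, 89), (10**9 + 7, 10**6 + 3)]:
        g, steps = gcd_with_steps(a, b)
        bound = 2 * math.log2(min(a, b)) + 1
        print(f"gcd({a}, {b}) = {g}: {steps} steps, bound ~ {bound:.1f}")

Consecutive Fibonacci numbers such as (144, 89) are the worst case for Euclid’s algorithm, so that pair comes closest to the bound (10 steps against a bound of about 14).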

At every step a, b = b, a % b (assume a >= b; if not, the very first step just swaps them), there are two cases:

  • b >= a / 2: then a % b = a - b <= a / 2, so the new second value is at most half of the old a
  • b < a / 2: then a % b < b < a / 2, so both new values are less than half of the old a

So in either case the remainder a % b is at most a / 2: every step pushes the second number down to at most half of the old a, which means each number is at least halved every two steps. The algorithm therefore reaches the base case b = 0 after at most O(log a) + O(log b) steps, which yields an O(log n) algorithm, where n = max(a, b).
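
As a quick sanity check of this halving argument, here is a small Python sketch (again my own illustration; trace_gcd is a made-up name) that traces the pairs and asserts that the remainder a % b is always less than half of a:

    def trace_gcd(a, b):
        """Trace (a, b) -> (b, a % b) and check that a % b < a / 2."""
        if a < b:
            a, b = b, a   # if a < b, the first Euclid step would just swap them anyway
        step = 0
        while b:
            r = a % b
            # Case b > a/2: r = a - b < a/2.  Case b <= a/2: r < b <= a/2.
            assert r < a / 2
            print(f"step {step}: ({a}, {b}) -> ({b}, {r})")
            a, b = b, r
            step += 1
        print(f"gcd = {a} after {step} steps")

    trace_gcd(1071, 462)

Running it on (1071, 462) shows the second element dropping below half of the previous first element at every step, e.g. step 0: (1071, 462) -> (462, 147).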