 # binary GCD

(algorithm)

Definition: Compute the greatest common divisor of two integers, u and v, expressed in binary. The run time complexity is O((log₂(uv))²) bit operations.

Note: Discovered by J. Stein in 1967. Another source says it was discovered by R. Silver and J. Tersian in 1962 and published by G. Stein in 1967.

The algorithm uses the following observations.

• If u and v are both even, gcd(u,v) = 2 gcd(u/2, v/2).
• If u is even and v is odd, gcd(u,v) = gcd(u/2, v).
• Otherwise both are odd, and gcd(u,v) = gcd(|u-v|/2, v). (This is Euclid's subtraction step with an extra division by 2, which is valid because the difference of two odd numbers is even, and v is odd, so 2 is not a common factor.)
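The three observations above can be sketched as a recursive function. This is an illustrative minimal version (the name `gcd` and the base cases for zero arguments are assumptions, not part of the entry):

```python
def gcd(u, v):
    """Binary GCD via the three observations (recursive sketch)."""
    if u == 0:
        return v
    if v == 0:
        return u
    if u % 2 == 0 and v % 2 == 0:        # both even
        return 2 * gcd(u // 2, v // 2)
    if u % 2 == 0:                        # u even, v odd
        return gcd(u // 2, v)
    if v % 2 == 0:                        # u odd, v even
        return gcd(u, v // 2)
    return gcd(abs(u - v) // 2, min(u, v))  # both odd
```

For example, gcd(48, 18) reduces as 2·gcd(24, 9), then repeatedly halves 24 down to 3, and finishes with the odd-odd step, giving 6.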

Here is the algorithm. It is especially efficient on binary representations, since halving, doubling, and parity tests are single shift or bit operations.

1. g = 1
2. while u is even and v is even
   • u = u/2 (right shift)
   • v = v/2
   • g = 2*g (left shift)
   (now u or v, or both, are odd)
3. while u > 0
   • if u is even, u = u/2
   • else if v is even, v = v/2
   • else
      • t = |u-v|/2
      • if u < v, then v = t, else u = t
4. return v*g
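A direct transcription of the numbered steps, as a sketch in Python (the function name `binary_gcd` is an assumption; the entry gives only pseudocode):

```python
def binary_gcd(u, v):
    g = 1
    # Step 2: factor out powers of 2 common to u and v.
    while u % 2 == 0 and v % 2 == 0:
        u //= 2            # right shift
        v //= 2
        g *= 2             # left shift
    # Now u or v (or both) are odd.
    # Step 3: strip remaining factors of 2 and apply the odd-odd reduction.
    while u > 0:
        if u % 2 == 0:
            u //= 2
        elif v % 2 == 0:
            v //= 2
        else:
            t = abs(u - v) // 2
            if u < v:
                v = t
            else:
                u = t
    # Step 4: v holds the odd part of the gcd; g restores the power of 2.
    return v * g
```

When u and v become equal (both odd), t is 0 and the step-3 loop terminates with v holding the odd part of the gcd.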

Knuth calls this Algorithm B in Section 4.5.2 [Knuth97, 2:338]. Torsten Eichstädt suggested incrementing g in step 2.3 (g++) and returning v left-shifted g times in step 4 (return v<<g) (4 April 2006). Alex Peyser pointed out that, in that variant, g must start at 0 (10 July 2008).
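The variant suggested above, with g as a shift count rather than a multiplier, might look like this sketch (again the function name is illustrative):

```python
def binary_gcd_shift(u, v):
    g = 0                  # shift count; must start at 0, not 1
    while u % 2 == 0 and v % 2 == 0:
        u >>= 1
        v >>= 1
        g += 1             # g++ counts each common factor of 2
    while u > 0:
        if u % 2 == 0:
            u >>= 1
        elif v % 2 == 0:
            v >>= 1
        else:
            t = abs(u - v) >> 1
            if u < v:
                v = t
            else:
                u = t
    return v << g          # v left-shifted g times, i.e. v * 2**g
```

Counting shifts avoids the multiplications of the original formulation; in a language with fixed-width integers it also sidesteps overflow in the intermediate g.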

Author: PEB