The conventional way of multiplying two n-bit integers requires O(n^2) time. There are much better ways of multiplying long integers, based on a divide-and-conquer approach. For example, we can break each n-bit integer into n/2-bit halves and produce the result by making 3 recursive calls to n/2-bit multiplications (which are themselves done in this clever way) and several additions of n/2-bit integers. Since addition of n-bit numbers requires only O(n) time, you get a recurrence for the running time of T(n) = 3T(n/2) + cn for some constant c (which, asymptotically, doesn't matter). This algorithm is sometimes referred to as the "Karatsuba-Ofman (KO) algorithm."

Hypothetically, we can beat the KO approach if we can break our two n-bit numbers into y pieces of length n/y each, and perform the multiplication of n-bit numbers using x recursive multiplications of n/y-bit numbers plus a constant number of additions of n/y-bit numbers. What is the relationship between x and y that would make this hypothetical algorithm have a lower running time than the KO algorithm? Identify one such pair. Note: because the exact calculation requires computing logarithms, you may wish to use a scientific calculator.
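For concreteness, here is a minimal Python sketch of the three-multiplication recursion described above; the base-case cutoff of 16 and the helper name karatsuba are illustrative choices, not part of the original problem.

    def karatsuba(a: int, b: int) -> int:
        """Multiply non-negative integers with 3 recursive half-size
        multiplications instead of 4 (the Karatsuba-Ofman idea)."""
        if a < 16 or b < 16:                 # base case: builtin multiply
            return a * b
        half = max(a.bit_length(), b.bit_length()) // 2
        a_hi, a_lo = a >> half, a & ((1 << half) - 1)   # split a into halves
        b_hi, b_lo = b >> half, b & ((1 << half) - 1)   # split b into halves
        p1 = karatsuba(a_hi, b_hi)                      # high parts
        p2 = karatsuba(a_lo, b_lo)                      # low parts
        p3 = karatsuba(a_hi + a_lo, b_hi + b_lo)        # cross terms, one call
        # a*b = p1*2^(2*half) + (p3 - p1 - p2)*2^half + p2
        return (p1 << (2 * half)) + ((p3 - p1 - p2) << half) + p2

    # sanity check against Python's builtin big-integer multiply
    assert karatsuba(123456789, 987654321) == 123456789 * 987654321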

asked by FlashOver

1 Answer


Answer:

x = 26 and y = 8 (more generally, any pair with log_y(x) < log_2(3); note that x = 28, y = 8 does not qualify, as shown below)

Explanation:

In general, the recurrence for the proposed algorithm in terms of x and y is:

T(n) = x·T(n/y) + cn

The objective is to find a combination of x and y for which this results in lower asymptotic complexity than KO's.

KO's complexity:

T(n) = 3T(n/2) + cn

Using the master theorem with a = 3, b = 2, f(n) = n (so f(n) = n^c with c = 1):

log_b(a) = log_2(3) ≈ 1.585 > 1 = c

hence, by the first case of the master theorem, KO runs in Θ(n^(log_2 3)) ≈ Θ(n^1.585).

The same reasoning applied to the proposed recurrence (with a = x, b = y) gives a running time of Θ(n^(log_y x)) whenever log_y(x) > 1. So the hypothetical algorithm beats KO exactly when

log_y(x) < log_2(3), i.e., x < y^(log_2 3) = 3^(log_2 y).

For y = 8 the bound is 3^(log_2 8) = 3^3 = 27, so any x < 27 works. In particular, x = 26 and y = 8 gives log_8(26) ≈ 1.567 < 1.585, which beats KO. (Note that x = 28, y = 8 does not qualify: log_8(28) ≈ 1.602 > 1.585, which is asymptotically slower than KO.)
answered by TheProofIsTrivium