2) An algorithm that takes an array with n rows and m columns as input has a run time of O(n lg m). The algorithm takes 173 ms to run on an input array with 1000 rows and 512 columns. How long will the algorithm take to run on an input array with 1500 rows and 4096 columns? (Note: for ease of calculation, please use a base of 2 for the logarithm.)

1 Answer


Answer:

The algorithm takes 346 ms to run on an input array with 1500 rows and 4096 columns.

Step-by-step explanation:

For an input array with 1000 rows and 512 columns, the algorithm takes 173 ms.

We want to find out how long the algorithm will take to run on an input array with 1500 rows and 4096 columns.

Let the algorithm take x ms to run on an input array with 1500 rows and 4096 columns.

For an input of n rows and m columns, the run time is proportional to n · log₂(m), i.e. T = c · n · log₂(m) for some constant c.

So,

c · 1000 · log₂(512) = 173
c · 1000 · log₂(2⁹) = 173        (since 2⁹ = 512)
c · 1000 · 9 = 173               … eq. 1

and

c · 1500 · log₂(4096) = x
c · 1500 · log₂(2¹²) = x         (since 2¹² = 4096)
c · 1500 · 12 = x                … eq. 2

Now divide eq. 2 by eq. 1 to find the value of x; the constant c cancels:

(1500 · 12) / (1000 · 9) = x / 173
18000 / 9000 = x / 173
2 = x / 173
x = 2 · 173
x = 346 ms

Therefore, the algorithm takes 346 ms to run on an input array with 1500 rows and 4096 columns.
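The same arithmetic can be checked with a short Python sketch (the variable names here are illustrative, not from the question):

```python
import math

# Run time model: T = c * n * log2(m), where c is an unknown constant.
# Known data point: n = 1000, m = 512 takes 173 ms.
c = 173 / (1000 * math.log2(512))  # solve for the constant from eq. 1

# Predict the run time for n = 1500, m = 4096.
t = c * 1500 * math.log2(4096)
print(round(t))  # 346
```

Because the constant c cancels in the ratio, only the relative growth of n · log₂(m) matters, which is why the answer doubles.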
