Suppose algorithm A takes 10 seconds to handle a data set of 1000 records. Suppose algorithm A is of complexity O(n²). Answer the following: (i) Approximately how long will it take to handle a data set of 1500 records? Why? (ii) How long will it take to handle a data set of 5000 records? Can you come up with a reason why this will not be entirely accurate but will just be an approximation?


1 Answer


Answer:

(i) The estimated time for 1,500 records ≈ 22.5 seconds.

(ii) The estimated time for 5,000 records ≈ 250 seconds. Both figures are approximations, since Big-O notation hides constant factors and lower-order terms.

Step-by-step explanation:

A is an O(n²) algorithm.

An algorithm with O(n²) complexity is described as quadratic: its running time grows roughly in proportion to the square of the input size, so multiplying the input size by a factor k multiplies the running time by roughly k².

From the question, it is given that for 1,000 records the time required is 10 seconds.

Since the time taken is O(n²), we treat the running time as proportional to n² and solve the proportion time/n² = constant.

Hence,

1) For 1,500 records

=> 10/1000² = x/1500²

=> x = 10 × (1500/1000)² = 10 × 2.25

x = 22.5 seconds

Thus, the estimated time for 1,500 records ≈ 22.5 seconds.

2) For 5,000 records

=> 10/1000² = x/5000²

=> x = 10 × (5000/1000)² = 10 × 25

x = 250 seconds

Thus, the estimated time for 5,000 records ≈ 250 seconds.

Why this is only an approximation: Big-O gives the asymptotic growth rate, not an exact formula. The true running time might be an² + bn + c for some constants a, b, c; the lower-order terms and constant factors, along with implementation and hardware effects such as caching, mean the quadratic proportion is only an estimate.
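Here is a minimal Python sketch of the estimate above (the helper name estimate_time and the assumption that running time is exactly proportional to n² are illustrative, not from the question):

```python
def estimate_time(base_n, base_seconds, new_n, exponent=2):
    """Estimate running time under a T(n) = c * n**exponent growth model.

    Solves base_seconds = c * base_n**exponent for the constant c,
    then evaluates the model at new_n.
    """
    c = base_seconds / base_n ** exponent
    return c * new_n ** exponent

# 1,000 records take 10 seconds under an assumed O(n^2) model:
print(estimate_time(1000, 10, 1500))  # 22.5
print(estimate_time(1000, 10, 5000))  # 250.0
```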
