An algorithm takes 0.5 seconds to run on an input of size 100. How long will it take to run on an input of size 1000 if the algorithm has a running time that is linear? quadratic? log-linear? cubic?


1 Answer


Answer:

linear: 5 s

quadratic: 50 s

log-linear: 7.5 s

cubic: 500 s

Explanation:

Let t_1 and t_2 be the running times associated with inputs of sizes s_1 and s_2. If the running time grows proportionally to a function f(s), then t_2 = t_1 \cdot f(s_2)/f(s_1). Here t_1 = 0.5 s, s_1 = 100, and s_2 = 1000.

If the running time is linear:

t_2 = t_1 \cdot \frac{s_2}{s_1} = 0.5 \cdot \frac{1000}{100} = 0.5 \cdot 10 = 5\ \text{s}

If the running time is quadratic:

t_2 = t_1 \cdot \left(\frac{s_2}{s_1}\right)^2 = 0.5 \cdot \left(\frac{1000}{100}\right)^2 = 0.5 \cdot 10^2 = 50\ \text{s}

If the running time is log-linear (proportional to s \log s):

t_2 = t_1 \cdot \frac{s_2 \log s_2}{s_1 \log s_1} = 0.5 \cdot \frac{1000 \cdot 3}{100 \cdot 2} = 0.5 \cdot 15 = 7.5\ \text{s}

(Base-10 logarithms are used here, but the base cancels in the ratio, so any base gives the same result.)

If the running time is cubic:

t_2 = t_1 \cdot \left(\frac{s_2}{s_1}\right)^3 = 0.5 \cdot \left(\frac{1000}{100}\right)^3 = 0.5 \cdot 10^3 = 500\ \text{s}
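As a quick sanity check, here is a minimal Python sketch (the helper name and the growth functions are just illustrations of the scaling rule above, using the sizes from the question) that reproduces the four results:

```python
import math

def scaled_time(t1, s1, s2, growth):
    """Scale a measured running time t1 from input size s1 to size s2,
    assuming the running time is proportional to growth(s)."""
    return t1 * growth(s2) / growth(s1)

t1, s1, s2 = 0.5, 100, 1000

growths = {
    "linear":     lambda s: s,
    "quadratic":  lambda s: s ** 2,
    "log-linear": lambda s: s * math.log(s),  # log base cancels in the ratio
    "cubic":      lambda s: s ** 3,
}

for name, f in growths.items():
    print(f"{name:11s}: {scaled_time(t1, s1, s2, f):g} s")
# linear     : 5 s
# quadratic  : 50 s
# log-linear : 7.5 s
# cubic      : 500 s
```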
