132k views
1 vote
An algorithm takes 5 seconds for an input of size 200. How long will it take for an input of size 1000 if the running time is linear? (Assume low-order terms are negligible.)

1 Answer

3 votes
It would take 25 seconds. A linear running time means T(n) = c·n, so the time scales in proportion to the input size: 5 s × (1000 / 200) = 25 s.
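The proportional scaling can be sketched in a few lines (the helper name is my own, not from the question):

```python
def scaled_time(t1: float, n1: int, n2: int) -> float:
    """Estimated running time at input size n2, given t1 seconds at size n1,
    assuming T(n) = c * n with negligible low-order terms."""
    # Linear time scales with the ratio of input sizes.
    return t1 * (n2 / n1)

print(scaled_time(5, 200, 1000))  # → 25.0
```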
by Kohloth (3.9k points)