105k views
5 votes
Suppose that on a particular computer, it takes the merge sort algorithm a total of 60 seconds to sort an array with 60,000 values. Approximately how long will it take the algorithm to sort an array with 120,000 values? Round to the nearest second.

by User TMB (8.2k points)

1 Answer

5 votes
About 128 seconds. Merge sort runs in O(n log n) time, so doubling the input from 60,000 to 120,000 values slightly more than doubles the running time: 60 × (120,000 × log 120,000) / (60,000 × log 60,000) ≈ 60 × 2.13 ≈ 128 seconds. Simply dividing 120,000 by 60,000 to get 2 and answering 60 × 2 = 120 would only be correct for a linear-time algorithm.
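For a quick sanity check, here is a short Python sketch of that scaling. The function name estimate_merge_sort_time and the n·log2(n) growth model used for extrapolation are illustrative assumptions, not something given in the question itself.

import math

def estimate_merge_sort_time(known_n, known_seconds, target_n):
    # Scale a measured merge sort time to a new input size
    # using the n * log2(n) growth model (an assumption for this estimate).
    ratio = (target_n * math.log2(target_n)) / (known_n * math.log2(known_n))
    return known_seconds * ratio

# 60 seconds for 60,000 values -> estimate for 120,000 values
print(round(estimate_merge_sort_time(60_000, 60, 120_000)))  # prints 128

Running it prints 128, matching the hand calculation above.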
by User QTom (8.7k points)