A computer takes 3x^2 + 2 milliseconds to process a certain program. If the program has 4 lines of static code (this will always be required for the code to run) and x variable lines, what is the average amount of time it takes to process each line?

1 Answer

The average time taken to process each line is:

average time = total time / number of lines

For this program:
total time = 3x^2 + 2 milliseconds
number of lines = 4 + x (4 static lines plus x variable lines)

Therefore:
average time per line = (3x^2 + 2) / (x + 4) milliseconds
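As a quick numerical check, here is a minimal Python sketch of the same formula (the function name and the sample value x = 6 are illustrative, not part of the question):

```python
def average_time_per_line(x):
    """Average processing time per line, in milliseconds."""
    total_time_ms = 3 * x**2 + 2   # total processing time: 3x^2 + 2 ms
    number_of_lines = 4 + x        # 4 static lines plus x variable lines
    return total_time_ms / number_of_lines

# Example (hypothetical): with x = 6 variable lines,
# total time = 3*36 + 2 = 110 ms over 10 lines, so 11 ms per line.
print(average_time_per_line(6))  # 11.0
```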