Parallel processing uses two or more computers working together to solve a single problem. Using parallel processing, two computers can solve a problem in 20 minutes working together. Working alone, one computer can solve the problem in 9 minutes less than the time needed by the second computer. How long will it take the faster computer, working alone, to solve the problem?

by Bezejmeny (6.4k points)

1 Answer


Let x = the time, in minutes, the slower computer needs to do the task on its own. Then x - 9 = the time, in minutes, the faster computer needs on its own.

We can write:


1/(x - 9) + 1/x = 1/20

Note: we added the two computers' rates. Rate is the reciprocal of time, so a computer that finishes in x minutes works at a rate of 1/x of the job per minute, and together the rates must equal the combined rate of 1/20 of the job per minute.

Let's solve this equation:


1/(x - 9) + 1/x = 1/20

(x + (x - 9)) / (x(x - 9)) = 1/20

(2x - 9) / (x^2 - 9x) = 1/20

x^2 - 9x = 20(2x - 9)

x^2 - 9x = 40x - 180

x^2 - 49x + 180 = 0

(x - 45)(x - 4) = 0

x = 4 or x = 45
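As a quick check of the factoring step above, here is a short sketch that solves x^2 - 49x + 180 = 0 directly with the quadratic formula (the variable names are just for illustration):

```python
import math

# Coefficients of x^2 - 49x + 180 = 0
a, b, c = 1, -49, 180

# Quadratic formula: x = (-b +/- sqrt(b^2 - 4ac)) / (2a)
disc = b * b - 4 * a * c          # discriminant = 1681
roots = sorted([(-b - math.sqrt(disc)) / (2 * a),
                (-b + math.sqrt(disc)) / (2 * a)])
print(roots)  # [4.0, 45.0]
```

Both roots agree with the factored form (x - 45)(x - 4) = 0.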

Recall,

x is the time of the slower computer, and

x - 9 is the time of the faster computer.

If we take x = 4, then x - 9 = -5, a negative time, so that solution is not possible. We take x = 45 as our answer.

So,

faster computer time = x - 9 = 45 - 9 = 36 minutes
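The answer can be verified by adding the two rates exactly with rational arithmetic (a minimal sketch; exact fractions avoid any floating-point rounding):

```python
from fractions import Fraction

# Faster computer: 36 min; slower computer: 45 min.
# Their combined rate should be 1/20 of the job per minute.
combined_rate = Fraction(1, 36) + Fraction(1, 45)
print(combined_rate)  # 1/20
```

Since 1/36 + 1/45 = 5/180 + 4/180 = 9/180 = 1/20, the two computers together do finish in 20 minutes.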

by PaFi (5.7k points)