7 votes
Parallel processing uses two or more computers working together to solve a single problem. Using parallel processing, two computers can solve a problem in 20 minutes. Working alone, one computer can solve the problem in 9 minutes less than the time needed by the second computer. How long will it take the faster computer, working alone, to solve the problem?

asked by Francesco Casula (2.7k points)

1 Answer

14 votes

Let x = the amount of time, in minutes, the slower computer needs to do the task on its own. Then the faster computer needs x − 9 minutes.

We can write:


\frac{1}{x-9}+\frac{1}{x}=\frac{1}{20}

Note: we added the rates of the two computers and set the sum equal to the combined rate. Rate and time are reciprocals of each other, so a computer that finishes the job in x minutes works at a rate of 1/x of the job per minute.
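As a quick illustration of that reciprocal relationship (my own addition, with made-up numbers, not part of the problem), here is a minimal Python sketch that combines two solo times into a joint time:

```python
def combined_time(t1: float, t2: float) -> float:
    """Time for two workers to finish one job together.

    Each worker's rate is the reciprocal of its solo time;
    rates add when working in parallel, and the combined time
    is the reciprocal of the combined rate.
    """
    return 1 / (1 / t1 + 1 / t2)

# Hypothetical example: two computers that each need 40 minutes alone
# finish the job together in 20 minutes.
print(combined_time(40, 40))  # 20.0
```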

Let's solve this equation:


\begin{gathered}
\frac{1}{x-9}+\frac{1}{x}=\frac{1}{20} \\
\frac{1(x)+1(x-9)}{x(x-9)}=\frac{1}{20} \\
\frac{x+x-9}{x^2-9x}=\frac{1}{20} \\
\frac{2x-9}{x^2-9x}=\frac{1}{20} \\
1(x^2-9x)=20(2x-9) \\
x^2-9x=40x-180 \\
x^2-49x+180=0 \\
(x-45)(x-4)=0 \\
x=4,\ 45
\end{gathered}
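If you want to double-check the factoring, here is a short sketch (my addition, not part of the original answer) that applies the quadratic formula to x² − 49x + 180 = 0:

```python
import math

# Coefficients of x^2 - 49x + 180 = 0
a, b, c = 1, -49, 180

# Quadratic formula: x = (-b ± sqrt(b^2 - 4ac)) / (2a)
disc = b * b - 4 * a * c
roots = ((-b - math.sqrt(disc)) / (2 * a),
         (-b + math.sqrt(disc)) / (2 * a))
print(roots)  # (4.0, 45.0)
```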

Recall,

x is time of slower computer

x - 9 is time of faster computer

If we take x = 4, then x − 9 = −5, a negative time, so that solution is not possible. We take x = 45 as our answer.

So,

faster computer time = x - 9 = 45 - 9 = 36 minutes
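As a quick check (my addition), the two solo times do combine to the 20 minutes given in the problem:

\frac{1}{36}+\frac{1}{45}=\frac{5}{180}+\frac{4}{180}=\frac{9}{180}=\frac{1}{20}

so working together the computers finish the job in 20 minutes, as required.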

answered by Guilherme (2.8k points)