One computer processes a company's payroll in 20 minutes, while another computer takes 32 minutes. How long, in minutes, would it take to process the payroll if both computers were used?

asked by Mkhoshpour (8.2k points)

2 Answers

4 votes

Answer: It would take 52 minutes


Explanation:

20+32=52

answered by Cullen SUN (8.8k points)
5 votes

Answer:

Time taken would be 160/13 ≈ 12.3 minutes to process the payroll if both computers are used

Explanation:

Computer A takes = 20 minutes

Computer B takes = 32 minutes

To find:

Time to process the payroll if both computers are used = ?

Solution:

First calculate how much work each computer does in one minute:

Computer A does 1/20 of the work in one minute

Computer B does 1/32 of the work in one minute

If both computers are used, the work done in one minute is the sum of the two computers' rates:

Work done per minute = 1/20 + 1/32

Taking the LCM (160) and solving:

Work done per minute = (8 + 5)/160 = 13/160

The time to do the whole task is the reciprocal of the work done in one minute:

Time taken = 160/13 ≈ 12.3 minutes
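The work-rate calculation can be checked with a short Python sketch using exact fractions (the variable names here are just for illustration):

```python
from fractions import Fraction

# Work-rate model: each computer's rate is (payrolls per minute) = 1 / time.
rate_a = Fraction(1, 20)   # Computer A: one payroll per 20 minutes
rate_b = Fraction(1, 32)   # Computer B: one payroll per 32 minutes

combined_rate = rate_a + rate_b   # sum of rates when both run together
time_together = 1 / combined_rate # reciprocal gives total minutes

print(combined_rate)              # 13/160 of the job per minute
print(time_together)              # 160/13 minutes, i.e. about 12.3
```

Using `Fraction` keeps the arithmetic exact, so the printed result matches the hand-worked 13/160 and 160/13 rather than a rounded decimal.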



commented by Lachlan (8.2k points)
