The time required to deliver and install a computer at a customer's location is t = 4 + d/r, where t is the time in hours, d is the distance in miles from the warehouse to the customer's location, and r is the average speed of the delivery truck. If it takes 6.2 hours for the employee to deliver and install a computer for a customer located 100 miles from the warehouse, what is the average speed of the delivery truck?

1 Answer


t = 4 + d/r, with t = 6.2 h and d = 100 miles:

6.2 = 4 + 100/r
6.2 − 4 = 100/r
2.2 = 100/r
2.2r = 100
r = 100/2.2
r = 45 5/11
r ≈ 45.45 miles/h

The average speed of the delivery truck is about 45.45 miles per hour.
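As a quick check of the arithmetic, here is a minimal Python sketch that rearranges t = 4 + d/r into r = d/(t − 4) and evaluates it for the given values (the function name solve_for_speed is just an illustrative choice, not from the original problem):

```python
def solve_for_speed(t_hours: float, d_miles: float) -> float:
    """Solve t = 4 + d/r for r, the average speed in miles per hour."""
    # Rearranging: t - 4 = d/r  =>  r = d / (t - 4)
    return d_miles / (t_hours - 4)

r = solve_for_speed(6.2, 100)  # 100 / 2.2
print(round(r, 2))             # 45.45
```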
answered by User Quirk (7.8k points)
