For the first 50 mi of a 100 mi trip, a driver travels at 40 mph. At what speed would the driver have to travel the last half of the trip so that the average speed for the entire trip would be 45 mph?


1 Answer


Answer:

The driver should travel at a speed of 360/7 mph (approximately 51.4 mph) so that the average speed for the entire trip is 45 mph.

Explanation:

Given:

Total distance = 100 mi

Speed at which the driver travels the first 50 mi = 40 mph

To Find:

The speed at which the driver should travel the last 50 mi so that the average speed for the entire trip is 45 mph.

Solution:

Average speed:

The average speed of an object is the total distance travelled divided by the elapsed time taken to cover that distance. It is a scalar quantity, which means it is defined by magnitude alone. The related concept of average velocity is a vector quantity, defined by both magnitude and direction.
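Because average speed depends on total time rather than on the speeds themselves, two segment speeds cannot simply be averaged when the segments cover equal distances. As a quick illustration, here is a minimal Python sketch (the average_speed helper is hypothetical, not part of the original problem):

```python
def average_speed(distances_mi, speeds_mph):
    # Average speed = total distance / total time.
    total_distance = sum(distances_mi)
    total_time = sum(d / s for d, s in zip(distances_mi, speeds_mph))
    return total_distance / total_time

# Two equal 50 mi halves at 40 mph and 50 mph average only about 44.4 mph,
# not 45 mph, so the naive mean (40 + 50) / 2 = 45 overestimates the result.
print(average_speed([50, 50], [40, 50]))  # 44.44...
```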

Let X be the speed at which the driver should travel the last 50 mi so that the average speed of the entire trip is 45 mph.

Since the two halves of the trip take different amounts of time, the answer cannot come from averaging the two speeds; instead, set total distance over total time equal to 45 mph.

Time taken for the first 50 mi:

50/40 = 1.25 h

Total time required for a 45 mph average over 100 mi:

100/45 = 20/9 h

Time remaining for the last 50 mi:

20/9 - 5/4 = (80 - 45)/36 = 35/36 h

Required speed for the last 50 mi:

X = 50/(35/36) = 1800/35 = 360/7 ≈ 51.4 mph
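A quick numeric check of this result in Python (a minimal sketch, assuming the leg distances and speeds above):

```python
first_leg_time = 50 / 40          # 1.25 h at 40 mph
last_leg_time = 50 / (360 / 7)    # 35/36 h at 360/7 mph
average = 100 / (first_leg_time + last_leg_time)
print(average)                    # ≈ 45.0 mph
```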
