How do you calculate the average speed of a car that travels one mile at 30 mph and the second mile at 60 mph?

asked by OriHero (4.8k points)

1 Answer

Answer:

40 mph

Explanation:

Use the speed = distance/time formula: the average speed is the total distance divided by the total time.

1. Over the first mile the car travels at 30 mph, so the equation can be written as 30 mph = 1/t.

Rearrange to get t = 1/30

1/30 of an hour is 2 minutes.

2. Over the second mile the car travels at 60 mph, so the equation can be written as 60 mph = 1/t.

Rearrange to get t = 1/60

1/60 of an hour is 1 minute.

The total time for the journey is 2 + 1 = 3 minutes.

Writing 3 minutes as a fraction of an hour:

3/60 = 1/20

So, using the s = d/t formula with a distance of 2 miles and a time of 1/20 of an hour, the average speed is:

Speed = 2/(1/20) = 2 × 20 = 40

Therefore, the average speed of the car is 40 mph.
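
As a quick check, the whole calculation can be reproduced in a few lines of Python; the average_speed helper below is just an illustrative sketch, not part of the original answer:

# Average speed = total distance / total time.
# average_speed is a hypothetical helper, for illustration only.
def average_speed(distances_miles, speeds_mph):
    total_time_hours = sum(d / v for d, v in zip(distances_miles, speeds_mph))
    return sum(distances_miles) / total_time_hours

print(average_speed([1, 1], [30, 60]))  # prints 40.0

In general, for two equal distances covered at speeds v1 and v2, the average speed is the harmonic mean 2·v1·v2/(v1 + v2) = 2·30·60/(30 + 60) = 40 mph. Note that it is less than the plain average of the two speeds (45 mph here), because the car spends more time at the slower speed.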

answered by Jornane (4.9k points)