If you traveled one mile at a speed of 100 miles per hour, and another mile at a speed of 1 mile per hour, your average speed would not be (100 mph + 1 mph)/2, or 50.5 mph. What would be your average speed? (Hint: What is the total distance and total time?)

Okay so... how would I set up this problem and solve it properly? Maybe it's just me, but the way this is worded is throwing me off. Any help is greatly appreciated! <3


1 Answer


Answer:

The average speed is approximately 1.98 miles per hour.

Step-by-step explanation:

You traveled the first mile at a speed of 100 miles per hour.

So the distance travelled for this leg is 1 mile.

Speed is given by,

Speed = distance / time

Rearranging for time,

time = distance / speed

time = 1 / 100

time = 0.01 hour

For the next mile, you travel at a speed of 1 mile per hour.

Again,

time = distance / speed

time = 1 / 1

time = 1 hour

Total distance = 1 mile + 1 mile = 2 miles

Total time = 1 hour + 0.01 hour = 1.01 hours

Average speed is given by,

Average speed = total distance / total time

Average speed = 2 / 1.01

Average speed ≈ 1.98 miles per hour

The average speed is approximately 1.98 miles per hour.
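
If it helps to check the arithmetic, here is a minimal Python sketch (not part of the original answer; the leg list and variable names are just illustrative) that computes the total distance, total time, and average speed, and compares the result with the naive 50.5 mph average mentioned in the question:

# Each leg of the trip: (distance in miles, speed in mph)
legs = [(1, 100), (1, 1)]

total_distance = sum(d for d, v in legs)              # 1 + 1 = 2 miles
total_time = sum(d / v for d, v in legs)              # 0.01 + 1 = 1.01 hours

average_speed = total_distance / total_time           # 2 / 1.01
naive_average = sum(v for d, v in legs) / len(legs)   # (100 + 1) / 2

print(f"Average speed: {average_speed:.2f} mph")      # about 1.98 mph
print(f"Naive average: {naive_average:.2f} mph")      # 50.50 mph, which is not the true average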
