A plane left Chicago at 8 A.M. At 12 P.M., the plane landed in Los Angeles, which is 1,700 miles away.

If the plane travels at a constant rate, how many miles did it travel per hour?

asked by Sarim Sidd

1 Answer


Commercial passenger flight schedules always list times AT THE PLACE where
the event takes place. The data you gave in your question means that the
flight departed ORD at 8:00 AM Central (Chicago) time and arrived at LAX
at 12:00 PM Pacific (Los Angeles) time.

Since Pacific time is two hours behind Central time, 12:00 PM in Los Angeles is 2:00 PM in Chicago, so the plane was actually in the air for 6 hours.

Average speed = (distance) / (time to cover the distance)

= (1,700 miles) / (6 hours) = (283 and 1/3) miles per hour.
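
If you want to check the time-zone arithmetic, here is a minimal Python sketch. The date (2024-06-01) is an arbitrary assumption; only the wall-clock times and the two zones matter for the duration:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # Departure and arrival as local, zone-aware times.
    depart = datetime(2024, 6, 1, 8, 0, tzinfo=ZoneInfo("America/Chicago"))
    arrive = datetime(2024, 6, 1, 12, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

    hours = (arrive - depart).total_seconds() / 3600
    print(hours)          # 6.0 -> hours actually in the air
    print(1700 / hours)   # 283.33... -> miles per hour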

====================================

But that wasn't what you had in mind, was it?

You meant that the flight took 4 hours.

In that case, the average speed was

(1,700 miles) / (4 hours) = 425 miles per hour.

This is a much more reasonable average speed for a long-haul
passenger jet.
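
A quick sketch comparing the two readings; the average_speed helper is just for illustration:

    # Average speed = distance / time, checked for both readings.
    def average_speed(miles, hours):
        return miles / hours

    print(average_speed(1700, 4))   # 425.0 mph (the intended reading)
    print(average_speed(1700, 6))   # 283.33... mph (the time-zone reading)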

answered by JaakkoK