A plane flies the 700 miles from Atlanta to Detroit in 1 and 1/4 hours. What is the plane's average air speed in miles per hour?

1 Answer

2 votes

Answer:

The plane's average speed is 560 miles per hour.

Explanation:

This problem can be solved using a rule of three.

In a rule of three problem, the first step is identifying the measures and how they are related, that is, whether their relationship is direct or inverse.

When the relationship between the measures is direct, as the value of one measure increases, the value of the other measure is going to increase too.

When the relationship between the measures is inverse, as the value of one measure increases, the value of the other measure will decrease.

In this problem, the measures are:

- The distance

- The time

As the time increases, so does the distance, which means that there is a direct relationship between the measures.

Solution:

A fourth of an hour is
(1)/(4)*60 = 15 minutes, so the plane flew 700 miles in 75 minutes. The problem asks how many miles it traveled in 60 minutes, so:

75 minutes - 700 miles

60 minutes - x miles

75x = 60*700

75x = 42000


x = (42000)/(75)

x = 560.

The plane's average speed is 560 miles per hour.
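As a quick check, the same answer follows directly from speed = distance / time, without converting to minutes. A minimal sketch in Python (variable names are illustrative, not from the original answer):

```python
# Average speed = total distance / total time.
distance_miles = 700
time_hours = 1.25  # 1 and 1/4 hours

speed_mph = distance_miles / time_hours
print(speed_mph)  # 560.0

# The rule-of-three version in minutes gives the same result:
miles_in_60_min = 60 * 700 / 75
print(miles_in_60_min)  # 560.0
```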

Answered by User Aneta (6.4k points)