A plane flies the 700 miles from Atlanta to Detroit in 1 1/4 hours. What is the plane's average air speed in miles per hour?

1 Answer


Answer:

The plane's average speed is 560 miles per hour.

Explanation:

This problem can be solved by a rule of three.

In a rule of three problem, the first step is identifying the measures and whether their relationship is direct or inverse.

When the relationship between the measures is direct, as the value of one measure increases, the value of the other measure increases too.

When the relationship between the measures is inverse, as the value of one measure increases, the value of the other measure decreases.

In this problem, the measures are:

- The distance

- The time

As the time increases, so does the distance, which means there is a direct relationship between the measures.
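As a minimal sketch (in Python, with numbers taken from this problem), direct and inverse relationships behave like this:

```python
# Direct relationship: at a constant speed, distance grows with time.
speed = 560  # miles per hour (the answer below)
for hours in [1, 2, 3]:
    print(hours, hours * speed)  # distance increases as time increases

# Inverse relationship: over a fixed distance, time shrinks as speed grows.
distance = 700  # miles
for s in [350, 560, 700]:
    print(s, distance / s)  # time decreases as speed increases
```

Here distance and time are directly related (the first loop), which is the case in this problem.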

Solution:

A fourth of an hour is (1/4) * 60 = 15 minutes, so the plane flew 700 miles in 75 minutes. The problem asks how many miles it traveled in 60 minutes, so:

75 minutes - 700 miles

60 minutes - x miles

75x = 700 * 60

75x = 42000

x = (42000)/(75)

x = 560

The plane's average speed is 560 miles per hour.
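As a quick sketch (assuming Python), the same answer follows either from the rule of three above or directly from speed = distance / time:

```python
# Rule of three: 75 minutes -> 700 miles, 60 minutes -> x miles
distance_miles = 700
time_minutes = 75  # 1 and 1/4 hours

# Cross-multiply: 75 * x = 700 * 60
x = distance_miles * 60 / time_minutes
print(x)  # 560.0

# Direct check: speed = distance / time, with time in hours
speed = distance_miles / (time_minutes / 60)
print(speed)  # 560.0
```

Both routes give 560 miles per hour, confirming the result.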

Answered by Aneta
