
The formula d = rt relates distance d to rate r and time t. Find how long it takes an airplane to fly 375 miles at 500 miles per hour.

1 Answer


Answer: 0.75 hours, which is the same as 45 minutes

===========================================================

Step-by-step explanation:

  • d = distance
  • r = rate, which is another term for speed
  • t = time

We're told that the plane flies 375 miles and the speed is 500 miles per hour (mph).

This must mean that d = 375 and r = 500. Let's find t.

d = r*t

d/r = t .... divide both sides by r

t = d/r

t = 375/500

t = 0.75
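
If you want to double-check the arithmetic, here is a minimal Python sketch. The variable names just mirror the formula; nothing here is required by the problem itself:

    d = 375.0   # distance in miles
    r = 500.0   # rate (speed) in miles per hour
    t = d / r   # solve d = r*t for t
    print(t)    # prints 0.75, i.e. 0.75 hours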

It takes 0.75 hours, which is the same as 45 minutes, because

0.75*60 = 45
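
The unit conversion can be checked the same way, continuing the sketch above (again, the names are just illustrative):

    minutes = t * 60   # 60 minutes per hour
    print(minutes)     # prints 45.0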

If the plane flew 500 miles at 500 mph, it would take exactly 1 hour, so it makes sense that the time comes out under an hour when the plane travels only 375 miles.
