The formula for distance is d = v*t, where v is the object's velocity and t is the time. How long will it take a plane to fly 4000 miles from Chicago to London if the plane flies at a constant rate of 500 mph? A) 3.5 hours B) 8 hours C) 20 hours D) 45 hours

2 Answers


Answer: B) 8 hours

Explanation: Using the given formula, d = v*t.

Substitute the known values:

d = 4000 miles

v = 500 mph

t = ?

4000 = 500*t

4000/500 = (500/500)*t

t = 8

Therefore, the plane will take 8 hours to fly from Chicago to London at a rate of 500 mph.
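
Here is a minimal sketch of the same substitution in Python (the variable names are illustrative, not part of the original answer):

```python
# A minimal sketch of the substitution above, solving d = v*t for t.
# Variable names are illustrative, not from the original answer.

distance_miles = 4000  # d: the Chicago-to-London distance from the problem
speed_mph = 500        # v: the plane's constant speed

time_hours = distance_miles / speed_mph  # t = d / v
print(time_hours)  # 8.0, i.e. answer B) 8 hours
```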

answered by Shaquira (7.5k points)

Answer: B) 8 hours

Step-by-step explanation:

To calculate how long it will take a plane to fly 4000 miles at a constant rate of 500 mph, you can use the formula:

Distance (d) = Velocity (v) × Time (t)

You are given the distance (d) as 4000 miles, and the velocity (v) is 500 mph. You need to find the time (t). Rearrange the formula to solve for time:

Time (t) = Distance (d) / Velocity (v)

Now, plug in the values:

Time (t) = 4000 miles / 500 mph = 8 hours

So, it will take the plane 8 hours to fly from Chicago to London at a constant rate of 500 mph.

The correct answer is B) 8 hours.
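
For anyone who wants to check the rearranged formula t = d / v programmatically, here is a small Python sketch; the function name travel_time_hours is hypothetical, chosen only for this example:

```python
# A small sketch of the rearranged formula t = d / v as a reusable function.
# The name travel_time_hours is hypothetical, chosen only for this example.

def travel_time_hours(distance_miles: float, speed_mph: float) -> float:
    """Return the time in hours to cover distance_miles at a constant speed_mph."""
    return distance_miles / speed_mph

print(travel_time_hours(4000, 500))  # 8.0 hours, matching answer B
```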

answered by Linus Thiel (7.2k points)