Calculate the average speed of an airplane that traveled 1,100 miles in 3.5 hours

1 Answer

Answer:

314.29 mi/hr

Step-by-step explanation:

v = d / t

v = 1,100 mi / 3.5 hr

v ≈ 314.29 mi/hr

Note that I rounded the answer to the nearest hundredth. Hope this helps, thank you :) !!
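
As a quick sanity check, here is a minimal Python sketch of the same computation (the variable names are just illustrative):

    # Average speed = distance / time
    distance_miles = 1100
    time_hours = 3.5

    average_speed = distance_miles / time_hours

    print(round(average_speed, 2))  # 314.29

This confirms that 1,100 / 3.5 rounds to 314.29 at the hundredths place.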
