An airplane flies 750 miles due west in 1 1/2 hours and 750 miles due south in 2 hours. What is the average speed of the airplane?

1 Answer

Use the Pythagorean Theorem to find the plane's net displacement over the full 3 1/2 hours (the two legs are perpendicular):

d = sqrt(750^2 + 750^2) = 750 sqrt(2) ≈ 1060.7 miles

Now divide this displacement by the total time (1 1/2 hours + 2 hours = 3 1/2 hours):

750 sqrt(2) miles
----------------- ≈ 303 miles per hour
    3.5 hours

Strictly speaking, this is the magnitude of the average velocity. If "average speed" is meant as the total path length divided by total time, it would be (750 + 750)/3.5 ≈ 429 miles per hour.
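The arithmetic above can be checked with a short Python sketch (nothing assumed beyond the numbers given in the problem):

```python
import math

# Legs of the trip (miles) and their durations (hours)
west_leg, west_time = 750, 1.5
south_leg, south_time = 750, 2.0
total_time = west_time + south_time  # 3.5 hours

# Net displacement: the legs are perpendicular, so the
# Pythagorean theorem applies (750 * sqrt(2) miles)
displacement = math.hypot(west_leg, south_leg)

# Magnitude of the average velocity: displacement / total time
avg_velocity = displacement / total_time

# Average speed over the path actually flown: total distance / total time
avg_speed = (west_leg + south_leg) / total_time

print(round(avg_velocity))   # 303
print(round(avg_speed, 1))   # 428.6
```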
Answer by Pablo Romeo (7.7k points)