Final answer:
To find the distance from the airport, we can treat each leg of the flight as a vector and apply the Pythagorean theorem. The airplane first flies 225 miles northwest, then 150 miles southwest. Because northwest and southwest are perpendicular directions, the magnitude of the resultant is √(225² + 150²) ≈ 270.42, so the plane is approximately 270.42 miles from the airport.
Step-by-step explanation:
To find the distance from the airport, we can use vectors. The airplane first flies 225 miles northwest; call this displacement vector A. It then flies 150 miles southwest; call this vector B. The plane's distance from the airport is the magnitude of the resultant vector A + B.
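As a minimal sketch of this setup in Python (the coordinate convention, east = +x and north = +y, and the variable names ax, ay, bx, by are my own choices, not part of the original problem):

import math

# Northwest and southwest both lie 45 degrees off the north-south axis,
# so each leg splits into two equal-magnitude components.
ax = -225 * math.cos(math.radians(45))  # A, west component: about -159.10
ay = 225 * math.sin(math.radians(45))   # A, north component: about +159.10
bx = -150 * math.cos(math.radians(45))  # B, west component: about -106.07
by = -150 * math.sin(math.radians(45))  # B, south component: about -106.07
print(round(ax, 2), round(ay, 2), round(bx, 2), round(by, 2))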
We can break vector A into its north and west components (each 225·cos 45° ≈ 159.10 miles) and vector B into its south and west components (each 150·cos 45° ≈ 106.07 miles). The north and south components only partially cancel, leaving a net 53.03 miles north, while the west components add up to 265.17 miles west. Applying the Pythagorean theorem to these net components gives the distance from the airport: √(265.17² + 53.03²) ≈ 270.42 miles. Equivalently, since the two legs are perpendicular, the distance is √(225² + 150²) ≈ 270.42 miles.
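Putting the pieces together, here is a short self-contained Python check of the whole computation (again assuming east = +x and north = +y; the list name legs is just an illustrative label):

import math

# Each leg as (distance in miles, direction as a standard angle in degrees):
# northwest = 135, southwest = 225, measured counterclockwise from east.
legs = [(225, 135), (150, 225)]
x = sum(d * math.cos(math.radians(a)) for d, a in legs)  # net east-west: about -265.17
y = sum(d * math.sin(math.radians(a)) for d, a in legs)  # net north-south: about +53.03
print(round(math.hypot(x, y), 2))  # 270.42 miles from the airport

Working with signed components like this also generalizes to legs that are not perpendicular, where the shortcut √(225² + 150²) would no longer apply.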