A pilot flies a plane south and then 600 miles west, where she lands the plane. How far south did the pilot fly the plane if she lands 610 miles from her starting point?

1 Answer

The southward and westward legs of the flight are perpendicular, so the path forms a right triangle whose hypotenuse is the 610-mile straight-line distance from the starting point. Using the Pythagorean theorem,
a² + b² = c²
Substituting the known leg (600 miles) and hypotenuse (610 miles):
a² + 600² = 610²
Solving for a:
a² = 610² − 600² = 372,100 − 360,000 = 12,100
a = √12,100 = 110
Thus, the pilot flew 110 miles south.
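As a quick check, here is a minimal Python sketch of the same calculation (the function name `missing_leg` is my own, not from the answer):

```python
import math

def missing_leg(hypotenuse: float, known_leg: float) -> float:
    """Return the other leg of a right triangle via a^2 = c^2 - b^2."""
    return math.sqrt(hypotenuse**2 - known_leg**2)

# 610-mile straight-line distance, 600-mile westward leg
south = missing_leg(610, 600)
print(south)  # 110.0 -> the pilot flew 110 miles south
```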