A pilot flies a plane south and then 600 miles west, where she lands the plane. How far did the pilot fly the plane if she lands 610 miles from the starting point?

1 Answer

From the problem, the pilot's path forms a right triangle: the southward leg (a), the 600-mile westward leg (b), and the 610-mile straight-line distance from the starting point (c, the hypotenuse). First find the length of the missing side with the Pythagorean theorem, then add the two legs the pilot actually flew.
a^2 + b^2 = c^2
where
a = southward leg (unknown)
b = 600 miles (westward leg)
c = 610 miles (straight-line distance from the starting point)
Substituting the known values and solving for a:
a^2 = 610^2 - 600^2 = 372,100 - 360,000 = 12,100
a = sqrt(12,100) = 110 miles
Note that the 610 miles is only the straight-line distance from the starting point, not a leg the pilot flew, so the distance flown is the sum of the two legs:
Total distance flown = 110 + 600 = 710 miles
The pilot flew the plane 710 miles.
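As a quick check (not part of the original answer), here is a minimal Python sketch of the same computation; the variable names are just illustrative:

```python
import math

# Known sides of the right triangle formed by the flight path
west_leg = 600.0        # miles flown west (leg b)
straight_line = 610.0   # miles from start to landing point (hypotenuse c)

# Solve a^2 + b^2 = c^2 for the unknown southward leg a
south_leg = math.sqrt(straight_line**2 - west_leg**2)

# The pilot flew only the two legs, not the hypotenuse
total_flown = south_leg + west_leg

print(f"Southward leg: {south_leg:.0f} miles")          # 110 miles
print(f"Total distance flown: {total_flown:.0f} miles")  # 710 miles
```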

answered by Artyom (7.5k points)