A crow flies to a point that is 1 mile east and 20 miles south of its starting point. How far does the crow fly?

2 Answers

6 votes
Make it into a right triangle, then use the Pythagorean theorem (a^2 + b^2 = c^2) to solve for c, which will give you your answer.
answered by User Ovm (7.6k points)
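
A quick way to check that approach is a short Python sketch (the variable names east and south are my own labels for the two legs given in the question):

```python
import math

# Legs of the right triangle, taken from the question:
# 1 mile east and 20 miles south.
east = 1.0
south = 20.0

# Hypotenuse c from the Pythagorean theorem: c = sqrt(a^2 + b^2).
c = math.hypot(east, south)

print(round(c, 2))  # 20.02
```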
5 votes

Answer:

The crow flew 20.02 miles.

Explanation:

Distance the crow flies to a point in the east = 1 mile

Distance it then flies to a point in the south = 20 miles

Now we have to calculate the distance x between the crow and its initial point.

Therefore, by the Pythagorean theorem,

x = \sqrt{(\text{Distance of the crow in the east})^2 + (\text{Distance of the crow in the south})^2}
  = \sqrt{1^2 + 20^2}
  = \sqrt{1 + 400}
  = \sqrt{401}
  \approx 20.02 \text{ miles}

So the crow is 20.02 miles away from its starting point.

answered by User Rousonur Jaman (8.9k points)
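
To confirm that \sqrt{401} really rounds to 20.02, here is a minimal sketch using Python's sympy library to keep the exact surd alongside its decimal value (using sympy here is my own choice, not part of the answer above):

```python
import sympy

# Exact straight-line distance from the Pythagorean theorem: sqrt(1^2 + 20^2).
x = sympy.sqrt(1**2 + 20**2)

print(x)              # sqrt(401)
print(sympy.N(x, 4))  # 20.02
```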
