Two archers shoot arrows in the same direction from the same place with the same initial speeds but at different angles. One shoots at 45 degrees above the horizontal, while the other shoots at 60 degrees above the horizontal. If the arrow launched at 45 degrees lands 225 m from the archer, how far apart are the two arrows when they land?

1 Answer


Final answer:

The two arrows land approximately 30.1 m apart; the arrow shot at 60 degrees falls short of the one shot at 45 degrees.

Step-by-step explanation:

To find the distance between the two arrows, we need the horizontal distance (range) traveled by each arrow. Note that the two arrows do not stay in the air for the same time: the 60-degree arrow flies higher and is therefore in the air longer, so we cannot simply rescale the 45-degree distance by a ratio of cosines. Instead, use the range formula for a projectile launched and landing at the same height:

Range = (v0^2 / g) * sin(2 * angle)

where v0 is the initial speed and g is the acceleration due to gravity.

For the arrow shot at 45 degrees, sin(2 * 45) = sin(90) = 1, so its range R1 is:

R1 = v0^2 / g = 225 m

This tells us that v0^2 / g = 225 m, which is all we need to know about the launch speed.

For the arrow shot at 60 degrees, sin(2 * 60) = sin(120) = sqrt(3)/2, so its range R2 is:

R2 = (v0^2 / g) * sqrt(3)/2 = 225 * sqrt(3)/2 ≈ 194.9 m

Since both arrows are shot in the same direction from the same place, the distance between the landing points is the difference of the two ranges:

R1 - R2 = 225 - 194.9 ≈ 30.1 m

Therefore, the two arrows are approximately 30.1 m apart when they land.
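
As a quick numerical check, here is a minimal Python sketch of the same calculation (the helper name projectile_range is illustrative, and g never needs an explicit value because only the combination v0^2 / g enters the formula):

```python
import math

def projectile_range(v0_sq_over_g, angle_deg):
    """Range over level ground: R = (v0^2 / g) * sin(2 * angle)."""
    return v0_sq_over_g * math.sin(math.radians(2 * angle_deg))

# From the 45-degree shot: R1 = (v0^2 / g) * sin(90) = v0^2 / g = 225 m
v0_sq_over_g = 225.0

R1 = projectile_range(v0_sq_over_g, 45)  # 225.0 m
R2 = projectile_range(v0_sq_over_g, 60)  # 225 * sqrt(3)/2, about 194.9 m

print(f"R1 = {R1:.1f} m, R2 = {R2:.1f} m")
print(f"separation = {R1 - R2:.1f} m")   # about 30.1 m
```

Running this prints a separation of about 30.1 m, matching the hand calculation.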

answered by Frank Gambino (7.3k points)