What is the required speed for an arrow fired at a 45.0° angle to land 90.4 m away? (unit = m/s)

a) 31.9 m/s
b) 45.0 m/s
c) 63.8 m/s
d) 90.4 m/s

1 Answer


Final answer:

The launch speed follows from the range equation for projectile motion. With a range of 90.4 m and a launch angle of 45.0°, the required speed works out to about 29.8 m/s, making option a) 31.9 m/s the closest listed choice.

Step-by-step explanation:

To find the required speed, use the equations of projectile motion. Horizontally, the arrow covers a distance R = v0·cosθ·t, where v0 is the launch speed and t is the time of flight. Vertically, the arrow returns to its launch height when 0 = v0·sinθ·t − (1/2)·g·t², which gives t = 2·v0·sinθ / g.

Substituting this time of flight into the horizontal equation yields the range equation R = v0²·sin(2θ) / g. Solving for the launch speed gives v0 = √(R·g / sin 2θ). At θ = 45.0°, sin 2θ = 1, so v0 = √(R·g) = √(90.4 m × 9.8 m/s²) ≈ 29.8 m/s.

The required speed for the arrow fired at a 45.0° angle to land 90.4 m away is approximately 29.8 m/s, so the closest listed choice, option a) 31.9 m/s, is the correct answer.
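The calculation above can be sketched in a few lines of Python (a minimal check, assuming g = 9.8 m/s²; the variable names are illustrative):

```python
import math

g = 9.8                      # acceleration due to gravity, m/s^2
R = 90.4                     # horizontal range, m
theta = math.radians(45.0)   # launch angle in radians

# Range equation: R = v0^2 * sin(2*theta) / g  ->  v0 = sqrt(R * g / sin(2*theta))
v0 = math.sqrt(R * g / math.sin(2 * theta))
print(f"required launch speed: {v0:.1f} m/s")  # about 29.8 m/s
```

At 45.0° the sin(2θ) factor equals 1, so the expression reduces to √(R·g).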
