Solve the problem. Unless stated otherwise, assume that the projectile is ideal, that the launch angle is measured from the horizontal, and that the projectile is launched from the origin over a horizontal surface. An ideal projectile is launched from level ground at a launch angle of 26° and an initial speed of 48 m/sec. How far from the launch point does the projectile hit the ground?

a. 185 m
b. 60 m
c. 290 m
d. 230 m


1 Answer


Final answer:

Use the vertical motion to find the time of flight, then multiply the horizontal velocity by that time to get the range. The projectile lands approximately 185 m from the launch point, so the correct choice is (a).

Step-by-step explanation:

To solve this problem, first find the time of flight from the vertical motion. The initial vertical velocity is the initial speed times the sine of the launch angle. Taking upward as positive, the vertical displacement is y = vy·t − (1/2)g·t², where g is the magnitude of the acceleration due to gravity. Setting y = 0 for launch and landing at the same height and solving for the nonzero root gives t = 2·vy / g. The horizontal velocity is the initial speed times the cosine of the launch angle, and multiplying it by the time of flight gives the range.
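Equivalently, for a launch and landing at the same height the two steps collapse into the standard range formula R = v²·sin(2θ) / g, which is a quick cross-check on the step-by-step result.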

Given:

  • Initial speed (v): 48 m/s
  • Launch angle (θ): 26°
  • Acceleration due to gravity (g): 9.8 m/s² (magnitude)

Calculations:

Vertical velocity (vy) = v * sin(θ)

Time (t) = 2 * vy / g

Horizontal velocity (vx) = v * cos(θ)

Distance (d) = vx * t

By substituting the values and solving the equations, we find:

Time (t) = 2 * (48 * sin(26°)) / 9.8 ≈ 4.29 seconds

Distance (d) = (48 * cos(26°)) * 4.29 ≈ 185 meters

Therefore, the projectile hits the ground approximately 185 meters away from the launch point, which is choice (a).
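
For anyone who wants to verify the arithmetic, here is a minimal Python sketch of the same two-step calculation (it assumes level ground and g = 9.8 m/s², as stated in the problem):

```python
import math

v = 48.0                   # initial speed, m/s
theta = math.radians(26)   # launch angle, converted to radians
g = 9.8                    # magnitude of gravitational acceleration, m/s^2

vy = v * math.sin(theta)   # initial vertical velocity
vx = v * math.cos(theta)   # horizontal velocity (constant for an ideal projectile)
t = 2 * vy / g             # time of flight for launch and landing at the same height
d = vx * t                 # horizontal range

print(f"time of flight ≈ {t:.2f} s")  # ≈ 4.29 s
print(f"range ≈ {d:.1f} m")           # ≈ 185.3 m

# Cross-check against the closed-form range formula R = v^2 * sin(2θ) / g
assert abs(d - v**2 * math.sin(2 * theta) / g) < 1e-9
```

Running this prints a time of flight of about 4.29 s and a range of about 185 m, matching choice (a).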
