A rock is thrown upward with a vertical speed of 35 m/s and a horizontal speed of 12 m/s on level ground. How far away does it land?

1 Answer


Answer:

R ≈ 85.73 m

Step-by-step explanation:

It is given that:

The vertical speed of the rock, v_y = 35\ \text{m/s}

The horizontal speed of the rock, v_x = 12\ \text{m/s}

We know that the vertical speed is

v_y = v\sin\theta = 35 \quad (1)

and the horizontal speed is

v_x = v\cos\theta = 12 \quad (2)

Dividing equation (1) by equation (2):

\frac{v_y}{v_x} = \frac{v\sin\theta}{v\cos\theta}

\tan\theta = \frac{35}{12}

\theta = \tan^{-1}\!\left(\frac{35}{12}\right) \approx 71.07^{\circ}

Let v be the initial speed of the projectile. Then,

v = \sqrt{v_x^2 + v_y^2} = \sqrt{12^2 + 35^2} = 37\ \text{m/s}
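The two steps above can be checked numerically; this short sketch recomputes the launch angle and the initial speed from the given components:

```python
import math

v_y = 35.0  # vertical speed, m/s
v_x = 12.0  # horizontal speed, m/s

# Launch angle from tan(theta) = v_y / v_x
theta = math.degrees(math.atan(v_y / v_x))  # ≈ 71.08°

# Initial speed from the Pythagorean theorem
v = math.hypot(v_x, v_y)  # 37.0 m/s

print(theta, v)
```

Note that the unrounded angle is about 71.075°; the answer above truncates it to 71.07°.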

We need to find the distance at which it lands, i.e. the range of the projectile. It is given by the formula:

R = \frac{v^2\sin 2\theta}{g}

Putting in all the values, we get:

R = \frac{37^2 \times \sin(2\times 71.07^{\circ})}{9.8} \approx 85.73\ \text{m}

So, the rock will land at a distance of about 85.73 m.
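As a sanity check, the range can be computed without rounding the angle at all. Since sin 2θ = 2 sinθ cosθ, the range formula reduces to R = 2 v_x v_y / g for launch and landing at the same height:

```python
import math

v_x, v_y, g = 12.0, 35.0, 9.8  # m/s, m/s, m/s^2

v = math.hypot(v_x, v_y)       # initial speed, 37 m/s
theta = math.atan2(v_y, v_x)   # launch angle in radians

# Range for a projectile launched and landing at the same height
R = v**2 * math.sin(2 * theta) / g

# Equivalent shortcut that avoids computing the angle:
R_alt = 2 * v_x * v_y / g

print(R)  # ≈ 85.71 m
```

The exact result is 840/9.8 ≈ 85.71 m; the 85.73 m quoted above comes from rounding θ to 71.07° before taking the sine.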

User ChickensDontClap (5.5k points)