A projectile is launched at an angle of 30 degrees with the horizontal and a speed of 30 m/s. How much time does it spend in the air?

A.)
2.7 s

B.)
1.5 s

C.)
3.1 s

D.)
1.8 s

by User OC Rickard (8.2k points)

1 Answer

Answer:

C.) 3.1 s (about 3.06 s)

Step-by-step explanation:

The projectile is launched with a speed of V = 30 m/s at an angle of 30 degrees above the horizontal. That 30 m/s is the magnitude of the velocity vector, so the vertical component is

Vy = V * sin(a)

Vy = 30 * sin(30) = 15 m/s

(The horizontal component, Vx = V * cos(a) ≈ 26 m/s, stays constant during the flight but is not needed for the time in the air.)
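If you want to double-check the decomposition numerically, here is a minimal Python sketch (the names speed, angle_deg, vx and vy are just illustrative, not from the problem):

import math

speed = 30.0      # launch speed, m/s (given in the problem)
angle_deg = 30.0  # launch angle above the horizontal, degrees (given)

# Decompose the launch velocity into horizontal and vertical components
vx = speed * math.cos(math.radians(angle_deg))  # about 25.98 m/s
vy = speed * math.sin(math.radians(angle_deg))  # 15.0 m/s

print(f"vx = {vx:.2f} m/s, vy = {vy:.2f} m/s")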

Once it is in the air, the projectile is in free fall: the only force acting on it is gravity, so the acceleration is constant and we can use this equation:

Y(t) = Y0 + Vy0 * t + 1/2 * a * t^2

In this case the projectile is shot from the ground, so Y0 = 0.

a is the acceleration of gravity, -9.81 m/s^2 (negative because it points down).

So we end up with

Y(t) = 0 + 15 * t + 1/2 * (-9.81) * t^2
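As a quick sanity check, this height function can be written as a tiny Python helper (the function name height and its default values are just for illustration):

def height(t, y0=0.0, vy0=15.0, a=-9.81):
    # Height of the projectile at time t under constant acceleration
    return y0 + vy0 * t + 0.5 * a * t ** 2

print(height(0.0))  # 0.0 at launch
print(height(1.5))  # roughly 11.5 m, near the top of the arc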

Setting Y(t) equal to zero gives the times at which the projectile is at ground level: the moment it was launched (t = 0) and the moment it lands. The difference between the two is the time it spends in the air.

0 = 15 * t - 4.905 * t^2

0 = t * (15 - 4.905 * t)

t = 0 is one of the solutions, as expected; the other comes from

0 = 15 - 4.905 * t

4.905 * t = 15

t = 15/4.905 ≈ 3.06 s

That non-zero root is the moment the projectile hits the ground, so the time it spends in the air is

3.06 - 0 ≈ 3.06 s

Rounded to the answer choices, that is about 3.1 s, which is option C.
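For completeness, the same result can be checked with a couple of lines of Python; the closed form t = 2 * Vy / g drops straight out of the factored equation above:

vy0 = 15.0  # vertical launch speed, m/s
g = 9.81    # magnitude of the acceleration of gravity, m/s^2

# Non-zero root of 0 = vy0 * t - (g/2) * t^2, i.e. t = 2 * vy0 / g
t_flight = 2 * vy0 / g
print(f"time of flight = {t_flight:.2f} s")  # about 3.06 s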

by User Newshorts (7.1k points)
