A ball with a mass of 0.08 kg is launched from a table to the floor. The table is 0.87 meters above the floor. If the ball is launched at a 35-degree angle above the horizontal with a speed of 3.2 m/s, how far from the table does it land? Hint: ax^2 + bx + c = 0

1 Answer


Final answer:

To find how far from the table the ball lands, we need the time of flight, which comes from solving the vertical-motion equation for the 0.87 m drop to the floor (a quadratic in t, as the hint suggests). Multiplying that time by the horizontal component of the initial velocity gives the landing distance, about 1.7 m from the table.

Step-by-step explanation:

To solve this problem, we break the initial velocity into horizontal and vertical components. The horizontal component remains constant throughout the motion, while the vertical component changes due to gravity. Because the ball starts 0.87 m above the floor, we first calculate how long it takes to reach the floor:

Taking up as positive, the vertical displacement when the ball reaches the floor is -0.87 m:

-0.87 = v0 * sin(theta) * t - (1/2) * g * t^2

where v0 is the initial speed (3.2 m/s), theta is the launch angle (35 degrees), and g is the acceleration due to gravity (9.8 m/s^2). Rearranged into the hinted form ax^2 + bx + c = 0 (with x = t), this becomes:

4.9 * t^2 - 1.84 * t - 0.87 = 0

Solving with the quadratic formula and keeping the positive root gives t ≈ 0.65 s.
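As a quick check, here is a minimal Python sketch that solves this quadratic numerically (Python and the variable names are just an illustration, not part of the original answer):

```python
import math

v0 = 3.2                  # launch speed, m/s
theta = math.radians(35)  # launch angle, converted to radians
h = 0.87                  # table height, m
g = 9.8                   # gravitational acceleration, m/s^2

# Vertical motion: -h = v0*sin(theta)*t - 0.5*g*t^2
# Rearranged to a*t^2 + b*t + c = 0, the quadratic from the hint
a = 0.5 * g
b = -v0 * math.sin(theta)
c = -h

# Quadratic formula; only the positive root is physical
t = (-b + math.sqrt(b**2 - 4 * a * c)) / (2 * a)
print(f"time of flight: {t:.2f} s")  # about 0.65 s
```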

Once we know the time of flight, we can calculate the horizontal distance using the equation:

x = v0 * cos(theta) * t

where x is the horizontal distance. Plugging in the values, we get:

x = 3.2 * cos(35) * t

Finally, we substitute t ≈ 0.65 s to find the horizontal distance:

x = 3.2 * cos(35) * 0.65 ≈ 2.62 * 0.65 ≈ 1.7 m

So the ball lands about 1.7 m from the edge of the table.
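For completeness, here is a short Python sketch of the whole calculation (again only an illustration; the names are mine, not from the original answer):

```python
import math

v0, theta, h, g = 3.2, math.radians(35), 0.87, 9.8

# Time of flight from the quadratic 0.5*g*t^2 - v0*sin(theta)*t - h = 0
a, b, c = 0.5 * g, -v0 * math.sin(theta), -h
t = (-b + math.sqrt(b**2 - 4 * a * c)) / (2 * a)

# Horizontal distance: constant horizontal velocity times time of flight
x = v0 * math.cos(theta) * t
print(f"t = {t:.2f} s, x = {x:.2f} m")  # roughly t = 0.65 s, x = 1.70 m
```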
