If a ball is dropped near the surface of the earth, then the distance it falls is directly proportional to the square of the time it has fallen. A ball is dropped over the edge of a vertical cliff and falls 39.2 meters in two seconds. Determine the distance (in meters) the ball would have dropped in 3.5 seconds.

by User Skynetch (6.1k points)

2 Answers

3 votes
About 120 meters, I'm pretty sure.

by User Old Markus (6.1k points)
3 votes
Answer: 120 m

Step-by-step explanation:

1) The distance fallen being directly proportional to the square of the time means:

distance = k * t^2 => k = distance / t^2

2) Apply it to the two sets of data given:

a) 39.2 m, 2 s => k = 39.2 m / (2 s)^2

b) x, 3.5 s => k = x / (3.5 s)^2

3) Set the two expressions for k equal to each other:

39.2 m / (2 s)^2 = x / (3.5 s)^2

=> x = 39.2 m * (3.5 s)^2 / (2 s)^2 = 39.2 m * 12.25 / 4 = 120.05 m ≈ 120 m
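The steps above can be checked with a short Python sketch (variable names are mine, not from the problem):

```python
# distance = k * t^2, so k = distance / t^2
t1, d1 = 2.0, 39.2        # given: the ball falls 39.2 m in 2 s
k = d1 / t1 ** 2          # proportionality constant, here 9.8

t2 = 3.5                  # time we want the distance for
d2 = k * t2 ** 2          # distance fallen after 3.5 s
print(k, d2)              # 9.8 and 120.05
```

Note that k comes out to 9.8, i.e. half of g = 9.8 m/s^2 in d = (1/2) g t^2... wait, it equals g/2 scaled into this constant; either way, the constant is fixed by the first data point and the answer rounds to 120 m.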
by User Aenaon (6.5k points)