An astronaut drops a rock into a crater on the moon. The distance, d(t), in meters, the rock travels after t seconds can be modeled by the function d(t)=0.8t². What is the average speed, in meters per second, of the rock between 5 and 10 seconds after it was dropped?


1 Answer


Answer:

12 m/s

Explanation:

We are asked to calculate the average speed of the rock between 5 and 10 seconds after it was dropped.

In general, the average speed over an interval is the distance traveled during that interval divided by the time elapsed.
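
For this problem, that works out to:

average speed = (d(10) - d(5)) / (10 - 5)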

Let's start at t = 5 seconds.

The distance traveled by then is 0.8(5)^2 = 0.8(25) = 20 m.

Now for t = 10 seconds.

The distance traveled by then is 0.8(10)^2 = 0.8(100) = 80 m.

The average speed is therefore (80 - 20)/(10 - 5) = 60/5 = 12 m/s.
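
As a quick sanity check, here is a small Python sketch of the same calculation. The helper function d below simply encodes the model d(t) = 0.8t² from the question; it is not part of the original answer.

```python
# Distance model given in the question: d(t) = 0.8 * t^2 (meters fallen after t seconds).
def d(t):
    return 0.8 * t ** 2

t1, t2 = 5, 10  # interval endpoints, in seconds

# Average speed = distance covered over the interval / length of the interval.
average_speed = (d(t2) - d(t1)) / (t2 - t1)  # (80 - 20) / 5
print(average_speed)  # prints 12.0
```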
