6 votes
A signal from a satellite in space travels to the surface of the Earth at the speed of light. The distance between the satellite and the surface of the Earth is 30,425 kilometers. How long does it take for the signal to reach the surface of the Earth, to the nearest thousandth of a second? (Note: Use 3.0 × 10^8 meters/second for the speed of light. 1 kilometer = 1,000 meters)

asked by Aek (2.7k points)

1 Answer

16 votes

Answer:

0.101 seconds

Explanation:


\text{speed} = \frac{\text{distance}}{\text{time}}

We have speed, and we have distance. So let's rearrange to get time:


\text{time} = \frac{\text{distance}}{\text{speed}}
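As a minimal Python sketch of this rearrangement (the function name travel_time is my own, not part of the original answer):

```python
def travel_time(distance_m: float, speed_m_per_s: float) -> float:
    """Rearranged form of speed = distance / time: solve for time."""
    return distance_m / speed_m_per_s
```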

Let's plug in our numbers (make sure to multiply the distance by 1,000 to convert kilometres to metres):


\text{time} = \frac{30{,}425{,}000 \text{ m}}{3.0 \times 10^{8} \text{ m/s}}

Which leaves us with:


\text{time} = \frac{1217}{2^{5} \cdot 3 \cdot 5^{3}} \text{ s} \approx 0.101416667 \text{ s}

Rounded to the nearest thousandth, that gives 0.101 seconds.
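To sanity-check the arithmetic, here is a quick self-contained Python sketch (the variable names are my own):

```python
SPEED_OF_LIGHT = 3.0e8            # metres per second, as given in the question
distance_m = 30_425 * 1_000       # 30,425 km converted to metres

t = distance_m / SPEED_OF_LIGHT   # time = distance / speed
print(t)                          # ≈ 0.101416666...
print(round(t, 3))                # 0.101 — nearest thousandth of a second
```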

answered by Jochen (3.1k points)