Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 4.2 × 10^7 meters?

1 Answer


Answer:

0.14 seconds

Explanation:


Time is distance divided by speed:

t = (4.2 × 10^7 m) / (3 × 10^8 m/s) = 1.4 × 10^-1 s = 0.14 s
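The same division can be checked with a short Python snippet; the variable names are just illustrative, not part of the original problem:

# Minimal sketch: time = distance / speed
distance_m = 4.2e7       # satellite altitude in meters
speed_m_per_s = 3e8      # speed of a radio signal (speed of light) in m/s

travel_time_s = distance_m / speed_m_per_s
print(f"{travel_time_s:.2f} seconds")   # prints 0.14 seconds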

by Rbajales (6.0k points)