Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 7.5 × 10^6 meters?

1 Answer


This question is an application of the formula d = r * t (distance = rate × time).

r = rate = 3 * 10^8 m/s

t = time (unknown)

d = distance = 7.5 * 10^6 meters


Substitute and solve:

d = r * t

7.5 * 10^6 = 3 * 10^8 * t

Divide both sides by 3 * 10^8:


7.5 * 10^6 / (3 * 10^8) = t

2.5 * 10^(6 - 8) = t

2.5 * 10^-2 seconds = t, or

0.025 seconds = t
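The working above can be checked with a short script that solves d = r * t for t (variable names here are illustrative, matching the answer's notation):

```python
# Solve d = r * t for t: the time for a radio signal to reach Earth.
d = 7.5e6   # distance from satellite to Earth's surface, in meters
r = 3e8     # speed of radio signals (speed of light), in m/s

t = d / r   # rearranged: t = d / r
print(t)    # 0.025 seconds
```

Dividing the powers of ten directly (10^6 / 10^8 = 10^-2) gives the same result without a calculator.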

Answered by Hadas Zeilberger