4 votes
Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters?

1.) 8.3 seconds
2.) 10.8 x 10^15 seconds
3.) 1.2 x 10^-1 seconds
4.) 1.08 x 10^16 seconds

thank you!! c:

2 Answers

3 votes
time = distance/speed


T = \frac{3.6\times10^{7}}{3\times10^{8}} = \boxed{1.2\times10^{-1}\ \text{s}}
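
As a quick check, the same division can be run in plain Python (a minimal sketch; the variable names are just for illustration):

```python
distance = 3.6e7   # height of the satellite's orbit, in meters
speed = 3e8        # speed of the radio signal, in meters per second

time = distance / speed  # time = distance / speed
print(time)              # 0.12, i.e. 1.2 x 10^-1 seconds
```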
by GrantTheAnt (7.7k points)
5 votes

Answer:

Option 3

The time taken for a radio signal to travel from a satellite to the surface of the Earth is 1.2 × 10^(-1) seconds.

Explanation:

Given: Radio signals travel at a rate of 3 × 10^8 meters per second. The satellite is orbiting at a height of 3.6 × 10^7 meters.

To find: How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth?

Solution :

The speed of the radio signal is 3 × 10^8 meters per second.

The distance the signal travels is the height at which the satellite is orbiting, 3.6 × 10^7 meters.


\text{Time}=\frac{\text{Distance}}{\text{Speed}}

Substitute the values into the formula,


\text{Time}=\frac{3.6\times10^{7}}{3\times10^{8}}


\text{Time}=1.2\times10^{-1}\ \text{seconds}

Therefore, Option 3 is correct.

The time taken for a radio signal to travel from a satellite to the surface of the Earth is 1.2 × 10^(-1) seconds.
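
If you want to confirm which option this matches, a short Python snippet (my own illustration, not part of the original answer) prints the quotient in scientific notation so it can be compared against the choices directly:

```python
distance = 3.6e7   # meters (height given in the question)
speed = 3.0e8      # meters per second (speed of the radio signal)

time = distance / speed
print(f"{time:.1e} seconds")   # prints "1.2e-01 seconds", matching option 3
```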

by Cronburg (8.7k points)
