Radio signals travel at a rate of 3 x 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 x 10^7 meters? Write the answer in scientific notation.

1 Answer

Answer:

time = 1.2 x 10^(-1) seconds

Explanation:

Given:

The speed of the radio signal is 3 x 10^(8) meters per second.

The distance from the satellite to the Earth's surface is 3.6 x 10^(7) meters.

The formula for speed is:

speed = distance / time

Solving for time:

time = distance / speed

Substitute the distance and speed values into the equation:

time = (3.6 x 10^(7)) / (3 x 10^(8))

time = (1.2 x 10^(7)) / 10^(8)

time = 1.2 x 10^(7) x 10^(-8)

time = 1.2 x 10^(7-8)

time = 1.2 x 10^(-1) seconds

Therefore, the radio signal will take 1.2 x 10^(-1) seconds, or 0.12 seconds.
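As a quick sanity check, the same division can be done in a short Python snippet (the variable names are just illustrative):

```python
# Given values from the problem, in SI units.
speed = 3e8       # speed of the radio signal, meters per second
distance = 3.6e7  # satellite altitude above Earth's surface, meters

# time = distance / speed
time = distance / speed

print(time)  # prints 0.12, i.e. 1.2 x 10^(-1) seconds
```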

User KunLun