5 votes
Radio signals travel at a rate of 3 × 10⁸ meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10⁷ meters?

asked by Hebe (6.0k points)

2 Answers

4 votes

Answer:

The correct answer is 0.12 seconds.

Explanation:

To solve this problem, you can use the rule of three:

3 × 10⁸ meters → 1 s

3.6 × 10⁷ meters → x s

Now, multiply 3.6 × 10⁷ meters by 1 s, then divide by 3 × 10⁸ meters:


x = \frac{3.6\cdot 10^{7}\,\text{m}\cdot 1\,\text{s}}{3\cdot 10^{8}\,\text{m}}

The meters cancel, leaving seconds:

x = \frac{36{,}000{,}000}{300{,}000{,}000}\,\text{s}

Cancel the common zeros:

x = \frac{36}{300}\,\text{s} = 0.12\,\text{s}
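
As a quick sanity check, here is a minimal Python sketch of the same division (the variable names are illustrative, not part of the original problem):

```python
# Time for a radio signal to cover a distance: time = distance / speed.

signal_speed = 3e8     # speed of the radio signal in m/s
orbit_height = 3.6e7   # height of the satellite above the surface in m

travel_time = orbit_height / signal_speed  # seconds
print(f"{travel_time} s")  # prints: 0.12 s
```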

answered by Niko Gamulin (6.2k points)
4 votes

\frac{3.6\cdot 10^{7}\,\text{m}}{3\cdot 10^{8}\,\text{m/s}} = 0.12\,\text{s}

It takes 0.12 seconds for the radio signal to reach the surface from the satellite.
answered by Tika (5.7k points)