Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 9.6 × 10^6 meters? (Hint: time is distance divided by speed.)

User Bosc

1 Answer

3.2 × 10^-2 seconds (0.032 seconds)
This is a simple matter of division. I also suspect it's an exercise in scientific notation, so here is how you divide in scientific notation:

9.6 × 10^6 m / 3 × 10^8 m/s

First, divide the significands like you would normally.
9.6 / 3 = 3.2

Then subtract the exponents (the denominator's exponent from the numerator's). So
6 - 8 = -2

So the answer is 3.2 × 10^-2 seconds.
And since the significand is less than 10 and at least 1, we don't need to normalize it.
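If you want to double-check the arithmetic, here is a short Python snippet (my own sketch, not part of the original answer; the variable names are just illustrative):

```python
# Height of the satellite above Earth's surface, in meters
distance_m = 9.6e6
# Speed of a radio signal (the speed of light), in meters per second
speed_m_per_s = 3e8

# Per the hint in the question: time = distance / speed
time_s = distance_m / speed_m_per_s

print(time_s)  # 0.032, i.e. 3.2 × 10^-2 seconds
```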

So it takes 3.2 × 10^-2 seconds (0.032 seconds) for the radio signal to travel from the satellite to the surface of Earth.
User Natig Babayev