112k views
5 votes
URGENT PLEASE HELP. A radio signal travels at 3.00 * 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 * 10^7 meters? Show all the steps that you used to solve this problem.

by Hgolov (4.6k points)

2 Answers

6 votes
t = d / v

t = (3.54 x 10^7 m) / (3.00 x 10^8 m/s)

t = 0.118 s

It would take 0.118 seconds for the radio signal to reach Earth.
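
As a quick double-check, here is a minimal Python sketch of the same division; the variable names are my own for illustration, not from the post.

```python
# Time for the radio signal to travel from the satellite to Earth's surface.
distance = 3.54e7   # satellite height in meters
speed = 3.00e8      # signal speed in meters per second

time = distance / speed     # t = d / v
print(f"t = {time:.3f} s")  # t = 0.118 s
```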
by Stefan Luv (5.4k points)
5 votes
Distance = rate x time, so rearranging to solve for time gives time = distance / rate. Now we can just plug in the numbers.


time (seconds) = (3.54*10^7) / (3*10^8)
To solve this we can just divide the parts of the scientific notation separately.

3.54/3=1.18
10^7/10^8=1/10

Now we just multiply those two results back together, since we split the division into those two parts.

1.18 * (1/10) = 0.118

So the answer is 0.118 seconds.
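
For completeness, here is a small Python sketch of this split-and-recombine approach (dividing the coefficients and the powers of ten separately); the variable names are my own for illustration.

```python
# Divide the coefficients and the powers of ten separately, then recombine.
coefficients = 3.54 / 3        # 1.18
powers_of_ten = 10**7 / 10**8  # 1/10 = 0.1

time = coefficients * powers_of_ten
print(f"{time:.3f} seconds")   # 0.118 seconds
```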
by Manjit Kumar (4.7k points)