48.5k views
4 votes
Note: Enter your answer and show all the steps that you use to solve this problem in the space provided.
A radio signal travels at 3.00 · 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 · 10^7 meters? Show your work.


1 Answer

7 votes

Answer:

It will take 0.118 seconds for a radio signal to travel from the satellite to the surface of Earth.

Explanation:

We are given Speed = 3.00 · 10^8 meters per second.

And Distance = 3.54 · 10^7 meters.

We need to find time.

We know that,

Distance = Speed * Time

3.54 · 10^7 = 3.00 · 10^8 * Time

Time = 3.54 · 10^7 / 3.00 · 10^8

Time = 1.18 · 10^(7 - 8)

Time = 1.18 · 10^-1

Time = 0.118 seconds.

So, it will take 0.118 seconds for a radio signal to travel from the satellite to the surface of Earth.
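As a quick numerical check, here is a short Python sketch of the same calculation. It simply uses the values stated in the question; the variable names are just for illustration.

```python
# Values given in the question
speed = 3.00e8      # signal speed in meters per second (3.00 · 10^8)
distance = 3.54e7   # satellite height in meters (3.54 · 10^7)

# Rearranging Distance = Speed * Time gives Time = Distance / Speed
time = distance / speed

print(f"Time = {time:.3f} seconds")  # prints: Time = 0.118 seconds
```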

by Maciej Oczko
5.8k points