3 votes
Radio signals travel at a rate of 3×10⁸ meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6×10⁷ meters?

asked by Varunvlalan (2.5k points)

2 Answers

11 votes

Answer:

0.12 seconds

Explanation:

Just think: how long does it take you to travel, say, 30 km if you are going 60 km/h?

You divide 30 by 60 and get 0.5, or 1/2, meaning it (logically) takes you half an hour.

The same principle, time = distance / speed, applies to all questions of this kind.

We only need to keep an eye on the units involved: are we dealing with hours or seconds, meters or kilometers? Convert where necessary.

Here everything is already in seconds and meters, so we calculate:

3.6×10⁷ / (3×10⁸) = 3.6 / (3×10¹) = 1.2 / 10 = 0.12 seconds

or 1.2×10⁻¹ seconds
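The same division can be sketched in a few lines of Python; the constant and function names here are illustrative assumptions, not part of the original answer.

# Minimal sketch of the calculation above; names are illustrative.
SIGNAL_SPEED = 3e8        # meters per second, given in the question
SATELLITE_HEIGHT = 3.6e7  # meters, given in the question

def travel_time(distance_m: float, speed_m_per_s: float) -> float:
    """Return travel time in seconds: time = distance / speed."""
    return distance_m / speed_m_per_s

t = travel_time(SATELLITE_HEIGHT, SIGNAL_SPEED)
print(t, "seconds")  # prints 0.12 seconds, i.e. 1.2×10⁻¹ seconds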

answered by Phil Freihofner (3.4k points)
18 votes

Answer:

1.2×10⁻¹ seconds

Explanation:

Time = distance / speed = 3.6×10⁷ m / (3×10⁸ m/s) = 0.12 s = 1.2×10⁻¹ seconds.
answered by KDoyle (2.7k points)