2 votes
Radio signals travel at a rate of 3×10⁸ meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6×10⁷ meters?

2 Answers

4 votes

Answer:

0.12 seconds

Explanation:

Just think: how long does it take you to travel, for example, 30 km if you are going 60 km/h?

You divide 30 by 60 and get 0.5, or 1/2, meaning it (logically) takes you 30 minutes, or half an hour, to do so.

It is the same principle for all these kinds of questions.

We only need to keep an eye on the units involved: hours or seconds? meters or kilometers? Here everything is already given in meters and seconds, so we just calculate:

3.6×10⁷ / (3×10⁸) = 3.6 / (3×10¹) = 1.2 / 10 = 0.12 seconds

or

1.2 × 10⁻¹ seconds
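The division above can be sketched in a few lines of Python, using the numbers from the problem statement (variable names are just illustrative):

```python
# time = distance / speed
distance_m = 3.6e7   # satellite altitude in meters (3.6 x 10^7)
speed_m_s = 3e8      # speed of a radio signal in m/s (3 x 10^8)

time_s = distance_m / speed_m_s
print(time_s)  # 0.12
```

The same formula covers the 30 km at 60 km/h analogy: 30 / 60 = 0.5 hours.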

User LittleO
by
5.9k points
4 votes

Answer:

1.2 × 10⁻¹ seconds

Explanation:

User Richard Corden
by
4.6k points