5 votes
A radio signal travels at 3.00 × 10⁸ meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 × 10⁷ meters? Show your work.

Would the answer be:
rate = 3.00 × 10⁸ m/s
distance = 3.54 × 10⁷ m
time = distance ÷ rate = 3.54 ÷ 30 ???
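
Here is a quick Python sketch of what I tried, just plugging the numbers from the question into time = distance ÷ rate (the variable names are my own):

```python
# Sketch of my attempt: time = distance / rate
rate = 3.00e8      # signal speed in meters per second
distance = 3.54e7  # satellite height in meters (from the question)

time = distance / rate
print(time)  # prints 0.118
```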

asked by Dhananjay (8.2k points)

1 Answer

3 votes
The answer is 0.118 seconds.

The velocity (v) is the distance (d) divided by time (t):
v = d ÷ t

Given:
v = 3.00 × 10⁸ meters per second
d = 3.54 × 10⁷ meters

Unknown:
t = ?

If:
v = d ÷ t
Then:
t = d ÷ v
t = 3.54 × 10⁷ meters ÷ 3.00 × 10⁸ meters/second
t = 1.18 × 10⁻¹ seconds
t = 0.118 seconds

Therefore, the radio signal will travel from the satellite to the surface of Earth in 0.118 seconds.
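
For anyone who wants to double-check the arithmetic, here is a minimal Python sketch of the same calculation, using nothing beyond the values given in the question:

```python
# Solve t = d / v for the time the signal takes to reach Earth
v = 3.00e8  # speed of the radio signal in meters per second
d = 3.54e7  # orbital height of the satellite in meters

t = d / v
print(f"t = {t:.3f} seconds")  # t = 0.118 seconds
```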
answered by Fartem (8.4k points)
