Radio signals travel at a speed of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 9.6 × 10^6 meters? (Hint: time = distance divided by speed.)

A) 3.2 × 10^2 seconds
B) 3.2 × 10^-2 seconds
C) 3.13 × 10^1 seconds
D) 2.88 × 10^15 seconds

by AlexeyMK (8.7k points)

1 Answer

distance = velocity × time, so time = distance / velocity

distance = 9.6 × 10^6 meters (the satellite's height)
velocity = 3 × 10^8 meters/second

time = distance / velocity = (9.6 × 10^6 m) / (3 × 10^8 m/s) = 3.2 × 10^-2 seconds

The units check out: m / (m/s) = s. The answer is B.
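
If you want to verify the arithmetic yourself, here is a minimal Python sketch (the variable names are mine, not from the problem):

```python
# Quick check of time = distance / speed using the problem's values.
speed = 3e8        # speed of a radio signal, in meters per second
distance = 9.6e6   # satellite altitude, in meters

time = distance / speed       # result in seconds
print(f"{time:.1e} seconds")  # -> 3.2e-02 seconds, i.e. choice B
```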
by Vinith Almeida (8.0k points)