If the distance from an antenna on Earth to a geosynchronous communications satellite is 20,600 miles, given that there are 1.61 kilometers per mile and that radio waves travel at the speed of light (3.0 × 10^8 meters/sec), how many milliseconds does it take for a signal from the antenna to reach the satellite?

by User Sander (4.3k points)

1 Answer


Answer:

The signal takes approximately 111 milliseconds (about 0.11 s) to reach the satellite.

Explanation:

First convert the distance to meters: 20,600 miles × 1.61 km/mile = 33,166 km = 3.3166 × 10^7 m. The travel time is the distance divided by the speed of light: t = d / c = (3.3166 × 10^7 m) / (3.0 × 10^8 m/s) ≈ 0.1106 s. Converting to milliseconds gives about 110.6 ms, or roughly 111 ms.

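As a quick sanity check, here is a minimal Python sketch of the same arithmetic; the variable names are illustrative and not part of the original problem:

```python
# Signal travel time from a ground antenna to a geosynchronous satellite.
MILES_TO_KM = 1.61        # conversion factor given in the problem
SPEED_OF_LIGHT = 3.0e8    # meters per second, as given

distance_miles = 20600
distance_m = distance_miles * MILES_TO_KM * 1000  # miles -> km -> meters

time_s = distance_m / SPEED_OF_LIGHT              # seconds
time_ms = time_s * 1000                           # milliseconds

print(f"{time_ms:.1f} ms")  # prints 110.6 ms, i.e. roughly 111 ms
```
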
by User Lewis Black (3.9k points)