12 votes
1. Electromagnetic waves travel at a rate of 3 x 10^8 meters per second. How many seconds would it take for an electromagnetic wave to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 2.9 x 10^9 meters? (Hint: Time is distance divided by rate.)

by User Oldman (3.3k points)

1 Answer

9 votes

Answer:

approximately 9.67 seconds

Explanation:

The problem tells us that time = distance / rate (rate here means speed).

The rate (speed) is 3 x 10^8 meters per second, and the distance is the satellite's height above the surface, 2.9 x 10^9 meters.

Now we divide: (2.9 x 10^9) / (3 x 10^8) ≈ 9.67 seconds.
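As a quick sanity check on the arithmetic, here is a minimal Python sketch (the variable names are just for illustration) that applies time = distance / rate with the values from the question:

```python
# Check time = distance / rate for the satellite problem.
speed = 3e8        # speed of an electromagnetic wave, in meters per second
distance = 2.9e9   # height of the satellite above the Earth's surface, in meters

time_seconds = distance / speed
print(f"Travel time: {time_seconds:.2f} seconds")  # prints about 9.67 seconds
```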

by User Electricsheep (3.7k points)