Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters?

10.8 × 10^15 seconds

1.2 × 10^–1 seconds

1.08 × 10^16 seconds

8.3 seconds

by User Skalta (4.9k points)

2 Answers

4 votes

Answer:

1.2 × 10^(-1) of a second

Explanation:

d = vt, so t = d / v

t = (3.6 × 10^7) ÷ (3 × 10^8) = (3.6 ÷ 3) × (10^7 ÷ 10^8) = 1.2 × 10^(7 − 8) = 0.12 of a second

t = 1.2 × 10^(-1) of a second
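
As a quick sanity check of this division, here is a minimal sketch in Python (illustrative only, not part of the original answer):

```python
# Time for the radio signal to cover the satellite's altitude: t = d / v
speed = 3e8       # signal speed in meters per second (3 × 10^8)
distance = 3.6e7  # satellite altitude in meters (3.6 × 10^7)

t = distance / speed
print(t)  # 0.12, i.e. 1.2 × 10^-1 seconds
```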

by User Nivir (4.4k points)
4 votes

Answer:

b, i.e. 1.2 × 10⁻¹ seconds (the second option)

Explanation:

10^8 = one hundred million, and × 3 that is three hundred million (the speed in m/s).

10^7 = ten million, and × 3.6 that is 36 million (the distance in meters).

Dividing, 36 million ÷ 300 million ≈ 0.12, so the time has to be about 0.1 seconds, which is b.

by User Parktomatomi (3.6k points)