Radio signals emitted from and received by an airplane have a frequency of 3.00 * 10^12 Hz and travel at the speed of light in a vacuum. How long is the delay of each message going from the control tower to a jet flying at an altitude of 1.00 * 10^4 m?


1 Answer


Here the speed of the signal is the same as the speed of light in a vacuum, so we have


v = c = 3.00 * 10^8 m/s

The altitude of the airplane is given as


h = 1.00 * 10^4 m

The time taken by the signal to travel between the control tower and the jet is


t = h / v

Substituting the given values:


t = (1.00 * 10^4) / (3.00 * 10^8)


t = 3.33 * 10^(-5) s

So this is the delay of each message between the control tower and the jet.
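
As a quick check, here is a minimal Python sketch of the same calculation. The variable names and the print formatting are my own, not part of the original answer; the numbers are taken from the question.

```python
# Minimal check of the one-way signal delay, using the values from the question.

SIGNAL_SPEED = 3.00e8   # m/s, the signal travels at the speed of light in a vacuum
ALTITUDE = 1.00e4       # m, altitude of the jet above the control tower

# Delay of one message: distance divided by signal speed, t = h / v
delay = ALTITUDE / SIGNAL_SPEED

print(f"One-way delay: {delay:.2e} s")  # prints 3.33e-05 s
```

Note that the 3.00 * 10^12 Hz frequency given in the question does not enter the calculation; the delay depends only on the distance and the propagation speed.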
