Q 17.4: Two speakers both emit sound of frequency 320 Hz, and are in phase. A receiver sits 2.3 m from one speaker and 2.9 m from the other. What is the phase difference between the two sounds detected by the receiver?

asked by Kabstergo

1 Answer


Answer:

Δθ ≈ 0.56 λ ≈ 200°

Step-by-step explanation:

  • In order to find the phase difference between the two sounds detected by the receiver, we first need to know the wavelength of the sound.
  • Assuming the sound wave is a plane wave, there is a fixed relationship between the speed of sound, the frequency, and the wavelength, as follows:


v = λ · f

  • Assuming v = 343 m/s and f = 320 Hz, we can find λ as follows:


λ = v / f = (343 m/s) / (320 Hz) ≈ 1.07 m

  • In order to find the phase difference, we need the path difference between the two sounds, in units of wavelength:
  • d = 2.9 m − 2.3 m = 0.6 m
  • So, we can find the fraction of a wavelength represented by the distance d, as follows:


Δλ = d / λ = (0.6 m) / (1.07 m) ≈ 0.56

  • Since a path difference of one full wavelength means the two sounds arrive in phase with each other, a difference of 0.56 λ corresponds, in degrees, to:


Δθ = 0.56 × 360° ≈ 200°
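The three steps above can be checked numerically. Here is a minimal sketch in plain Python, using the same assumed speed of sound (v = 343 m/s) as the answer:

```python
# Phase difference between two in-phase speakers at different distances.
# Assumes v = 343 m/s, as in the answer above.
v = 343.0   # speed of sound (m/s)
f = 320.0   # frequency (Hz)

wavelength = v / f            # λ = v/f ≈ 1.07 m
d = 2.9 - 2.3                 # path difference (m)
fraction = d / wavelength     # fraction of a wavelength ≈ 0.56
phase_deg = fraction * 360.0  # phase difference in degrees

print(round(wavelength, 2), round(fraction, 2), round(phase_deg, 1))
# → 1.07 0.56 201.5
```

The exact value is about 201.5°, which rounds to the ≈ 200° quoted in the answer.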

answered by Bmu