The amplitude of a sound wave is measured in terms of its maximum gauge pressure. By what factor does the amplitude of a sound wave increase if the sound intensity level goes up by 40.0 dB?

a) 10 times
b) 100 times
c) 1000 times
d) 10,000 times

1 Answer


Final answer:

The amplitude of the sound wave increases by a factor of 100 (option b) if the sound intensity level goes up by 40.0 dB.

Step-by-step explanation:

The amplitude of a sound wave is measured in terms of its maximum gauge pressure. The sound intensity level is a logarithmic scale used to measure the perceived loudness of a sound. Because intensity is proportional to the square of the pressure amplitude, a change in intensity level translates to an amplitude change of:

Amplitude factor = 10^(change in intensity level / 20)

In this case, if the sound intensity level goes up by 40 dB, the amplitude factor is:

Amplitude factor = 10^(40/20) = 10^2 = 100

The question asks about the relationship between sound intensity level in decibels (dB) and the amplitude of a sound wave: by what factor does the amplitude increase if the sound intensity level rises by 40 dB?

Sound intensity level in decibels is a logarithmic scale, meaning that every increase of 10 dB represents a tenfold increase in sound intensity. Amplitude, which corresponds to the maximum gauge pressure of the wave, increases only as the square root of the intensity.

Therefore, if the sound intensity level increases by 40 dB, the sound intensity itself increases by a factor of 10^4 (since 40 dB represents four 10-dB increases). Hence, the amplitude of the wave increases by a factor of √(10^4) = 10^2, which is 100 times.

Therefore, the amplitude of the sound wave increases by a factor of 100, and the correct answer is (b).
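The conversion above can be checked with a short calculation. This is a minimal sketch (the function name `amplitude_factor` is just an illustrative choice, not from any particular library):

```python
import math

def amplitude_factor(delta_db):
    """Factor by which pressure amplitude grows for a given dB increase.

    Intensity scales as 10^(delta_db / 10); amplitude (maximum gauge
    pressure) scales as the square root of intensity, i.e. 10^(delta_db / 20).
    """
    return 10 ** (delta_db / 20)

# A 40 dB rise: intensity grows 10^4-fold, amplitude only 10^2-fold.
print(amplitude_factor(40.0))  # -> 100.0
print(amplitude_factor(10.0))  # each 10 dB step multiplies amplitude by ~3.16
```

Note that each 10 dB step multiplies the amplitude by √10 ≈ 3.16, not by 10; only the intensity grows tenfold per 10 dB.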

Answered by User Dreamweaver (8.0k points)