140k views
0 votes
A child has a hearing loss of 55 dB near 6000 Hz, due to noise exposure, and normal hearing elsewhere. How much more intense is a 6000 Hz tone than a 400 Hz tone if they are both barely audible to the child?

2 Answers

3 votes

Answer:

The ear's loss of sensitivity grows as one moves toward lower frequencies, and at very high frequencies sensitivity is reduced again.

Step-by-step explanation:

Noise levels are expressed in decibels (dB), a logarithmic measure of sound pressure (or intensity) relative to a reference level. The decibel notation is implied any time a "sound level" or "sound pressure level" is mentioned.

The decibel scale is logarithmic: a small change in the number of decibels corresponds to a large change in sound intensity, and therefore in the potential damage to a person's hearing. An increase of 10 dB means ten times the intensity.
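
Because the scale is logarithmic, a level difference of Δβ decibels corresponds to an intensity ratio of 10^(Δβ/10). As a minimal sketch (not a full solution), assuming the child's thresholds at 400 Hz and 6000 Hz differ by just the 55 dB loss quoted in the question:

```python
# Convert a decibel level difference to an intensity ratio: delta_beta = 10 * log10(I2 / I1)
delta_db = 55                           # assumed threshold difference in dB, taken from the question
intensity_ratio = 10 ** (delta_db / 10)
print(f"intensity ratio ≈ {intensity_ratio:.2e}")   # about 3.16e+05
```

Under that assumption, the barely audible 6000 Hz tone is roughly 10^5.5, or about 3 × 10^5, times more intense than the barely audible 400 Hz tone.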

Frequency, f, is a measure of the number of vibrations (i.e., sound pressure cycles) that occur per second. It is measured in hertz (Hz), where one Hz is equal to one cycle per second.

Sound frequency is perceived as pitch (i.e., how high or low a tone is). The frequency range sensed by the ear varies considerably among individuals. A young person with normal hearing can hear frequencies between approximately 20 Hz and 20,000 Hz. As a person gets older, the highest frequency that he or she can detect tends to decrease.

Human speech frequencies are in the range of 500 Hz to 4,000 Hz. This is significant because hearing loss in this range will interfere with conversational speech. The portions of the ear that detect frequencies between 3,000 Hz and 4,000 Hz are the earliest to be affected by exposure to noise.

User John Paulett (3.5k points)
0 votes

Answer:

For this question, what you need to do is a rule of three (a simple proportion) to check how much more intense the 6000 Hz tone is.

By doing that you'll get the following setup:

400 - 100%

6000 - x%

After that, you cross-multiply the numbers, and it'll look like this:

400x = 6000 × 100

And then you'll get the final result, which is: x = 1500%
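
If it helps to see that arithmetic spelled out, here is a minimal Python sketch of the proportion described above; note that it compares the two frequencies themselves (400 and 6000), not measured sound intensities.

```python
# Rule of three (proportion) exactly as set up above:
#   400  corresponds to 100%
#   6000 corresponds to x%
# Note: this ratio is between the two frequencies, not between sound intensities.
f_ref = 400    # 400 Hz value from the answer above
f_tone = 6000  # 6000 Hz value from the answer above

x = f_tone * 100 / f_ref   # cross-multiply 400x = 6000 * 100 and solve for x
print(f"x = {x:.0f}%")     # prints: x = 1500%
```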

User Felix Eve (3.2k points)