A tsunami generated off the coast of Chile in 1990 traveled nearly 6500 miles to the coast of Honolulu in 17 hours. Determine the speed in mi/hr and m/s. If the average width of such waves was 22 m, what was the average frequency of such a devastating tsunami? (Given: 1.0 m/s = 2.24 mi/hr)


1 Answer


Answer:

Step-by-step explanation:

To find the speed of the tsunami:

Speed = Distance / Time

Converting the distance from miles to meters and the time from hours to seconds:

Distance = 6500 miles × 1609.34 m/mile = 10,460,710 m

Time = 17 hours × 3600 s/hour = 61,200 s

Speed = 10,460,710 m / 61,200 s ≈ 171 m/s

To convert the speed to miles per hour:

Speed = 171 m/s × (2.24 mi/hr)/(1 m/s) ≈ 383 mi/hr

So the speed of the tsunami is approximately 171 m/s or 383 mi/hr.
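As a quick check, the same arithmetic can be run as a short Python sketch (the conversion factors are the standard 1609.34 m per mile and 3600 s per hour; the variable names are just illustrative):

distance_mi = 6500                    # distance from Chile to Honolulu, miles
time_hr = 17                          # travel time, hours
distance_m = distance_mi * 1609.34    # miles -> meters
time_s = time_hr * 3600               # hours -> seconds
speed_ms = distance_m / time_s        # speed in m/s
speed_mph = speed_ms * 2.24           # m/s -> mi/hr, using the given factor
print(round(speed_ms), "m/s")         # prints 171 m/s
print(round(speed_mph), "mi/hr")      # prints 383 mi/hr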

To find the average frequency:

Wave speed = frequency × wavelength

Taking the 22 m width of the wave to be half of one full wavelength, the wavelength is twice the width:

Wavelength = 2 × 22 m = 44 m

Frequency = wave speed / wavelength

Frequency = 171 m/s / 44 m ≈ 3.9 Hz

So the average frequency of the tsunami is approximately 3.9 Hz.
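The frequency step can be checked the same way (again a rough sketch, assuming the 22 m width is half a wavelength as above):

speed_ms = 171                         # wave speed from the previous step, m/s
width_m = 22                           # stated average width of the wave, m
wavelength_m = 2 * width_m             # assumed wavelength = 2 x width = 44 m
frequency_hz = speed_ms / wavelength_m
print(round(frequency_hz, 1), "Hz")    # prints 3.9 Hz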
