Answer:
The 99% confidence interval for the true average speed of a thunderstorm in his area is between 13.56 miles per hour and 16.44 miles per hour.
Explanation:
Since we only have the standard deviation of the sample (not of the population), the t-distribution is used to solve this question.
The first step to solve this problem is finding the number of degrees of freedom, which is the sample size minus 1. So
df = 13 - 1 = 12
99% confidence interval
Now, we have to find the critical value T, which is found by looking at the t-table with 12 degrees of freedom (y-axis) and a confidence level of 1 - α/2 = 1 - 0.01/2 = 0.995 (x-axis). So we have T = 3.0545.
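Instead of reading the table, the same critical value can be computed directly; a minimal sketch using SciPy's t-distribution (assuming SciPy is available):

```python
from scipy.stats import t

confidence = 0.99
n = 13
df = n - 1  # 12 degrees of freedom

# Two-tailed critical value: the (1 - alpha/2) quantile of the t-distribution
alpha = 1 - confidence
t_crit = t.ppf(1 - alpha / 2, df)
print(round(t_crit, 4))  # matches the table value 3.0545
```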
The margin of error is

M = T × s/√n

in which s is the standard deviation of the sample and n is the size of the sample. Plugging in the sample values (s = 1.7, n = 13, as implied by the stated margin):

M = 3.0545 × 1.7/√13 ≈ 1.44
The lower end of the interval is the sample mean minus M: 15 - 1.44 = 13.56 miles per hour.
The upper end of the interval is the sample mean plus M: 15 + 1.44 = 16.44 miles per hour.
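The whole calculation can be reproduced in a few lines; a sketch using only the standard library, with the table value T = 3.0545 and the sample values from the problem (s = 1.7 is the value implied by the stated margin of error):

```python
import math

mean = 15.0      # sample mean (mph)
s = 1.7          # sample standard deviation (mph)
n = 13           # sample size
t_crit = 3.0545  # t-table value for df = 12, 99% confidence

# Margin of error: M = T * s / sqrt(n)
margin = t_crit * s / math.sqrt(n)

lower = mean - margin
upper = mean + margin
print(round(lower, 2), round(upper, 2))  # 13.56 16.44
```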
The 99% confidence interval for the true average speed of a thunderstorm in his area is between 13.56 miles per hour and 16.44 miles per hour.