A sports marketing company is interested in how many hours teenagers in a town spend watching sports. They randomly select 40 teenagers in this town and ask how many hours per week they spend watching sports. The mean amount is 6.25 hours with a standard deviation of 4.33 hours. Which of the following is the 90% confidence interval for the true mean amount of time teenagers from this town watch sports?

Find the t-table here.

(4.396, 8.104)
(4.865, 7.635)
(5.097, 7.404)
(5.245, 7.256)


1 Answer


Answer:

(5.097, 7.404)

Explanation:

We only know the sample standard deviation (not the population standard deviation), which means that the t-distribution is used to solve this question.

The first step in solving this problem is finding how many degrees of freedom we have. This is the sample size minus 1, so

df = 40 - 1 = 39

90% confidence interval

Now we have to find the critical value T by looking at the t-table with 39 degrees of freedom and a cumulative probability of 1 - (1 - 0.9)/2 = 0.95. This gives T = 1.6849.
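
As a quick check, the same critical value can be obtained numerically. A minimal sketch, assuming Python with SciPy is available:

from scipy import stats

# Critical value for a two-sided 90% CI: cumulative probability 0.95, 39 degrees of freedom
t_crit = stats.t.ppf(0.95, df=39)
print(round(t_crit, 4))  # about 1.6849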

The margin of error is:

M = T * s / √n = 1.6849 * 4.33 / √40 ≈ 1.1535

in which s is the standard deviation of the sample and n is the size of the sample.

The lower end of the interval is the sample mean minus M: 6.25 - 1.1535 = 5.0965 ≈ 5.097 hours.

The upper end of the interval is the sample mean plus M: 6.25 + 1.1535 = 7.4035 ≈ 7.404 hours.

The answer is (5.097, 7.404)
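
For completeness, here is a minimal sketch that reproduces the whole interval, again assuming Python with SciPy is available:

from math import sqrt
from scipy import stats

n = 40        # sample size
xbar = 6.25   # sample mean (hours)
s = 4.33      # sample standard deviation (hours)

sem = s / sqrt(n)  # estimated standard error of the mean
low, high = stats.t.interval(0.90, n - 1, loc=xbar, scale=sem)
print(round(low, 3), round(high, 3))  # roughly 5.097 and 7.404, matching the chosen option up to rounding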
