A nationwide census is conducted and it is found that the mean number of hours of television watched per year by Americans is 350 with a standard deviation of 220. Furthermore, the data appears to be normally distributed! Using this information, determine the probability that a group of 4 Americans watch MORE THAN 400 hours of television per year. SHOW ALL WORK.

2 Answers

3 votes

Final answer:

To find the probability that a group of 4 Americans watch more than 400 hours of television per year, calculate the standard deviation of the sample mean, find the z-score for 400 hours, and use a standard normal distribution table to find the probability.

Step-by-step explanation:

To determine the probability that a group of 4 Americans watch more than 400 hours of television per year, we can use the information given about the mean and standard deviation of the number of hours watched per year by Americans.

First, we need to calculate the standard deviation of the sample mean, which is the standard deviation of the population divided by the square root of the sample size. In this case, the standard deviation of the population is 220, and the sample size is 4, so the standard deviation of the sample mean is 220 / sqrt(4) = 110.

Next, we need to convert the given value of 400 hours to a z-score using the formula z = (x - mean) / standard deviation of the sample mean. Plugging in the values, we get z = (400 - 350) / 110 ≈ 0.45.

Finally, we can use a standard normal distribution table or a calculator to find the probability that a z-score is greater than 0.45. The table gives P(Z ≤ 0.45) ≈ 0.6736, so P(Z > 0.45) = 1 - 0.6736 ≈ 0.3264.

So, the probability that a group of 4 Americans watch more than 400 hours of television per year is approximately 0.3264.
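
As a quick numerical check of this figure, here is a minimal Python sketch (assuming SciPy is available; the variable names are purely illustrative):

from math import sqrt
from scipy.stats import norm

mu = 350       # population mean (hours of TV per year)
sigma = 220    # population standard deviation
n = 4          # sample size
x_bar = 400    # threshold for the sample mean

se = sigma / sqrt(n)     # standard error of the sample mean = 110
z = (x_bar - mu) / se    # z-score ≈ 0.4545
p = norm.sf(z)           # upper-tail probability P(Z > z)

print(se)  # 110.0
print(z)   # ≈ 0.4545
print(p)   # ≈ 0.3247; the table-based 0.3264 comes from rounding z to 0.45 first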

7 votes

Answer:

The probability that a group of 4 Americans watch more than 400 hours of television per year is 0.3264.

Step-by-step explanation:

We are given that a nationwide census is conducted and it is found that the mean number of hours of television watched per year by Americans is 350 with a standard deviation of 220.

A group of 4 Americans is selected.

Let $\bar{X}$ = sample mean number of hours of television watched per year.

The z-score probability distribution for the sample mean is given by:

$$Z = \frac{\bar{X}-\mu}{\sigma/\sqrt{n}} \sim N(0,1)$$

where
$\mu$ = population mean = 350
$\sigma$ = population standard deviation = 220
$n$ = sample size = 4 Americans

Now, the probability that a group of 4 Americans watch more than 400 hours of television per year is given by $P(\bar{X} > 400 \text{ hours})$:

$$P(\bar{X} > 400) = P\!\left(\frac{\bar{X}-\mu}{\sigma/\sqrt{n}} > \frac{400-350}{220/\sqrt{4}}\right) = P(Z > 0.45) = 1 - P(Z \leq 0.45) = 1 - 0.6736 = 0.3264$$

The above probability is found by looking up z = 0.45 in the standard normal (z) table, which gives a cumulative area of 0.6736.

Hence, the probability that a group of 4 Americans watch more than 400 hours of television per year is 0.3264.
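
To mirror the z-table lookup numerically, here is a small Python sketch using only the standard library (math.erf gives the standard normal CDF; the helper name phi is just illustrative). Rounding z to 0.45 first reproduces the table values exactly:

from math import erf, sqrt

def phi(z):
    # standard normal CDF expressed through the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

z = round((400 - 350) / (220 / sqrt(4)), 2)  # 0.45, as read from the z-table
print(phi(z))      # ≈ 0.6736, the table area to the left of z = 0.45
print(1 - phi(z))  # ≈ 0.3264, the probability the group averages more than 400 hours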
