The time college students spend on the Internet follows a Normal distribution. At Johnson University, the mean time is 5 hours with a standard deviation of 1.2 hours. What is the probability that the average time that 100 random students on campus will spend on the Internet is more than 5 hours?
a. 0.5
b. 1
c. 0
d. 0.2

1 Answer

Answer:

To solve this problem, note that the sampling distribution of the sample mean is Normal: the population itself is Normally distributed, and even if it were not, the Central Limit Theorem tells us that the distribution of the sample mean approaches a Normal distribution as the sample size increases.

Given that the population mean is 5 hours (μ = 5) and the population standard deviation is 1.2 hours (σ = 1.2), we can calculate the probability that the average time for a sample of 100 random students is more than 5 hours.

First, we need to calculate the standard deviation of the sample mean, also known as the standard error (SE), which is equal to the population standard deviation divided by the square root of the sample size:

SE = σ / √n

SE = 1.2 / √100

SE = 1.2 / 10

SE = 0.12

Next, we can calculate the z-score corresponding to a sample mean of 5 hours using the formula:

z = (x - μ) / SE

z = (5 - 5) / 0.12

z = 0 / 0.12

z = 0
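
As a quick check, here is a minimal Python sketch of the same arithmetic (the variable names are just illustrative, not part of the problem statement):

import math

mu = 5        # population mean (hours)
sigma = 1.2   # population standard deviation (hours)
n = 100       # sample size

se = sigma / math.sqrt(n)   # standard error of the sample mean
z = (5 - mu) / se           # z-score for a sample mean of 5 hours

print(se)  # 0.12
print(z)   # 0.0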

Since the z-score is 0, the probability of getting a sample mean greater than 5 hours is equal to the probability of getting a z-score greater than 0.

To find this probability, we can look it up in a z-table or use a calculator. Because the standard normal distribution is symmetric about 0, the probability of getting a z-score greater than 0 is 0.5 (or 50%).

Therefore, the probability that the average time that 100 random students on campus will spend on the Internet is more than 5 hours is 0.5 or 50%.
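
If you want to verify the table lookup programmatically, one way (using Python's standard library) is to compute the upper-tail probability of the standard normal distribution directly:

from statistics import NormalDist

z = 0.0
p_greater = 1 - NormalDist().cdf(z)  # P(Z > 0) for the standard normal
print(p_greater)  # 0.5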
