By definition, IQ scores follow a normal distribution. Suppose that a specific IQ test is devised to have a mean of 100 and a standard deviation of 20. Randomly select 25 adults.

(a) Determine P(X₁ > 105)
(b) Determine P(X̄ > 105)
(c) Compute P(98 < X̄ ≤ 104)
(d) Determine P(X̄ < 100)
(e) Find P(S² > 240)
(f) Compute P(S ≤ 18)

asked by Moriarty (8.4k points)

1 Answer


Final answer:

The question asks for probabilities involving normally distributed IQ scores with a mean of 100 and a standard deviation of 20, based on a random sample of 25 adults. The central limit theorem gives the standard deviation of the sampling distribution of the sample mean, and the probabilities are then found from z-scores.

Step-by-step explanation:

This question concerns the normal distribution as applied to IQ scores. The test described has a population mean of 100 and a standard deviation of 20, and a random sample of 25 adults is taken. Probabilities about an individual score follow directly from the population distribution, while probabilities about the sample mean use the central limit theorem: the sampling distribution of the sample mean is also normal, with the same mean as the population and standard deviation σ/√n, where n is the sample size. Here that standard error is 20/√25 = 4.
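
As a concrete check for part (b), using the values given in the question (μ = 100, σ = 20, n = 25):

z = (105 − 100) / (20/√25) = 5/4 = 1.25, so P(X̄ > 105) = P(Z > 1.25) ≈ 0.1056.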

For a normally distributed variable such as an IQ score with mean μ and standard deviation σ, the probability for a single observation uses the z-score z = (X − μ)/σ, while the probability for a sample mean uses z = (x̄ − μ)/(σ/√n). A z-table or statistical software then gives the area under the normal curve over the range of interest, which is the required probability.
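
Where software is available, the whole exercise can be scripted. Below is a minimal SciPy sketch (not part of the original answer) that evaluates all six parts, assuming μ = 100, σ = 20, and n = 25 from the question; parts (e) and (f) rely on the standard fact that, for a normal sample, (n − 1)S²/σ² follows a chi-square distribution with n − 1 degrees of freedom.

```python
# A minimal sketch (not from the original answer) that evaluates all six
# parts numerically with SciPy, assuming mu = 100, sigma = 20, n = 25
# as stated in the question.
from scipy.stats import norm, chi2

mu, sigma, n = 100, 20, 25
se = sigma / n ** 0.5        # standard error of the mean: 20 / sqrt(25) = 4

# (a) single score: X1 ~ N(100, 20)
p_a = norm.sf(105, loc=mu, scale=sigma)              # P(X1 > 105)

# (b)-(d) sample mean: Xbar ~ N(100, 4) by the central limit theorem
p_b = norm.sf(105, loc=mu, scale=se)                 # P(Xbar > 105)
p_c = norm.cdf(104, mu, se) - norm.cdf(98, mu, se)   # P(98 < Xbar <= 104)
p_d = norm.cdf(100, loc=mu, scale=se)                # P(Xbar < 100)

# (e)-(f) sample variance: for a normal sample, (n-1)*S^2/sigma^2 is
# chi-square distributed with n-1 degrees of freedom
df = n - 1
p_e = chi2.sf(df * 240 / sigma**2, df)               # P(S^2 > 240)
p_f = chi2.cdf(df * 18**2 / sigma**2, df)            # P(S <= 18) = P(S^2 <= 324)

for name, p in zip("abcdef", (p_a, p_b, p_c, p_d, p_e, p_f)):
    print(f"({name}) {p:.4f}")
```

Note that part (d) comes out to exactly 0.5, since the normal distribution is symmetric about its mean.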

answered by Uaarkoti (8.8k points)