The average (mean) IQ of all students in the U.S. is called a

asked by User Anddy (7.9k points)

1 Answer

3 votes

Final answer:

The average (mean) IQ score is set at 100 with a standard deviation of 15 points. IQ scores within one standard deviation of the mean (85–115) are considered average. For hypothesis tests with small sample sizes, a Student's t-distribution should be used when the population standard deviation is unknown.

Step-by-step explanation:

The average IQ, or intelligence quotient, is a measure used to describe the mean level of cognitive abilities among a group, such as students in the U.S. In the context of IQ testing, the average score is typically set at 100, with a standard deviation of 15 points.

This means that scores falling between 85 and 115 are considered within the average range, encompassing about 68% of the population. An IQ score of 70 is two standard deviations below the mean and potentially indicative of an intellectual disability, while a score of 130 or higher might suggest giftedness or superior intelligence.
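As a quick numerical check of the 68% figure, the proportion of a normal population with mean 100 and standard deviation 15 that falls between 85 and 115 can be computed from the normal CDF (sketched here with Python's standard-library `math.erf`):

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    # Normal CDF expressed via the error function
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

mu, sigma = 100, 15  # IQ mean and standard deviation
p_average = normal_cdf(115, mu, sigma) - normal_cdf(85, mu, sigma)
print(round(p_average, 4))  # about 0.6827, i.e. roughly 68%
```

This is just the empirical rule: one standard deviation on either side of the mean covers about 68.27% of a normal distribution.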

In the example provided, a typical adult has an average IQ score of 105 with a standard deviation of 20. To calculate the probability that the sample mean for 20 adults falls between 85 and 125, one uses the sampling distribution of the mean, which is approximately normal by the central limit theorem, with standard error equal to the population standard deviation divided by the square root of the sample size.
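The sample-mean calculation from the example (mean 105, standard deviation 20, n = 20) can be sketched the same way, replacing the population standard deviation with the standard error:

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    # Normal CDF expressed via the error function
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

mu, sigma, n = 105, 20, 20
se = sigma / sqrt(n)  # standard error of the mean, about 4.47
p_between = normal_cdf(125, mu, se) - normal_cdf(85, mu, se)
print(p_between)  # very close to 1: both bounds are ~4.5 SEs from the mean
```

Because 85 and 125 each lie about 4.47 standard errors from 105, almost all sample means fall in that range.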

For hypothesis testing on IQ scores, such as determining whether the mean score on the Stanford-Binet IQ test is more than 100, the Student's t-distribution is the appropriate choice when the sample size is small and the population standard deviation is unknown; when the population standard deviation is known, the normal (z) distribution is used instead.
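A minimal sketch of such a one-sample t-test, using a made-up sample of ten IQ scores (the data are hypothetical, not from the source), computes the t-statistic by hand and would then be compared against a t-table with n − 1 degrees of freedom:

```python
from math import sqrt
from statistics import mean, stdev

sample = [110, 98, 121, 105, 94, 115, 108, 102, 117, 99]  # hypothetical scores
mu0 = 100          # null hypothesis: population mean IQ is 100
n = len(sample)

# t = (sample mean - hypothesized mean) / (sample SD / sqrt(n))
t_stat = (mean(sample) - mu0) / (stdev(sample) / sqrt(n))
df = n - 1
print(round(t_stat, 3), df)  # compare t_stat to the critical value at df = 9
```

Note that the sample standard deviation (`stdev`) stands in for the unknown population value, which is exactly why the t-distribution, not the normal, is the right reference here.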

answered by User Mehany (7.9k points)