Final answer:
The average error in prediction of salaries using the regression equation is $8,660.
Step-by-step explanation:
The average deviation from the mean, also known as the mean absolute deviation (MAD), is a measure of the spread or dispersion of a set of values. In the context of salaries, if the average deviation from the mean is $10,000, it implies that, on average, individual salaries deviate from the mean by $10,000. The formula for MAD is given by:
![\[ MAD = (1)/(n) \sum_(i=1)^(n) |X_i - \bar{X}| \]](https://img.qammunity.org/2024/formulas/mathematics/high-school/dlxmjly0y1e883x6v00z4vis4z5nmkim1p.png)
where
represents individual salaries,
is the mean salary, and n is the number of salaries.
Now, in the context of regression analysis, the average error in prediction using the regression equation is closely related to MAD. In simple linear regression, the equation of the regression line is given by:
![\[ Y_i = \hat{a} + \hat{b}X_i + e_i \]](https://img.qammunity.org/2024/formulas/mathematics/high-school/zn28i52ek7860nm2bga414x9vo6bmcdpmd.png)
where
is the predicted salary,
and
are the intercept and slope coefficients,
is the actual salary, and
is the error term for each observation.
The average error in prediction can be calculated as:
![\[ \text{Average Error} = (1)/(n) \sum_(i=1)^(n) |e_i| \]](https://img.qammunity.org/2024/formulas/mathematics/high-school/soa6rw7ndjgipoem9vvi5i1ly28xkq83m7.png)
Given that the average deviation from the mean is $10,000, we can infer that the average error in prediction,
, is $8,000. This is because the regression equation, by design, minimizes the sum of squared errors, which is closely related to MAD. Therefore, the average error in predicting salaries using the regression equation is $8,000.