If the average deviation from the mean of the salaries was $10,000, what is the average error in prediction of the salaries using the regression equation?

Asked by Iqbal Khan

1 Answer


Final answer:

The average error in prediction of salaries using the regression equation is $8,660.

Step-by-step explanation:

The average deviation from the mean, also known as the mean absolute deviation (MAD), is a measure of the spread or dispersion of a set of values. In the context of salaries, if the average deviation from the mean is $10,000, it implies that, on average, individual salaries deviate from the mean by $10,000. The formula for MAD is given by:


\[ \text{MAD} = \frac{1}{n} \sum_{i=1}^{n} |X_i - \bar{X}| \]

where \(X_i\) represents the individual salaries, \(\bar{X}\) is the mean salary, and \(n\) is the number of salaries.
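To make the formula concrete, here is a minimal Python sketch that computes MAD for a small list of salaries; the numbers are made up for illustration and are not from the question:

```python
# Minimal sketch: mean absolute deviation (MAD) of some
# hypothetical salaries (illustrative values only).
salaries = [40_000, 50_000, 60_000, 70_000, 80_000]

mean_salary = sum(salaries) / len(salaries)                        # X-bar
mad = sum(abs(x - mean_salary) for x in salaries) / len(salaries)  # MAD

print(f"Mean salary: ${mean_salary:,.0f}")  # $60,000
print(f"MAD:         ${mad:,.0f}")          # $12,000
```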

Now, in the context of regression analysis, the average error in prediction plays the same role for the regression line that MAD plays for the mean: it measures the typical size of the prediction errors. In simple linear regression, each observation is modeled as:


\[ Y_i = \hat{a} + \hat{b}X_i + e_i \]

where \(Y_i\) is the observed salary, \(\hat{a}\) and \(\hat{b}\) are the estimated intercept and slope coefficients, \(X_i\) is the predictor variable (for example, years of experience), and \(e_i = Y_i - (\hat{a} + \hat{b}X_i)\) is the residual, or prediction error, for each observation. The predicted salary itself is \(\hat{Y}_i = \hat{a} + \hat{b}X_i\).
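As a sketch of how the residuals \(e_i\) arise in practice, the snippet below fits a least-squares line to made-up (experience, salary) pairs with NumPy and computes the average absolute residual; all data here are hypothetical:

```python
import numpy as np

# Hypothetical data: years of experience (predictor) and salary.
x = np.array([1, 3, 5, 7, 9])
y = np.array([45_000, 52_000, 61_000, 68_000, 79_000])

b_hat, a_hat = np.polyfit(x, y, deg=1)  # least-squares slope and intercept
y_hat = a_hat + b_hat * x               # predicted salaries
residuals = y - y_hat                   # e_i = observed - predicted

print(f"Average |e_i|: ${np.mean(np.abs(residuals)):,.0f}")
```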

The average error in prediction can be calculated as:


\[ \text{Average Error} = \frac{1}{n} \sum_{i=1}^{n} |e_i| \]

Given that the average deviation from the mean is $10,000, we can find the average error in prediction. In regression, the typical size of the prediction errors is smaller than the typical deviation from the mean by the factor \(\sqrt{1 - r^2}\), where \(r\) is the correlation between the predictor and the salaries; this is the same relationship that gives the standard error of estimate, \(SD_e = SD_Y \sqrt{1 - r^2}\). Assuming the correlation is \(r = .50\), the reduction factor is \(\sqrt{1 - .25} = \sqrt{.75} \approx .866\), so

\[ \text{Average Error} \approx .866 \times \$10{,}000 = \$8{,}660 \]

Therefore, the average error in predicting salaries using the regression equation is $8,660.
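A one-line numeric check of that final step (with the assumed \(r = .50\)):

```python
import math

# Check: average error = average deviation * sqrt(1 - r^2), assuming r = 0.50.
average_deviation = 10_000
r = 0.50
print(f"${average_deviation * math.sqrt(1 - r**2):,.0f}")  # $8,660
```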

Answered by Pradeep Nooney