The length (in inches) and weight (in ounces) for a type of bass were measured for a random sample of 10 bass from a lake. These measurements were then analyzed and the results are given in the computer output.

Which of the following represents the average distance between the actual weight and the predicted weight of the bass?

1.80
3.771
13.610
24.87

1 Answer


The quantity in the computer output that represents the average distance between the actual weight and the predicted weight of the bass is the standard deviation of the residuals, usually labeled S. It measures how far, on average, the observed weights fall from the weights predicted by the least-squares regression line. From the output, that average distance is 3.771 ounces.

Standard deviation of the residuals

s = √( Σ (i=1 to n) (y_i − ŷ_i)² / (n − 2) )

In this formula:

n is the number of observations in the data (here, n = 10).
Σ represents the sum over all observations (i = 1 to n).
y_i is the actual weight of the i-th bass.
ŷ_i is the weight of the i-th bass predicted by the regression line.
n − 2 is the degrees of freedom, since the regression line estimates two parameters (slope and intercept).
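The calculation above can be sketched in Python. The actual 10 measurements are not shown in the output, so the length/weight pairs below are hypothetical; the code fits a least-squares line and then computes the standard deviation of the residuals the same way regression software does.

```python
import math

# Hypothetical data: (length in inches, weight in ounces) for 10 bass.
# These values are made up for illustration; the real sample isn't given.
lengths = [10.0, 11.2, 12.1, 12.8, 13.5, 14.0, 14.6, 15.3, 16.1, 17.0]
weights = [17.0, 20.5, 22.0, 24.5, 26.0, 27.5, 29.0, 31.5, 33.0, 36.0]

n = len(lengths)
mean_x = sum(lengths) / n
mean_y = sum(weights) / n

# Least-squares slope and intercept
sxx = sum((x - mean_x) ** 2 for x in lengths)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(lengths, weights))
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# Residuals: actual weight minus predicted weight
residuals = [y - (intercept + slope * x) for x, y in zip(lengths, weights)]

# Standard deviation of the residuals (the value labeled "S" in output),
# dividing by n - 2 because the line uses two estimated parameters
s = math.sqrt(sum(r ** 2 for r in residuals) / (n - 2))
print(round(s, 3))
```

Note that the residuals from a least-squares fit always sum to zero, which is why a squared (or absolute) measure is needed to describe their typical size.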

So, s summarizes the typical size of the residuals, that is, the average distance between the actual and predicted weights.

In the context of the given problem, the standard deviation of the residuals represents the approximate average distance between the actual and predicted weights of the bass. A smaller value indicates a better-fitting model, since the predicted weights are, on average, closer to the actual weights.

Therefore, the correct answer is 3.771: it is the value of S in the computer output, and it represents the average distance between the actual and predicted weights of the bass.

answered by User Mohi (8.7k points)