GameStop is trying to find the average absolute deviation from monthly forecasts, in number of units, for PlayStation 5 consoles over a one-year period. Which error metric should they use?


1 Answer


Final answer:

GameStop should use the mean absolute deviation (MAD) as the error metric to find the average absolute deviation from monthly forecasts in the number of units for PlayStation 5 consoles.

Step-by-step explanation:

The error metric GameStop should use to find the average absolute deviation from monthly forecasts, in the number of units of PlayStation 5 consoles over a one-year period, is the mean absolute deviation (MAD).

In a forecasting context, MAD measures the average absolute difference between each actual value and its corresponding forecast. To calculate it, subtract the forecasted units from the actual units sold in each month, take the absolute value of each difference, and then average those absolute errors across the twelve months. Because the result is expressed in the same units as the data (consoles), it is easy to interpret.
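As a minimal sketch, here is how that calculation could look in Python. The monthly numbers below are made up purely for illustration and are not real GameStop data:

```python
# Hypothetical monthly actuals and forecasts (units of PS5 consoles), for illustration only.
actual   = [120, 135, 110, 150, 160, 145, 130, 155, 170, 140, 125, 165]
forecast = [115, 140, 120, 145, 150, 150, 135, 150, 160, 150, 130, 155]

# MAD = average of the absolute forecast errors |actual - forecast|.
errors = [abs(a - f) for a, f in zip(actual, forecast)]
mad = sum(errors) / len(errors)

print(f"Mean absolute deviation: {mad:.2f} consoles per month")
```

A lower MAD means the monthly forecasts track actual sales more closely.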

Using MAD as the error metric will show GameStop how far, on average, their monthly forecasts have been from actual PlayStation 5 sales, and help them judge how much forecast error to expect going forward.
