Since the first principle of forecasting is that forecasts are (almost) always wrong, organizations need to track the forecast against actual demand results and find ways to measure the size and type of error. Note that the size of an error can be measured in units or percentages, but putting a monetary value on the error can often help focus attention.
Forecast Error
Forecast error is the difference between actual demand and forecast demand, stated as an absolute value or as a percentage.
Forecast Error = |A – F|
Forecast Error as Percentage = |A – F| / A
Where:
A = Actual demand
F = Forecast demand
Forecast Accuracy
Forecast accuracy is simply the inverse of the forecast error as a percentage, expressed as follows:
Forecast Accuracy = 1 – Forecast Error as Percentage 
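The two formulas above can be sketched in a few lines of Python. The demand and forecast figures here are made-up illustrative values, not data from the text.

```python
# Forecast error and accuracy for a single period (illustrative values).
actual = 110.0    # A = actual demand (units)
forecast = 100.0  # F = forecast demand (units)

error = abs(actual - forecast)               # |A - F| = 10 units
error_pct = abs(actual - forecast) / actual  # |A - F| / A
accuracy = 1 - error_pct                     # forecast accuracy

print(f"Error: {error} units, Error %: {error_pct:.2%}, Accuracy: {accuracy:.2%}")
```

Note that the percentage error is taken relative to actual demand, not the forecast, so accuracy is undefined in periods with zero actual demand.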
Bias and Random Variation
Bias
 Forecast error can be the result of bias or random variation.
 Bias is a consistent deviation from the mean in one direction (high or low). A normal property of a good forecast is that it is not biased.
 Bias exists when the cumulative actual demand differs from the cumulative forecast demand.
Cumulative Forecast Error = Cumulative Actual Demand – Cumulative Forecast Demand 
 Any result other than zero reflects bias.
 The size of the number reflects the relative amount of bias that is present.
 A negative result shows that actual demand was consistently less than the forecast, while a positive result shows that actual demand was consistently greater than forecast demand.
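A short sketch of the cumulative-error bias check, using made-up per-period figures:

```python
# Cumulative forecast error as a bias check (illustrative data).
actuals = [95, 102, 98, 97]       # actual demand per period
forecasts = [100, 100, 100, 100]  # forecast demand per period

# Signed errors are summed without taking absolute values,
# so high and low misses can cancel: -5 + 2 + -2 + -3 = -8.
cumulative_error = sum(a - f for a, f in zip(actuals, forecasts))
print(cumulative_error)  # -8
```

The negative result indicates actual demand ran consistently below the forecast over these periods.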
Random Variation
In terms of measuring errors, random variation is any amount of variation in which the cumulative actual demand equals the cumulative forecast demand.
Mean Absolute Deviation (MAD)
 A common way of tracking the extent of forecast error is to add the absolute period errors for a series of periods and divide by the number of periods. This gives you the Mean Absolute Deviation (MAD).
MAD = ∑|A – F| / n
Where:
|A – F| = Absolute forecast error for each period
n = Number of periods
 MAD is the average of the absolute values of the deviations of observed values from some expected value. It can be calculated based on observations and the arithmetic mean of those observations. An alternative is to calculate absolute deviations of actual sales minus forecast data. These data can be averaged in the usual arithmetic way or with exponential smoothing.
NOTE: With absolute values, whether the forecast falls short of demand or exceeds demand does not matter; only the magnitude of the deviation counts in MAD.
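The MAD calculation can be sketched as follows, again with illustrative per-period data:

```python
# MAD: average of the absolute period errors (illustrative data).
actuals = [95, 102, 98, 97]
forecasts = [100, 100, 100, 100]

# abs() discards the sign, so over- and under-forecasts
# both count toward the deviation: [5, 2, 2, 3].
abs_errors = [abs(a - f) for a, f in zip(actuals, forecasts)]
mad = sum(abs_errors) / len(abs_errors)  # 12 / 4
print(mad)  # 3.0
```

Compare this with the cumulative (signed) error for the same data, which is -8: the absolute values prevent the cancellation that masks error magnitude in a bias check.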

An analyst would provide actual MADs for a given service level. If a specific service level is desired, such as 98 percent of orders with no stockouts, the analyst can calculate the exact MAD to use as a multiplier in the calculation of units of safety stock. The multiplier is called a safety factor. For example, if a 98 percent service level has a safety factor of 2.56 MAD, the calculation would be as follows:
2.56 Safety Factor x 8.23 MAD in units = 21.07 Units of Safety Stock
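The safety stock calculation above is a single multiplication; the sketch below just reproduces it with the figures from the example:

```python
# Safety stock from a MAD-based safety factor (figures from the example).
safety_factor = 2.56  # safety factor for a 98% service level
mad_units = 8.23      # MAD expressed in units

safety_stock = safety_factor * mad_units
print(round(safety_stock, 2))  # 21.07 units of safety stock
```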
Tracking Signal
 The tracking signal is the ratio of the cumulative algebraic sum of the deviations between the forecasts and the actual values to the mean absolute deviation. It is used to signal when the validity of the forecasting model might be in doubt.
Tracking Signal = Algebraic Sum of Forecast Errors / Mean Absolute Deviation (MAD) 
 Note that the algebraic sum of forecast errors is a cumulative sum that does not use absolute value for the errors. Therefore, the tracking signal could be either positive or negative to show the direction of the bias. Organizations use a tracking signal by setting a target value for each period, such as ±4. If the tracking signal exceeds this target value, it would trigger a forecast review.
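The tracking signal combines the signed cumulative error with MAD, as sketched below with illustrative data and a ±4 target value:

```python
# Tracking signal: signed cumulative error / MAD (illustrative data).
actuals = [95, 102, 98, 97]
forecasts = [100, 100, 100, 100]

errors = [a - f for a, f in zip(actuals, forecasts)]  # signed errors
mad = sum(abs(e) for e in errors) / len(errors)       # 3.0
tracking_signal = sum(errors) / mad                   # -8 / 3.0

target = 4  # example review trigger of +/-4 MADs
needs_review = abs(tracking_signal) > target

print(round(tracking_signal, 2), needs_review)  # -2.67 False
```

The negative sign shows the direction of the bias (demand below forecast); because the magnitude stays within the ±4 target, no forecast review would be triggered here.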
Standard Deviation
In addition to MAD, another way to calculate forecast error would be to use standard deviation, which is commonly provided in most software programs. When you know the MAD, standard deviation can be approximated as follows:
Standard Deviation (approximate) = MAD x 1.25 
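Using the MAD of 3.0 from the earlier illustrative data, the approximation works out as:

```python
# Approximate standard deviation from MAD using the 1.25 factor.
mad = 3.0  # MAD from the earlier illustrative example
std_dev_approx = mad * 1.25
print(std_dev_approx)  # 3.75
```

The 1.25 factor assumes normally distributed errors, which is why it is only an approximation.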
Mean Squared Error (MSE)
 Another method of calculating error rates, the mean squared error (MSE), magnifies the errors by squaring each one before adding them up and dividing by the number of forecast periods.
 Squaring errors effectively makes them absolute since multiplying two negative numbers always results in a positive number.
MSE = ∑(Error for each period)²/ Number of forecast periods 
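The same illustrative data used for MAD gives the following MSE sketch:

```python
# MSE: square each period error before averaging (illustrative data).
actuals = [95, 102, 98, 97]
forecasts = [100, 100, 100, 100]

# Squaring makes every error positive and magnifies large errors:
# [25, 4, 4, 9].
squared_errors = [(a - f) ** 2 for a, f in zip(actuals, forecasts)]
mse = sum(squared_errors) / len(squared_errors)  # 42 / 4
print(mse)  # 10.5
```

Note how the single largest error (5 units, squared to 25) dominates the MSE of 10.5, whereas in the MAD of 3.0 it counted only in proportion to its size.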
MSE and MAD Comparison
 Note that squaring each error gives you a much wider range of numbers.
 The greater range gives you a more sensitive measure of the error rate, which is especially useful if the absolute error numbers are relatively close together and reduction of errors is important.
 Measuring the extent of deviation helps determine the need to improve forecasting or rely on safety stock to meet customer service objectives.
Mean Absolute Percentage Error (MAPE)
 There is a drawback to the MAD calculation, in that it is an absolute number that is not meaningful unless compared to the forecast.
 MAPE is a useful variant of the MAD calculation because it shows the ratio, or percentage, of the absolute errors to the actual demand for a given number of periods.
MAPE = ∑(|A – F| / A) / n × 100%
 Note that the result is expressed as a percentage.
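A sketch of the MAPE calculation with the same illustrative data:

```python
# MAPE: average of per-period absolute percentage errors (illustrative data).
actuals = [95, 102, 98, 97]
forecasts = [100, 100, 100, 100]

# Each period's error is scaled by that period's actual demand.
pct_errors = [abs(a - f) / a for a, f in zip(actuals, forecasts)]
mape = sum(pct_errors) / len(pct_errors) * 100  # expressed as a percentage
print(f"{mape:.2f}%")  # 3.09%
```

Because each error is divided by that period's actual demand, MAPE breaks down when any period has zero actual demand, a common caveat for intermittent-demand items.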
 Exception rules for review can be applied to any stock keeping unit or product family that has a MAPE above a certain percentage value. Percentage-based error measurements such as MAPE allow the magnitude of error to be clearly seen without needing detailed knowledge of the product or family, whereas an absolute error in units (or an error in $ amount) requires knowing what is considered normal for the product or product family.