Improving Demand Forecast Accuracy Series: Understanding Why You’re Measuring Accuracy


Since the introduction of the concept of Demand Planning, forecast accuracy has been a key measure in assessing the performance of planning organizations. Recently, companies have been asking how and where forecast accuracy measures can provide an advantage across the broader organization. When performing an assessment of a mature planning solution, one of the standard questions should be: What are your current forecast accuracy time lags, and what business rationale was used to select them?

Many times the answer is a standard set of lags (0, 2, 4, 8, and 12) with no clearly defined business rationale for the selection beyond measuring the performance of the planning organization. A growing movement is aligning operational activities across the broader organization with the time lags at which accuracy is measured.

The pertinent question is: When should you set lag points to measure forecast accuracy in the Planning Application?
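To make the lag concept concrete, the sketch below computes forecast accuracy at a set of candidate lags from archived forecast snapshots. The data structures, field names, and the use of weighted MAPE (WMAPE) as the accuracy metric are illustrative assumptions, not features of any particular planning application.

```python
# Sketch: measuring forecast accuracy at chosen time lags.
# A "lag" is how many periods before the actual the forecast was made;
# snapshots keyed by (period, lag) are an assumed archive layout.

def wmape(pairs):
    """Weighted MAPE over (forecast, actual) pairs."""
    abs_err = sum(abs(f - a) for f, a in pairs)
    total = sum(a for _, a in pairs)
    return abs_err / total if total else float("nan")

def accuracy_by_lag(snapshots, actuals, lags=(0, 2, 4, 8, 12)):
    """snapshots: {(period, lag): forecast}; actuals: {period: actual}.
    Returns accuracy (1 - WMAPE) for each lag with data available."""
    result = {}
    for lag in lags:
        pairs = [(f, actuals[p]) for (p, l), f in snapshots.items()
                 if l == lag and p in actuals]
        if pairs:
            result[lag] = round(1 - wmape(pairs), 3)
    return result

# Example: two periods, each forecast 2 months and 8 months in advance.
snapshots = {("2024-01", 2): 95, ("2024-02", 2): 110,
             ("2024-01", 8): 80, ("2024-02", 8): 130}
actuals = {"2024-01": 100, "2024-02": 105}
print(accuracy_by_lag(snapshots, actuals))  # → {2: 0.951, 8: 0.78}
```

Comparing the same periods at different lags, as here, is what lets an organization see how quickly accuracy decays as the forecast horizon lengthens, and therefore which lags carry real business meaning.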

The challenge for many planning teams is that they inherit a set of forecast accuracy time lag settings that were conceived and agreed upon by a subset of the organization at the implementation of the original planning solution.  In some instances, these measures were put in place ten years or more in the past.


Organizations should strategically review their existing time lags against the current and planned future business operating model.  The organization as a whole needs to align on what time lags should be measured and how they will be used to achieve the goals of the broader organization (inventory reduction, best pricing for materials, best use of capital, etc.).

Updating the forecast performance time lags should be a key activity in the introduction and periodic process review of any Sales & Operations Planning (S&OP) process. More progressive organizations may look to align with vendor partners' forecast accuracy measures as part of a Collaborative Planning, Forecasting and Replenishment (CPFR) structure.

Once these time lags are agreed upon, a "backcast" activity can be performed either internally or with the help of a consulting partner to construct historical forecast accuracy measures. This can provide nearly immediate visibility into accuracy exceptions over the backcast horizon. The organization can then look for opportunities to capitalize on positive exceptions and put structures in place to mitigate the occurrence and impact of negative exceptions. Trend analysis of backcast data eliminates much of the waiting built into the standard process of capturing forecast lag data and then holding off until historical actuals are posted for comparison.
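The backcast exception scan described above can be sketched as follows. The 25% error threshold, the tuple layout of the history, and the use of absolute percentage error are all illustrative assumptions; a real implementation would use the organization's agreed metric and tolerance bands.

```python
# Sketch of a backcast exception scan: compare archived forecasts to
# posted actuals over a historical horizon and flag periods whose
# error falls outside a tolerance band. The 25% threshold is an
# illustrative assumption, not a standard.

def backcast_exceptions(history, threshold=0.25):
    """history: list of (period, forecast, actual) tuples.
    Returns (period, error) for periods whose absolute
    percentage error exceeds the threshold."""
    exceptions = []
    for period, forecast, actual in history:
        if actual == 0:
            continue  # skip zero-demand periods to avoid division by zero
        ape = abs(forecast - actual) / actual
        if ape > threshold:
            exceptions.append((period, round(ape, 2)))
    return exceptions

history = [("2023-10", 120, 100), ("2023-11", 98, 102),
           ("2023-12", 150, 90), ("2024-01", 105, 110)]
print(backcast_exceptions(history))  # → [('2023-12', 0.67)]
```

Flagged periods become the starting point for root-cause review: positive exceptions may reveal practices worth repeating, while negative ones point to where mitigation structures are needed.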


Organizations are using forecast performance measures and trend analysis to impact areas not traditionally exposed to these measures.  The use of this information in the S&OP process, in addition to collaboration with upstream and downstream trading partners, can provide organizations with a competitive advantage in the Supply Chain arena.

