Evaluating demand forecasting models using multi-criteria decision-making approach

Purpose: Demand forecasting models in companies are often a mix of quantitative models and qualitative methods. With so many forecasting approaches available, forecasters often struggle to decide which model to select, as a model may perform "best" on one error measure but not on another. Currently, there is no approach that evaluates different model classes and several interdependent error measures simultaneously, making forecasting model selection particularly difficult when error measures yield conflicting results.

Design/methodology/approach: This paper proposes a novel procedure for the multi-criteria evaluation of demand forecasting models that simultaneously considers several error measures and their interdependencies, based on a two-stage multi-criteria decision-making approach. The Analytic Network Process combined with the Technique for Order of Preference by Similarity to Ideal Solution (ANP-TOPSIS) is developed, evaluated and validated through an implementation case at a plastic bag manufacturer.

Findings: The results show that the approach identifies the best forecasting model when considering multiple error measures, even when those measures yield conflicting results. Furthermore, considering the interdependence between error measures is essential for determining their relative importance in the final ranking calculation.

Originality/value: The paper's contribution is a novel multi-criteria approach, currently lacking in the literature, for evaluating demand forecasting models across model classes and selecting the best model while considering several interdependent error measures simultaneously. The work helps structure decision-making in forecasting and avoid the selection of inappropriate or "worse" forecasting models.
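To make the second (TOPSIS) stage of such an approach concrete, the following is a minimal sketch in Python. It assumes the error-measure weights have already been obtained from an ANP stage that models the interdependencies between measures; the model names, error values and weights below are purely illustrative, not data from the paper's case study.

```python
import numpy as np

# Hypothetical decision matrix: rows are candidate forecasting models,
# columns are error measures (e.g. MAPE, MAE, RMSE). All criteria are
# cost-type: lower error is better.
decision_matrix = np.array([
    [12.4, 310.0, 415.0],   # e.g. exponential smoothing
    [10.9, 295.0, 430.0],   # e.g. ARIMA
    [11.7, 300.0, 405.0],   # e.g. judgmentally adjusted model
])

# Illustrative weights, assumed to come from the ANP stage, which accounts
# for the interdependence between error measures.
weights = np.array([0.5, 0.3, 0.2])

# 1. Vector-normalize each criterion column, then apply the weights.
norm = decision_matrix / np.linalg.norm(decision_matrix, axis=0)
weighted = norm * weights

# 2. Ideal and anti-ideal solutions (cost criteria: smaller is better).
ideal = weighted.min(axis=0)
anti_ideal = weighted.max(axis=0)

# 3. Euclidean distances of each model to both reference points.
d_pos = np.linalg.norm(weighted - ideal, axis=1)
d_neg = np.linalg.norm(weighted - anti_ideal, axis=1)

# 4. Closeness coefficient; the model with the highest value ranks first.
closeness = d_neg / (d_pos + d_neg)
ranking = np.argsort(-closeness)

print("Closeness coefficients:", closeness)
print("Model ranking (best first):", ranking)
```

The key design point is that the weights are not chosen independently per error measure: in the proposed approach they are derived via ANP precisely because the measures are interdependent, and only then fed into the TOPSIS ranking shown above.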
