Forecasting Models
Prophecy supports all the forecasting models and statistical modelling techniques you will need for business demand forecasting and planning.
Here is a brief overview of the forecasting models supplied with Prophecy, along with curated links to further information on each forecasting method.
The statistical forecasting methods are presented in alphabetical order. More modern methods, such as machine learning and neural networks (both supported in Prophecy), are thought most likely to produce the best results, but there is no substitute for testing all of the methods against your own data in Prophecy.
-
Arima
ARIMA stands for "autoregressive integrated moving average". This method returns the best ARIMA model according to either the AIC, AICc or BIC value, conducting a search over possible models within the default order constraints.
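Prophecy runs this search for you automatically. As a rough sketch of the underlying technique, assuming R's forecast package and using the built-in AirPassengers series and a 12-period horizon purely as placeholders:

```r
library(forecast)

y <- AirPassengers                  # example monthly time series
fit <- auto.arima(y, ic = "aicc")   # search over ARIMA orders, keep the lowest AICc
summary(fit)                        # chosen (p,d,q)(P,D,Q)[m] model and coefficients
plot(forecast(fit, h = 12))         # forecast the next 12 periods
```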
-
BATS
Exponential smoothing state space model with Box-Cox transformation, ARMA errors, and trend and seasonal components, as described in De Livera, Hyndman & Snyder (2011).
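A minimal sketch, again assuming R's forecast package (the series and horizon are placeholders for your own data):

```r
library(forecast)

fit <- bats(AirPassengers)   # Box-Cox, ARMA errors, trend and seasonality chosen automatically
fit$lambda                   # Box-Cox parameter actually used (NULL if no transformation was selected)
plot(forecast(fit, h = 24))
```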
-
Croston
Based on Croston's (1972) method for intermittent demand forecasting, also described in Shenstone and Hyndman (2005). Croston's method involves using simple exponential smoothing (SES) on the non-zero elements of the time series and a separate application of SES to the times between non-zero elements of the time series.
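An illustrative sketch, assuming the forecast package and a toy intermittent-demand series (many zero periods with occasional orders):

```r
library(forecast)

y <- ts(c(0, 0, 5, 0, 0, 0, 3, 0, 8, 0, 0, 4, 0, 0, 6, 0))
fc <- croston(y, h = 6)   # SES on the non-zero demands and on the intervals between them
fc$mean                   # flat demand-rate forecast for the next 6 periods
```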
-
ETS
Exponential smoothing state space model.
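As an illustration, assuming the forecast package (the model form printed will depend on your data):

```r
library(forecast)

fit <- ets(AirPassengers)    # error, trend and seasonal components selected by AICc
summary(fit)                 # e.g. "ETS(M,Ad,M)" plus the fitted smoothing parameters
plot(forecast(fit, h = 12))
```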
-
HoltWinters
Holt-Winters exponential smoothing. The smoothing parameters are determined by minimizing the squared one-step prediction error.
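A sketch assuming base R's HoltWinters() together with the forecast package for prediction:

```r
library(forecast)   # for forecast(); HoltWinters() itself is in base R's stats package

fit <- HoltWinters(AirPassengers)   # alpha, beta, gamma fitted by minimising squared errors
fit$SSE                             # the minimised sum of squared one-step prediction errors
plot(forecast(fit, h = 12))
```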
-
Machine Learning - XGBoost
XGBoost is a machine learning (ML) library which implements the gradient boosting decision tree algorithm. This algorithm goes by lots of different names such as gradient boosting, multiple additive regression trees, stochastic gradient boosting or gradient boosting machines. Boosting is an ensemble technique where new models are added to correct the errors made by existing models. Models are added sequentially until no further improvements can be made.
Gradient boosting is an approach where new models are created that predict the residuals or errors of prior models and then added together to make the final prediction. It is called gradient boosting because it uses a gradient descent algorithm to minimize the loss when adding new models. This approach supports both regression and classification predictive modeling problems. XGBoost has proven very successful in machine learning competitions and also executes quickly in R.
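Prophecy's own feature engineering is not shown here. One common way to apply gradient boosting to a univariate series is to regress each value on its recent lags; the sketch below assumes the xgboost R package, with the lag count and number of boosting rounds chosen purely for illustration:

```r
library(xgboost)

y <- as.numeric(AirPassengers)
lags <- 12
m <- embed(y, lags + 1)          # each row: current value followed by its 12 most recent lags
X <- m[, -1]; target <- m[, 1]

# Gradient-boosted regression trees on the lag features
fit <- xgboost(data = X, label = target, nrounds = 200,
               objective = "reg:squarederror", verbose = 0)

# One-step-ahead prediction from the most recent 12 observations
newx <- matrix(rev(tail(y, lags)), nrow = 1)
predict(fit, newx)
```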
-
Multi-model Average
Averages the forecasts from Arima, BATS, Croston, ETS and HoltWinters, as suggested by https://otexts.com/fpp2/combinations.html.
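A sketch of the idea, assuming the forecast package; the simple average of the point forecasts shown here is one of several possible combination schemes:

```r
library(forecast)

y <- AirPassengers; h <- 12
fcasts <- list(
  forecast(auto.arima(y), h = h)$mean,
  forecast(bats(y), h = h)$mean,
  croston(y, h = h)$mean,
  forecast(ets(y), h = h)$mean,
  forecast(HoltWinters(y), h = h)$mean
)
combined <- Reduce(`+`, fcasts) / length(fcasts)   # simple average of the point forecasts
combined
```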
-
Multi-model Tournament
Runs a tournament of methods (Arima, BATS, Croston, ETS, HoltWinters) and chooses the method with the lowest MAPE, as calculated by the accuracy() function of the forecast library.
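One way to stage such a tournament, sketched with the forecast package and a held-out test period; Prophecy's exact MAPE calculation (for example, in-sample versus hold-out) may differ:

```r
library(forecast)

# Hold out the last 12 months, fit each candidate on the rest,
# and keep the method with the lowest MAPE on the held-out data
train <- window(AirPassengers, end = c(1959, 12))
test  <- window(AirPassengers, start = c(1960, 1))
h <- length(test)

candidates <- list(
  Arima       = forecast(auto.arima(train), h = h),
  BATS        = forecast(bats(train), h = h),
  Croston     = croston(train, h = h),
  ETS         = forecast(ets(train), h = h),
  HoltWinters = forecast(HoltWinters(train), h = h)
)
mapes <- sapply(candidates, function(f) accuracy(f, test)["Test set", "MAPE"])
names(which.min(mapes))   # winning method
```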
-
Neural Network (Auto Regressive)
Feed-forward neural networks with a single hidden layer and lagged inputs for forecasting univariate time series.
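A minimal sketch using nnetar() from the forecast package, which fits this kind of lagged-input, single-hidden-layer network:

```r
library(forecast)

fit <- nnetar(AirPassengers)   # NNAR(p,P,k): lagged inputs feeding one hidden layer of k nodes
fit                            # prints the selected NNAR specification
plot(forecast(fit, h = 12))
```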
-
Prophet
Prophet is a procedure for forecasting time series data based on an additive model where non-linear trends are fit with yearly, weekly, and daily seasonality, plus holiday effects. It works best with time series that have strong seasonal effects and several seasons of historical data. Prophet is robust to missing data and shifts in the trend, and typically handles outliers well. Prophet is open source software released by Facebook's Core Data Science team.
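A minimal sketch assuming the prophet R package and a synthetic monthly series; Prophet expects a data frame with columns ds (date) and y (value):

```r
library(prophet)

df <- data.frame(
  ds = seq(as.Date("2015-01-01"), by = "month", length.out = 60),
  y  = 100 + (1:60) + 10 * sin(2 * pi * (1:60) / 12)   # toy trend plus yearly seasonality
)
m <- prophet(df)                                        # fit trend, seasonality and holiday effects
future <- make_future_dataframe(m, periods = 12, freq = "month")
fcst <- predict(m, future)
plot(m, fcst)
```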
-
STL
Seasonal Decomposition of Time Series by Loess.
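An illustrative sketch with the forecast package: stl() performs the decomposition, while stlf() turns it into a forecast by modelling the seasonally adjusted series and re-seasonalising the result:

```r
library(forecast)

# STL decomposition into trend, seasonal and remainder components
plot(stl(AirPassengers, s.window = "periodic"))

# stlf() forecasts the seasonally adjusted series (ETS by default)
# and adds the seasonal component back in
plot(stlf(AirPassengers, h = 12))
```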
-
TBATS
Exponential smoothing state space model with trigonometric seasonality, Box-Cox transformation, ARMA errors, and trend and seasonal components; the trigonometric representation allows complex and multiple seasonal patterns.
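A sketch assuming the forecast package; msts() is used to declare multiple (possibly non-integer) seasonal periods, which is where TBATS differs most from BATS. The daily toy series below is purely illustrative:

```r
library(forecast)

# Toy daily series with weekly and yearly patterns; msts() records both periods
t <- 1:730
y <- msts(100 + 10 * sin(2 * pi * t / 7) + 20 * sin(2 * pi * t / 365.25) + rnorm(730),
          seasonal.periods = c(7, 365.25))
fit <- tbats(y)               # trigonometric terms handle the non-integer yearly period
plot(forecast(fit, h = 30))
```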