According to a McKinsey report, improving demand-forecasting accuracy by 10-20% in the consumer goods domain can reduce inventory by 5% and increase revenue by 2-3%. Current ML-based forecasting solutions, however, require significant manual effort, including model construction, feature engineering and hyper-parameter tuning.
Last week, Google and Facebook each introduced a new framework for solving time series problems with greater ease. While Google took the AutoML route, Facebook AI spun an architecture out of its popular Prophet tool. In this article, we take a look at these two solutions, which underline the growing importance of deep learning-based time series forecasting.
Google’s AutoML Solution
“The AutoML solution required moderate compute cost, only 500 CPUs for 2 hours to be at the top of the Kaggle competition.”
Google’s AI team has introduced a scalable end-to-end AutoML solution for time series forecasting. A fully automated, generic solution is challenging, however, as it must work across datasets that belong to different domains and have different granularities (hourly, daily or weekly).
This end-to-end pipeline is built using TensorFlow with a specialised search space for time series forecasting. The solution is based on an encoder-decoder architecture: the encoder transforms the historical information in a time series into a set of vectors, and the decoder generates future predictions from these vectors. The AutoML system then searches for the best combination of components, such as attention, dilated convolution, gating and skip connections, to assemble the final model.
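The encoder-decoder idea can be illustrated with a deliberately minimal numpy sketch. This is not Google's actual architecture; the window sizes, latent dimension and random linear maps below are illustrative assumptions standing in for learned components:

```python
import numpy as np

rng = np.random.default_rng(0)

HIST_LEN = 24   # length of the historical window fed to the encoder (assumed)
LATENT = 8      # size of the encoded vector (assumed)
HORIZON = 6     # number of future steps the decoder emits (assumed)

# Illustrative random weights; a real model would learn these.
W_enc = rng.normal(size=(HIST_LEN, LATENT))
W_dec = rng.normal(size=(LATENT, HORIZON))

def encode(history):
    """Compress a historical window into a latent vector."""
    return np.tanh(history @ W_enc)

def decode(latent):
    """Generate future predictions from the latent vector."""
    return latent @ W_dec

history = rng.normal(size=HIST_LEN)
forecast = decode(encode(history))
print(forecast.shape)  # (6,)
```

In the real search space, the simple linear maps above would be replaced by candidate components such as attention or dilated convolutions, and the search picks the combination that forecasts best.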
To test the accuracy of their AutoML solution, the Google team entered it in the popular M5 forecasting competition on Kaggle, the latest in a series of M competitions with a history spanning nearly 40 years. The fully automated solution achieved a rank of 138 out of 5,558 participants (top 2.5%). According to Google, other top forecasting models required months of manual effort to create, whereas the AutoML solution needed only 500 CPUs running for two hours.
Facebook’s NeuralProphet Solution
A week ago, Facebook also announced the release of NeuralProphet, a neural network-based PyTorch implementation of a time series forecasting tool, inspired by the popular forecasting tool Prophet. The NeuralProphet documentation states that it is developed with a fully modular architecture and is flexible enough to take in additional components in the future. The developers write that their vision is to build a simple-to-use forecasting tool that ensures interpretability and configurability while providing much more, such as automatic differentiation capabilities, by using PyTorch as the backend.
NeuralProphet consists of components like seasonality, auto-regression, special events, future regressors and lagged regressors. For instance, seasonality is modelled using Fourier terms and can handle multiple seasonalities for high-frequency data. Auto-regression is handled using an implementation of an Auto-Regressive Feed-Forward Neural Network for time series.
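Modelling seasonality with Fourier terms can be sketched in a few lines of numpy. The period and order values below are illustrative assumptions, not NeuralProphet's internals:

```python
import numpy as np

def fourier_features(t, period, order):
    """Return sine/cosine Fourier terms up to the given order for timestamps t."""
    t = np.asarray(t, dtype=float)
    feats = []
    for k in range(1, order + 1):
        feats.append(np.sin(2 * np.pi * k * t / period))
        feats.append(np.cos(2 * np.pi * k * t / period))
    return np.column_stack(feats)

# Daily timestamps over two weeks with a weekly seasonality (period = 7)
t = np.arange(14)
X = fourier_features(t, period=7, order=3)
print(X.shape)  # (14, 6): 3 sine and 3 cosine columns
```

Multiple seasonalities (e.g. daily and weekly for high-frequency data) can then be handled by concatenating feature blocks with different periods.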
According to Facebook, NeuralProphet can be used to build forecasting models that are driven by external factors which dictate the behaviour of the target series over time. External information can heavily improve forecasting models, as they then do not rely only on the autocorrelation of the series. NeuralProphet is a good fit for those who wish to gain insight into the overall modelling process by visualising the forecasts, the individual components and the underlying coefficients of the model.
According to the team, users can visualise the interaction of the individual components. They also have the power to control these coefficients as required by introducing sparsity through regularisation. They can combine the components additively or multiplicatively as per their domain knowledge.
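The difference between combining components additively and multiplicatively can be shown with a toy trend and seasonality (the arrays below are made-up values, not output from either tool):

```python
import numpy as np

trend = np.array([100.0, 110.0, 120.0, 130.0])
seasonality = np.array([0.1, -0.1, 0.1, -0.1])  # relative seasonal effect

# Additive: the seasonal effect is a constant offset in absolute units.
additive = trend + seasonality

# Multiplicative: the seasonal effect scales with the level of the trend,
# so the same +/-10% swing grows as the trend grows.
multiplicative = trend * (1 + seasonality)

print(additive)        # offsets of +/-0.1 around the trend
print(multiplicative)  # swings of +/-10% of the trend level
```

Multiplicative combination is the natural choice when, per the user's domain knowledge, seasonal swings grow proportionally with the series level (e.g. retail sales).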
NeuralProphet has the following features:
- Gradient descent optimisation using PyTorch as the backend.
- Modelling the autocorrelation of time series using AR-Net.
- Modelling lagged regressors using a separate Feed-Forward Neural Network.
- Configurable nonlinear deep layers of the FFNNs.
- Tuneable to specific forecast horizons.
- Custom losses and metrics.
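The AR-Net idea of fitting auto-regressive coefficients by gradient descent, the way a single-layer feed-forward network is trained, can be sketched in plain numpy. The synthetic series, AR order, learning rate and iteration count below are illustrative assumptions, not AR-Net's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic AR(2) series: y_t = 0.6*y_{t-1} - 0.3*y_{t-2} + noise
y = np.zeros(500)
for t in range(2, len(y)):
    y[t] = 0.6 * y[t-1] - 0.3 * y[t-2] + rng.normal()

# Lagged design matrix: the row for time t holds [y_{t-1}, y_{t-2}]
P = 2
targets = y[P:]
X = np.column_stack([y[P - k:len(y) - k] for k in range(1, P + 1)])

# Fit the AR coefficients by gradient descent on the mean squared error.
w = np.zeros(P)
lr = 0.2
for _ in range(300):
    grad = 2 * X.T @ (X @ w - targets) / len(targets)
    w -= lr * grad

print(w)  # close to the true coefficients [0.6, -0.3]
```

Because the coefficients are learned by gradient descent, a sparsity penalty can be added to the loss to regularise them, which is how the tool lets users control the coefficients as mentioned above.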