Nixtla

Neural 🧠 Forecast
User-friendly state-of-the-art neural forecasting models


NeuralForecast offers a large collection of neural forecasting models focusing on their performance, usability, and robustness. The models range from classic networks like RNNs to the latest transformers: MLP, LSTM, GRU, RNN, TCN, TimesNet, BiTCN, DeepAR, NBEATS, NBEATSx, NHITS, TiDE, DeepNPTS, TSMixer, TSMixerx, MLPMultivariate, DLinear, NLinear, TFT, Informer, AutoFormer, FedFormer, PatchTST, iTransformer, StemGNN, and TimeLLM.
Installation
You can install NeuralForecast with:
pip install neuralforecast
or
conda install -c conda-forge neuralforecast
Visit our Installation Guide for further details.
Quick Start
Minimal Example
from neuralforecast import NeuralForecast
from neuralforecast.models import NBEATS
from neuralforecast.utils import AirPassengersDF

nf = NeuralForecast(
    models=[NBEATS(input_size=24, h=12, max_steps=100)],
    freq='ME'
)
nf.fit(df=AirPassengersDF)
nf.predict()
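NeuralForecast expects a long-format pandas DataFrame with unique_id, ds, and y columns. The following is a minimal sketch of fitting on your own data; the toy series, column values, and model settings are illustrative assumptions, not recommendations:

```python
import pandas as pd

from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Toy long-format data: one series identified by `unique_id`,
# timestamps in `ds`, and the target in `y` (values here are made up).
df = pd.DataFrame({
    'unique_id': ['series_1'] * 48,
    'ds': pd.date_range('2020-01-31', periods=48, freq='ME'),
    'y': range(48),
})

nf = NeuralForecast(models=[NHITS(input_size=24, h=12, max_steps=50)], freq='ME')
nf.fit(df=df)
forecasts = nf.predict()  # one row per future timestamp, one column per model
```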
Get Started with this quick guide.
Why?
There is a shared belief in neural forecasting methods' capacity to improve the accuracy and efficiency of forecasting pipelines.
Unfortunately, available implementations and published research have yet to realize neural networks' potential. They are hard to use and often fail to improve over statistical methods while being computationally prohibitive. For this reason, we created NeuralForecast, a library favoring proven accurate and efficient models with a focus on usability.
Features
- Fast and accurate implementations of more than 30 state-of-the-art models. See the entire collection here.
- Support for exogenous variables and static covariates.
- Interpretability methods for trend, seasonality and exogenous components.
- Probabilistic Forecasting with adapters for quantile losses and parametric distributions (see the sketch after this list).
- Train and Evaluation Losses with scale-dependent, percentage, and scale-independent errors, and parametric likelihoods.
- Automatic Model Selection with distributed automatic hyperparameter tuning.
- Familiar sklearn syntax: .fit and .predict.
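As a sketch of the probabilistic forecasting support, the example below trains NHITS with a multi-quantile loss to produce prediction intervals; the model, loss, and interval levels are illustrative choices, and output column names may vary by version:

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.losses.pytorch import MQLoss
from neuralforecast.utils import AirPassengersDF

# Training with a multi-quantile loss yields prediction intervals,
# here at the 80% and 90% levels, alongside the point forecast.
nf = NeuralForecast(
    models=[NHITS(input_size=24, h=12, max_steps=100, loss=MQLoss(level=[80, 90]))],
    freq='ME',
)
nf.fit(df=AirPassengersDF)
intervals = nf.predict()  # includes lower/upper bound columns per level
```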
Highlights
- Official NHITS implementation, published at AAAI 2023. See the paper and experiments.
- Official NBEATSx implementation, published in the International Journal of Forecasting. See the paper.
- Unified interface with StatsForecast, MLForecast, and HierarchicalForecast: NeuralForecast().fit(Y_df).predict(), with shared inputs and outputs.
- Built-in integrations with utilsforecast and coreforecast for efficient visualization and data-wrangling methods.
- Integrations with Ray and Optuna for automatic hyperparameter optimization (see the sketch after this list).
- Predict with little to no history using transfer learning. Check the experiments here.
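For example, a minimal sketch of automatic model selection with an Auto model could look like this; the search budget (num_samples) is an illustrative assumption and the default search space is used:

```python
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoNHITS
from neuralforecast.utils import AirPassengersDF

# AutoNHITS tunes NHITS hyperparameters over its default search space,
# sampling only a handful of configurations here to keep the run short.
model = AutoNHITS(h=12, num_samples=5)

nf = NeuralForecast(models=[model], freq='ME')
nf.fit(df=AirPassengersDF)
forecasts = nf.predict()
```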
Missing something? Please open an issue or reach out to us.
Examples and Guides
The documentation page contains all the examples and tutorials.
📈 Automatic Hyperparameter Optimization: easy and scalable automatic hyperparameter optimization with Auto models on Ray or Optuna.
🌡️ Exogenous Regressors: how to incorporate static or temporal exogenous covariates like weather or prices (see the sketch after this list).
🔌 Transformer Models: learn how to forecast with many state-of-the-art Transformer models.
👑 Hierarchical Forecasting: forecast series with very few non-zero observations.
👩‍🔬 Add Your Own Model: learn how to add a new model to the library.
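As a sketch of how exogenous regressors are declared, the example below passes a future-known covariate to NHITS; the price column, the toy data, and the model settings are illustrative assumptions:

```python
import pandas as pd

from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Toy training data with an exogenous `price` column (values are made up).
df = pd.DataFrame({
    'unique_id': ['store_1'] * 36,
    'ds': pd.date_range('2021-01-31', periods=36, freq='ME'),
    'y': range(36),
    'price': [10.0] * 36,
})

# Future values of the declared covariate over the forecast horizon.
futr_df = pd.DataFrame({
    'unique_id': ['store_1'] * 12,
    'ds': pd.date_range('2024-01-31', periods=12, freq='ME'),
    'price': [11.0] * 12,
})

# `futr_exog_list` tells the model which columns are future-known exogenous variables.
model = NHITS(input_size=24, h=12, max_steps=50, futr_exog_list=['price'])
nf = NeuralForecast(models=[model], freq='ME')
nf.fit(df=df)
forecasts = nf.predict(futr_df=futr_df)
```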
Models
See the entire collection here.
Missing a model? Please open an issue or reach out to us.
How to contribute
If you wish to contribute to the project, please refer to our contribution guidelines.
References
This work is highly influenced by the fantastic work of previous contributors and other scholars on the neural forecasting methods presented here. We want to highlight the work of Boris Oreshkin, Slawek Smyl, Bryan Lim, and David Salinas. We refer to Benidis et al. for a comprehensive survey of neural forecasting methods.
🙏 How to cite
If you enjoy or benefit from using these Python implementations, a citation to the repository will be greatly appreciated.
@misc{olivares2022library_neuralforecast,
    author = {Kin G. Olivares and
              Cristian Challú and
              Azul Garza and
              Max Mergenthaler Canseco and
              Artur Dubrawski},
    title = {{NeuralForecast}: User friendly state-of-the-art neural forecasting models.},
    year = {2022},
    howpublished = {{PyCon} Salt Lake City, Utah, US 2022},
    url = {https://github.com/Nixtla/neuralforecast}
}
Contributors ✨
Thanks goes to these wonderful people (emoji key):
This project follows the all-contributors specification. Contributions of any kind welcome!