
Time Series Library (TSLib)

TSLib is an open-source library for deep learning researchers, especially those working on deep time series analysis.

Chinese documentation: README_zh.md

We provide a neat code base for evaluating advanced deep time series models and developing your own, covering five mainstream tasks: long- and short-term forecasting, imputation, anomaly detection, and classification.

🚩News (2025.12) Many thanks to the great work from ailuntz, which provides updated requirements, a Docker deployment, and well-organized documentation. This is quite meaningful to this project and to beginners.

🚩News (2025.11) Considering the rapid development of Large Time Series Models (LTSMs), we have added a [zero-shot forecasting] feature to TSLib. You can use this script to evaluate LTSMs.

🚩News (2025.10) Given the recent confusion among researchers regarding minor improvements on standard benchmarks, we propose the [Accuracy Law] to characterize the objectives of deep time series forecasting tasks, which can be used to identify saturated datasets.

🚩News (2024.10) We have included [TimeXer], which defines a practical forecasting paradigm: forecasting with exogenous variables. Considering both practicality and computational efficiency, we believe the new forecasting paradigm defined in TimeXer can be the "right" task for future research.

🚩News (2024.10) Our lab has open-sourced [OpenLTM], which provides a distinct pretrain-finetuning paradigm compared to TSLib. If you are interested in Large Time Series Models, you may find this repository helpful.

🚩News (2024.07) We wrote a comprehensive survey of [Deep Time Series Models] with a rigorous benchmark based on TSLib. In this paper, we summarize the design principles of current time series models, supported by insightful experiments, and hope it will be helpful for future research.

🚩News (2024.04) Many thanks for the great work from frecklebars. The famous sequential model Mamba has been included in our library. See this file; note that you need to install mamba_ssm with pip first.

🚩News (2024.03) Given the inconsistent look-back lengths used across papers, we split the long-term forecasting leaderboard into two categories: Look-Back-96 and Look-Back-Searching. We recommend researchers read TimeMixer, which includes both look-back settings in its experiments for scientific rigor.

🚩News (2023.10) We added an implementation of iTransformer, which is the state-of-the-art model for long-term forecasting. The official code and complete scripts of iTransformer can be found here.

🚩News (2023.09) We added a detailed tutorial for TimesNet and this library, which is quite friendly to beginners of deep time series analysis.

🚩News (2023.02) We released TSLib as a comprehensive benchmark and code base for time series models, extended from our previous GitHub repository Autoformer.

Leaderboard for Time Series Analysis

As of March 2024, the top three models for the five tasks are:

Model Ranking | Long-term Forecasting (Look-Back-96) | Long-term Forecasting (Look-Back-Searching) | Short-term Forecasting | Imputation | Classification | Anomaly Detection
--- | --- | --- | --- | --- | --- | ---
🥇 1st | TimeXer | TimeMixer | TimesNet | TimesNet | TimesNet | TimesNet
🥈 2nd | iTransformer | PatchTST | Non-stationary Transformer | Non-stationary Transformer | Non-stationary Transformer | FEDformer
🥉 3rd | TimeMixer | DLinear | FEDformer | Autoformer | Informer | Autoformer

Note: We will keep updating this leaderboard. If you have proposed advanced and awesome models, you can send us your paper/code link or raise a pull request. We will add them to this repo and update the leaderboard as soon as possible.

Models compared in this leaderboard. ☑ means that the code has already been included in this repo.

  • TimeXer - TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables [NeurIPS 2024] [Code]
  • TimeMixer - TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting [ICLR 2024] [Code].
  • TSMixer - TSMixer: An All-MLP Architecture for Time Series Forecasting [arXiv 2023] [Code]
  • iTransformer - iTransformer: Inverted Transformers Are Effective for Time Series Forecasting [ICLR 2024] [Code].
  • PatchTST - A Time Series is Worth 64 Words: Long-term Forecasting with Transformers [ICLR 2023] [Code].
  • TimesNet - TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis [ICLR 2023] [Code].
  • DLinear - Are Transformers Effective for Time Series Forecasting? [AAAI 2023] [Code].
  • LightTS - Less Is More: Fast Multivariate Time Series Forecasting with Light Sampling-oriented MLP Structures [arXiv 2022] [Code].
  • ETSformer - ETSformer: Exponential Smoothing Transformers for Time-series Forecasting [arXiv 2022] [Code].
  • Non-stationary Transformer - Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting [NeurIPS 2022] [Code].
  • FEDformer - FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting [ICML 2022] [Code].
  • Pyraformer - Pyraformer: Low-complexity Pyramidal Attention for Long-range Time Series Modeling and Forecasting [ICLR 2022] [Code].
  • Autoformer - Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting [NeurIPS 2021] [Code].
  • Informer - Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting [AAAI 2021] [Code].
  • Reformer - Reformer: The Efficient Transformer [ICLR 2020] [Code].
  • Transformer - Attention is All You Need [NeurIPS 2017] [Code].

See our latest paper [TimesNet] for the comprehensive benchmark. We will release a real-time updated online version soon.

Newly added baselines. We will add them to the leaderboard after a comprehensive evaluation.

  • TimeFilter - TimeFilter: Patch-Specific Spatial-Temporal Graph Filtration for Time Series Forecasting [ICML 2025] [Code]
  • KAN-AD - KAN-AD: Time Series Anomaly Detection with Kolmogorov-Arnold Networks [ICML 2025] [Code]
  • MultiPatchFormer - A multiscale model for multivariate time series forecasting [Scientific Reports 2025] [Code]
  • WPMixer - WPMixer: Efficient Multi-Resolution Mixing for Long-Term Time Series Forecasting [AAAI 2025] [Code]
  • MSGNet - MSGNet: Learning Multi-Scale Inter-Series Correlations for Multivariate Time Series Forecasting [AAAI 2024] [Code]
  • PAttn - Are Language Models Actually Useful for Time Series Forecasting? [NeurIPS 2024] [Code]
  • Mamba - Mamba: Linear-Time Sequence Modeling with Selective State Spaces [arXiv 2023] [Code]
  • SegRNN - SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting [arXiv 2023] [Code].
  • Koopa - Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors [NeurIPS 2023] [Code].
  • FreTS - Frequency-domain MLPs are More Effective Learners in Time Series Forecasting [NeurIPS 2023] [Code].
  • MICN - MICN: Multi-scale Local and Global Context Modeling for Long-term Series Forecasting [ICLR 2023][Code].
  • Crossformer - Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting [ICLR 2023][Code].
  • TiDE - Long-term Forecasting with TiDE: Time-series Dense Encoder [arXiv 2023] [Code].
  • SCINet - SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction [NeurIPS 2022][Code].
  • FiLM - FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting [NeurIPS 2022][Code].
  • TFT - Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting [arXiv 2019][Code].

Newly added Large Time Series Models. This library also supports the zero-shot evaluation of the following LTSMs.

  • Chronos2 - Chronos-2: From Univariate to Universal Forecasting [arXiv 2025] [Code]
  • TiRex - TiRex: Zero-Shot Forecasting Across Long and Short Horizons with Enhanced In-Context Learning [NeurIPS 2025] [Code]
  • Sundial - Sundial: A Family of Highly Capable Time Series Foundation Models [ICML 2025] [Code]
  • Time-MoE - Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts [ICLR 2025] [Code]
  • Toto - Toto: Time Series Optimized Transformer for Observability [arXiv 2024]
  • Chronos - Chronos: Learning the Language of Time Series [TMLR 2024] [Code]
  • Moirai - Unified Training of Universal Time Series Forecasting Transformers [ICML 2024]
  • TimesFM - A decoder-only foundation model for time-series forecasting [ICML 2024] [Code]

Getting Started

Prepare Data

You can obtain the well-preprocessed datasets from [Google Drive], [Baidu Drive] or [Hugging Face]. Then place the downloaded data in the folder ./dataset.
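
For reference, the quick-test and Docker commands later in this README read from paths such as ./dataset/ETT-small/ETTh1.csv, ./dataset/PSM, and ./dataset/Heartbeat/, so after downloading the folder should look roughly like this (only the subsets you actually run against are required):

dataset/
├── ETT-small/
│   └── ETTh1.csv
├── PSM/
├── Heartbeat/
└── ...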

Installation

  1. Clone this repository.

    git clone https://github.com/thuml/Time-Series-Library.git
    cd Time-Series-Library
  2. Create a new Conda environment.

    conda create -n tslib python=3.11
    conda activate tslib
  3. Install Core Dependencies

    pip install -r requirements.txt
  4. Install Dependencies for Mamba Model (Required for Time-Series-Library/models/Mamba.py)

    ⚠️ CUDA Compatibility Notice: The prebuilt Mamba wheel is CUDA-version specific. Make sure to install the wheel that matches your local CUDA version (e.g., cu11 or cu12); installing a mismatched version may result in runtime errors or import failures. A quick version-check snippet follows these installation steps.

    Example for CUDA 12:

    pip install https://github.com/state-spaces/mamba/releases/download/v2.2.6.post3/mamba_ssm-2.2.6.post3+cu12torch2.5cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
  5. Install Dependencies for Moirai Model (Required for Time-Series-Library/models/Moirai.py)

    pip install uni2ts --no-deps
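
If you are unsure which CUDA version your local PyTorch build targets (relevant for step 4), the short check below prints it. This is only a convenience snippet and assumes PyTorch was already installed in step 3.

# check_cuda.py -- quick sanity check before picking a Mamba wheel
import torch

print(torch.__version__)    # e.g. 2.5.1+cu121
print(torch.version.cuda)   # e.g. "12.1" for a cu12 build, or None for a CPU-only build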

Docker Deployment

# Build and start the Docker container in detached mode
docker compose -f 'Time-Series-Library/docker-compose.yml' up -d --build

# Download / place the dataset into a newly created folder ./dataset at the repository root
mkdir -p dataset  # create the dataset directory

# Copy the local dataset into the container at /workspace/dataset
docker cp ./dataset tslib:/workspace/dataset

# Enter the running container to continue training / evaluation
docker exec -it tslib bash

# Switch to the workspace directory inside the container
cd /workspace

# Run zero-shot forecasting with the pre-trained Moirai model
# Argument notes:
#   --task_name zero_shot_forecast    task type: zero-shot forecasting
#   --is_training 0                   0 = inference only (no training)
#   --root_path ./dataset/ETT-small/  root directory of the dataset
#   --data_path ETTh1.csv             dataset file name
#   --model_id ETTh1_512_96           experiment/model identifier
#   --model Moirai                    model name (TimesFM / Moirai)
#   --data ETTh1                      dataset name
#   --features M                      multivariate forecasting
#   --seq_len 512 / --pred_len 96     input sequence length / prediction horizon
#   --enc_in 7                        number of input variables
#   --des 'Exp'                       experiment description
#   --itr 1                           number of runs
python -u run.py \
  --task_name zero_shot_forecast \
  --is_training 0 \
  --root_path ./dataset/ETT-small/ \
  --data_path ETTh1.csv \
  --model_id ETTh1_512_96 \
  --model Moirai \
  --data ETTh1 \
  --features M \
  --seq_len 512 \
  --pred_len 96 \
  --enc_in 7 \
  --des 'Exp' \
  --itr 1

Quick Test

Quick test for all 5 tasks (1 epoch each):

# Run quick tests for all 5 tasks
export CUDA_VISIBLE_DEVICES=0

# 1. Long-term forecasting
python -u run.py --task_name long_term_forecast --is_training 1 --root_path ./dataset/ETT-small/ --data_path ETTh1.csv --model_id test_long --model DLinear --data ETTh1 --features M --seq_len 96 --pred_len 96 --enc_in 7 --dec_in 7 --c_out 7 --train_epochs 1 --num_workers 2

# 2. Short-term forecasting (using ETT dataset with shorter prediction length)
python -u run.py --task_name long_term_forecast --is_training 1 --root_path ./dataset/ETT-small/ --data_path ETTh1.csv --model_id test_short --model TimesNet --data ETTh1 --features M --seq_len 24 --label_len 12 --pred_len 24 --e_layers 2 --d_layers 1 --d_model 16 --d_ff 32 --enc_in 7 --dec_in 7 --c_out 7 --top_k 5 --train_epochs 1 --num_workers 2

# 3. Imputation
python -u run.py --task_name imputation --is_training 1 --root_path ./dataset/ETT-small/ --data_path ETTh1.csv --model_id test_imp --model TimesNet --data ETTh1 --features M --seq_len 96 --e_layers 2 --d_layers 1 --d_model 16 --d_ff 32 --enc_in 7 --dec_in 7 --c_out 7 --top_k 3 --train_epochs 1 --num_workers 2 --label_len 0 --pred_len 0 --mask_rate 0.125 --learning_rate 0.001

# 4. Anomaly detection
python -u run.py --task_name anomaly_detection --is_training 1 --root_path ./dataset/PSM --model_id test_ad --model TimesNet --data PSM --features M --seq_len 100 --pred_len 0 --d_model 64 --d_ff 64 --e_layers 2 --enc_in 25 --c_out 25 --anomaly_ratio 1.0 --top_k 3 --train_epochs 1 --batch_size 128 --num_workers 2

# 5. Classification
python -u run.py --task_name classification --is_training 1 --root_path ./dataset/Heartbeat/ --model_id Heartbeat --model TimesNet --data UEA --e_layers 2 --d_layers 1 --factor 3 --d_model 64 --d_ff 128 --top_k 3 --train_epochs 1 --batch_size 16 --learning_rate 0.001 --num_workers 0

Train and Evaluate

We provide the experiment scripts for all benchmarks under the folder ./scripts/. You can reproduce the experiment results as in the following examples:

# long-term forecast
bash ./scripts/long_term_forecast/ETT_script/TimesNet_ETTh1.sh
# short-term forecast
bash ./scripts/short_term_forecast/TimesNet_M4.sh
# imputation
bash ./scripts/imputation/ETT_script/TimesNet_ETTh1.sh
# anomaly detection
bash ./scripts/anomaly_detection/PSM/TimesNet.sh
# classification
bash ./scripts/classification/TimesNet.sh

Develop Your Own Model

  • Add the model file to the folder ./models. You can follow ./models/Transformer.py; a minimal sketch is shown after this list.
  • Include the newly added model in the Exp_Basic.model_dict of ./exp/exp_basic.py.
  • Create the corresponding scripts under the folder ./scripts.
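
As a starting point, here is a minimal forecasting-only sketch. It assumes the interface used by the existing model files (a Model class constructed from the parsed args namespace and a forward(x_enc, x_mark_enc, x_dec, x_mark_dec) signature); check ./models/Transformer.py for the exact signatures each task expects.

import torch.nn as nn


class Model(nn.Module):
    """Minimal sketch of a TSLib-style model (long-term forecasting only)."""

    def __init__(self, configs):
        super().__init__()
        # one linear map per variable from the look-back window to the horizon
        self.projection = nn.Linear(configs.seq_len, configs.pred_len)

    def forward(self, x_enc, x_mark_enc, x_dec, x_mark_dec, mask=None):
        # x_enc: [batch, seq_len, num_variables]
        out = self.projection(x_enc.permute(0, 2, 1)).permute(0, 2, 1)
        return out  # [batch, pred_len, num_variables]

After registering the class in Exp_Basic.model_dict, the registered name is what you pass to --model in the scripts.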

Note:

(1) About classification: Since we include all five tasks in a unified code base, the accuracy of each subtask may fluctuate but the average performance can be reproduced (even a bit better). We have provided the reproduced checkpoints here.

(2) About anomaly detection: Some discussion about the adjustment strategy in anomaly detection can be found here. The key point is that the adjustment strategy corresponds to an event-level metric.
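
For intuition, the point-adjustment strategy commonly used in this setting treats an entire ground-truth anomaly segment as detected once any point inside it is flagged, which is what makes it an event-level metric. A minimal sketch of the idea (not necessarily the exact code used in this repo):

import numpy as np


def point_adjust(gt, pred):
    """gt, pred: 1-D 0/1 arrays; returns the event-level adjusted prediction."""
    gt = np.asarray(gt)
    pred = np.asarray(pred).copy()
    i = 0
    while i < len(gt):
        if gt[i] == 1:                      # start of a ground-truth anomaly segment
            j = i
            while j < len(gt) and gt[j] == 1:
                j += 1                      # j is now one past the segment end
            if pred[i:j].any():             # any detection inside the segment ...
                pred[i:j] = 1               # ... counts the whole segment as detected
            i = j
        else:
            i += 1
    return pred

Precision, recall, and F1 are then computed on the adjusted predictions.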

Inspect the project structure:

Time-Series-Library/
├── README.md                     # Official README with tasks, leaderboard, usage
├── requirements.txt              # pip dependency list for quick environment setup
├── LICENSE / CONTRIBUTING.md     # Upstream license and contribution guide
├── run.py                        # Unified entry that parses args and dispatches tasks
├── exp/                          # Task pipelines wrapping train/val/test
│   ├── exp_basic.py              # Experiment base class, registers models, builds flows
│   ├── exp_long_term_forecasting.py    # Long-term forecasting logic
│   ├── exp_short_term_forecasting.py   # Short-term forecasting logic
│   ├── exp_imputation.py               # Missing-value imputation
│   ├── exp_anomaly_detection.py        # Anomaly detection
│   ├── exp_classification.py           # Classification
│   └── exp_zero_shot_forecasting.py    # LTSM zero-shot evaluation
├── data_provider/                # Dataset loaders and splits
│   ├── data_factory.py           # Chooses the proper DataLoader per task
│   ├── data_loader.py            # Generic TS reader with sliding-window logic (see sketch below)
│   ├── uea.py / m4.py            # Parsers for UEA, M4 and other formats
│   └── __init__.py               # Exposes factory interfaces upward
├── models/                       # All model implementations
│   ├── TimesNet.py, TimeMixer.py # Main forecasting models
│   ├── Chronos2.py, TiRex.py     # LTSM zero-shot models
│   └── __init__.py               # Enables name-based instantiation inside exp
├── layers/                       # Reusable attention / conv / embedding blocks
│   ├── Transformer_EncDec.py     # Transformer stacks
│   ├── AutoCorrelation.py        # Auto-correlation operator
│   ├── MultiWaveletCorrelation.py # Frequency-domain unit
│   └── Embed.py etc.             # Shared primitives
├── utils/                        # Utility toolbox
│   ├── metrics.py                # MSE / MAE / DTW and other metrics
│   ├── tools.py                  # General helpers such as EarlyStopping
│   ├── augmentation.py           # Augmentations for classification / detection
│   ├── print_args.py             # Unified argument printer
│   └── masking.py / losses.py    # Task-specific helpers
├── scripts/                      # Bash recipes for reproducible experiments
│   ├── long_term_forecast/       # Long-term forecasting per dataset/model
│   ├── short_term_forecast/      # M4 and other short-term scripts
│   ├── imputation/               # Imputation scripts
│   ├── anomaly_detection/        # SMD / SMAP / SWAT detection scripts
│   ├── classification/           # UEA classification scripts
│   └── exogenous_forecast/       # TimeXer exogenous forecasting flow
├── tutorial/                     # TimesNet tutorial notebook and figures
└── pic/                          # README figures (dataset overview, etc.)
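
The sliding-window logic mentioned for data_provider/data_loader.py boils down to cutting every window of length seq_len + pred_len out of a series. Below is a minimal sketch of the idea with a hypothetical helper (the repo's Dataset classes also handle details such as train/val/test splits and scaling):

import numpy as np


def sliding_windows(series, seq_len, pred_len):
    """series: [T, num_variables] array; yields (look-back, horizon) pairs."""
    for start in range(len(series) - seq_len - pred_len + 1):
        x = series[start : start + seq_len]                        # model input
        y = series[start + seq_len : start + seq_len + pred_len]   # forecast target
        yield x, y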

Understand the project architecture:

  • E2E flow: configure experiments via scripts/*.sh → run python run.py ... → run.py parses arguments and selects the proper Exp_* via task_name → the experiment builds datasets through data_provider, instantiates networks from models, and drives train/val/test with utilities in utils → metrics and checkpoints are written to ./checkpoints (a dispatch sketch follows this list).
  • Experiment layer (exp/): Exp_Basic registers models and devices; subclasses implement _get_data, train, and test to encapsulate task-specific differences so the same model can be reused.
  • Model & layer layer (models/ + layers/): model files define architectures, while reusable attention/conv/frequency components live in layers/ to minimize duplication.
  • Data layer (data_provider/): data_factory returns the correct Dataset/DataLoader; data_loader handles windowing, masking, and sampling, with arguments controlling window length, missing ratio, anomaly ratio, etc.
  • Script layer (scripts/): bash scripts capture paper configurations (dataset, window, model, GPU) for reproducibility and serve as templates for custom runs.
  • Utility layer (utils/): metrics centralizes evaluation, tools bundles essentials like EarlyStopping and adjust_learning_rate, while augmentation/masking cover task-specific preprocessing.
  • Learning path: the recommended reading order is scripts → run.py → exp/exp_basic.py → the corresponding Exp subclass → data_provider → models, using tutorial/TimesNet_tutorial.ipynb as a guided walkthrough before diving deeper.
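
To make the E2E flow concrete, the dispatch performed by run.py conceptually reduces to mapping --task_name to an experiment class and calling its train/test methods. The sketch below uses a stand-in DemoExp class so it runs anywhere; the real mapping points at the Exp_* classes in exp/, and run.py defines many more arguments.

import argparse


class DemoExp:
    """Stand-in for the Exp_* classes in exp/ (e.g. the long-term forecasting one)."""

    def __init__(self, args):
        self.args = args                  # the real classes build the model and data loaders here

    def train(self, setting):
        print(f"training '{setting}' for task {self.args.task_name}")

    def test(self, setting):
        print(f"testing '{setting}'")


# run.py maps --task_name to an experiment class roughly like this
TASK_MAP = {
    'long_term_forecast': DemoExp,        # really the class in exp/exp_long_term_forecasting.py
    'anomaly_detection': DemoExp,         # really the class in exp/exp_anomaly_detection.py
    # ... one entry per exp/exp_*.py module
}

parser = argparse.ArgumentParser()
parser.add_argument('--task_name', default='long_term_forecast')
parser.add_argument('--is_training', type=int, default=1)
args = parser.parse_args()

Exp = TASK_MAP[args.task_name]            # pick the experiment class for this task
exp = Exp(args)
setting = 'demo'                          # run.py builds this id from the arguments
if args.is_training:
    exp.train(setting)                    # checkpoints end up under ./checkpoints/
exp.test(setting)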

Citation

If you find this repo useful, please cite our paper.

@inproceedings{wu2023timesnet,
  title={TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis},
  author={Haixu Wu and Tengge Hu and Yong Liu and Hang Zhou and Jianmin Wang and Mingsheng Long},
  booktitle={International Conference on Learning Representations},
  year={2023},
}

@article{wang2024tssurvey,
  title={Deep Time Series Models: A Comprehensive Survey and Benchmark},
  author={Yuxuan Wang and Haixu Wu and Jiaxiang Dong and Yong Liu and Mingsheng Long and Jianmin Wang},
  journal={arXiv preprint arXiv:2407.13278},
  year={2024},
}

Contact

If you have any questions or suggestions, feel free to contact our maintenance team:

Current:

Previous:

Or describe it in Issues.

Acknowledgement

This library is constructed based on the following repos:

All the experiment datasets are public, and we obtain them from the following links:

All Thanks To Our Contributors
