PyOmniTS

skill
Security Audit
Passed
Health: Passed
  • License – License: MIT
  • Description – Repository has a description
  • Active repo – Last push 0 days ago
  • Community trust – 59 GitHub stars
Code: Passed
  • Code scan – Scanned 12 files during light audit, no dangerous patterns found
Permissions: Passed
  • Permissions – No dangerous permissions requested
Purpose
This is a researcher and agent-friendly Python framework designed for highly extensible time series analysis. It allows users to seamlessly train almost any model on any dataset without worrying about breaking existing code.

Security Assessment
Overall Risk: Low
The automated code scan of 12 files found no dangerous patterns, and the tool does not request any risky system permissions. Based on the repository details, there is no evidence of hardcoded secrets, unwanted shell command execution, or unauthorized access to sensitive local data. Standard network requests are limited to expected machine learning operations, such as downloading public datasets or interacting with Hugging Face spaces.

Quality Assessment
The project demonstrates strong quality indicators and excellent maintenance health, with its last code push occurring just today. It is officially backed by a legitimate academic pedigree, serving as the repository for papers accepted at major conferences like ICLR 2026 and ICML 2025. It is properly licensed under the permissive and standard MIT license, allowing for broad usage and modification. With nearly 60 GitHub stars, it shows early but solid community trust and engagement from the data science field.

Verdict
Safe to use.
SUMMARY

🔬 A Researcher- & Agent-Friendly Framework for Time Series Analysis. Train Any Model on Any Dataset!

README.md

A Researcher- & Agent-Friendly Framework for Time Series Analysis.

Train Any Model on Any Dataset.

📊 The time series analysis leaderboard is now available on our 🤗 Hugging Face space. Discover the performance of different models!


This is also the official repository for the following paper:

  • Learning Recursive Multi-Scale Representations for Irregular Multivariate Time Series Forecasting (ICLR 2026) [OpenReview] [arXiv]

    @inproceedings{li_LearningRecursiveMultiScale_2026,
    author = {Li, Boyuan and Liu, Zhen and Luo, Yicheng and Ma, Qianli},
        booktitle = {International Conference on Learning Representations},
        title = {Learning Recursive Multi-Scale Representations for Irregular Multivariate Time Series Forecasting},
        year = {2026}
    }
    
  • HyperIMTS: Hypergraph Neural Network for Irregular Multivariate Time Series Forecasting (ICML 2025) [poster] [OpenReview] [arXiv]

    @inproceedings{li_HyperIMTSHypergraphNeural_2025,
        author = {Li, Boyuan and Luo, Yicheng and Liu, Zhen and Zheng, Junhao and Lv, Jianming and Ma, Qianli},
        booktitle = {Forty-Second International Conference on Machine Learning},
        title = {HyperIMTS: Hypergraph Neural Network for Irregular Multivariate Time Series Forecasting},
        year = {2025}
    }
    

1. ✨ Highlighted Features

  • Extensibility: Adapt your model/dataset once, train almost any combination of "model" $\times$ "dataset" $\times$ "loss function".
  • Compatibility: Accepts models with any number/type of arguments in forward(); accepts datasets with any number/type of return values in __getitem__(); accepts tailored loss calculations for specific models.
  • Maintainability: No need to worry about breaking the training codes of existing models/datasets/loss functions when adding new ones.
  • Reproducibility: Minimal library dependencies for core components; we do our best to avoid heavyweight third-party libraries (e.g., PyTorch Lightning, EasyTorch).
  • Efficiency: Multi-GPU parallel training; Python built-in logger; structured experimental result saving (json)...
  • Transferability: Even if you don't like our framework, you can still easily find and copy the models/datasets you want. No overwhelming encapsulation.
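As a sketch of the Efficiency bullet's "Python built-in logger" and "structured experimental result saving (json)" items, here is one way such result saving might look. The directory layout, file name, and result keys below are invented for illustration; they are not PyOmniTS's actual output format.

```python
import json
import logging
import pathlib
import tempfile

# Hypothetical illustration (not PyOmniTS's actual layout): log with the
# standard-library logger and persist experiment results as structured JSON.
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
logger = logging.getLogger("experiment")

# Invented result record: model/dataset identifiers plus metric values.
result = {"model": "DLinear", "dataset": "ETTh1", "seed": 0, "metrics": {"MSE": 0.123}}

out_dir = pathlib.Path(tempfile.mkdtemp())          # stand-in for a results dir
out_path = out_dir / "DLinear_ETTh1_seed0.json"
out_path.write_text(json.dumps(result, indent=2))   # structured, diff-friendly
logger.info("saved experiment results to %s", out_path)
```

Saving one JSON file per (model, dataset, seed) run keeps results machine-readable, so leaderboards or tables can be regenerated by globbing the directory rather than parsing log text.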

2. 🧭 Documentation

Check out the new documentation website.

Using an 🦞 agent? Check out our official PyOmniTS skill on clawhub. Your agent will understand the essentials of our framework and can even automate code replication by adapting other papers' code into PyOmniTS!

3. 🤖 Models

51 models, covering regular, irregular, pretrained, and traffic models, have been included in PyOmniTS, and more are coming.

Model classes can be found in models/, and their dependencies in layers/.

  • ✅: supported
  • ❌: not supported
  • '-': not implemented
  • MTS: regularly sampled multivariate time series
  • IMTS: able to handle irregularly sampled multivariate time series
| Model | Venue | Type | Forecasting | Classification | Imputation |
| --- | --- | --- | --- | --- | --- |
| Ada-MSHyper | NeurIPS 2024 | MTS | ✅ | ✅ | ✅ |
| APN | AAAI 2026 | IMTS | ✅ | - | - |
| Autoformer | NeurIPS 2021 | MTS | ✅ | ✅ | ✅ |
| Scaleformer | ICLR 2023 | MTS | ✅ | - | ✅ |
| BigST | VLDB 2024 | MTS | ✅ | ✅ | ✅ |
| Crossformer | ICLR 2023 | MTS | ✅ | ✅ | ✅ |
| CRU | ICML 2022 | IMTS | ✅ | ❌ | ✅ |
| DLinear | AAAI 2023 | MTS | ✅ | ✅ | ✅ |
| ETSformer | arXiv 2022 | MTS | ✅ | ✅ | ✅ |
| FEDformer | ICML 2022 | MTS | ✅ | ✅ | ✅ |
| FiLM | NeurIPS 2022 | MTS | ✅ | ✅ | ✅ |
| FourierGNN | NeurIPS 2023 | MTS | ✅ | ✅ | ✅ |
| FreTS | NeurIPS 2023 | MTS | ✅ | ✅ | ✅ |
| GNeuralFlow | NeurIPS 2024 | IMTS | ✅ | ❌ | ✅ |
| GraFITi | AAAI 2024 | IMTS | ✅ | ✅ | ✅ |
| GRU-D | Scientific Reports 2018 | IMTS | ✅ | ✅ | ✅ |
| HD-TTS | ICML 2024 | IMTS | ✅ | - | ✅ |
| Hi-Patch | ICML 2025 | IMTS | ✅ | ✅ | ✅ |
| higp | ICML 2024 | MTS | ✅ | ✅ | ✅ |
| HyperIMTS | ICML 2025 | IMTS | ✅ | - | ✅ |
| Informer | AAAI 2021 | MTS | ✅ | ✅ | ✅ |
| iTransformer | ICLR 2024 | MTS | ✅ | ✅ | ✅ |
| Koopa | NeurIPS 2023 | MTS | ✅ | ❌ | ✅ |
| Latent_ODE | NeurIPS 2019 | IMTS | ✅ | ❌ | ✅ |
| Leddam | ICML 2024 | MTS | ✅ | ✅ | ✅ |
| LightTS | arXiv 2022 | MTS | ✅ | ✅ | ✅ |
| Mamba | Language Modeling 2024 | MTS | ✅ | ✅ | ✅ |
| MICN | ICLR 2023 | MTS | ✅ | ✅ | ✅ |
| MOIRAI | ICML 2024 | Any | ✅ | - | ✅ |
| mTAN | ICLR 2021 | IMTS | ✅ | ✅ | ✅ |
| NeuralFlows | NeurIPS 2021 | IMTS | ✅ | ❌ | ✅ |
| NHITS | AAAI 2023 | MTS | ✅ | - | ✅ |
| Nonstationary Transformer | NeurIPS 2022 | MTS | ✅ | ✅ | ✅ |
| PatchTST | ICLR 2023 | MTS | ✅ | ✅ | ✅ |
| Pathformer | ICLR 2024 | MTS | ✅ | - | ✅ |
| PrimeNet | AAAI 2023 | IMTS | ✅ | ✅ | ✅ |
| Pyraformer | ICLR 2022 | MTS | ✅ | ✅ | ✅ |
| Raindrop | ICLR 2022 | IMTS | ✅ | ✅ | ✅ |
| Reformer | ICLR 2020 | MTS | ✅ | ✅ | ✅ |
| ReIMTS | ICLR 2026 | IMTS | ✅ | ✅ | - |
| SeFT | ICML 2020 | IMTS | ✅ | ✅ | ✅ |
| SegRNN | arXiv 2023 | MTS | ✅ | ✅ | ✅ |
| Temporal Fusion Transformer | arXiv 2019 | MTS | ✅ | - | - |
| TiDE | TMLR 2023 | MTS | ✅ | ✅ | ✅ |
| TimeCHEAT | AAAI 2025 | MTS | ✅ | ✅ | ✅ |
| TimeMixer | ICLR 2024 | MTS | ✅ | ✅ | ✅ |
| TimesNet | ICLR 2023 | MTS | ✅ | ✅ | ✅ |
| tPatchGNN | ICML 2024 | IMTS | ✅ | ✅ | ✅ |
| Transformer | NeurIPS 2017 | MTS | ✅ | ✅ | ✅ |
| TSMixer | TMLR 2023 | MTS | ✅ | ✅ | ✅ |
| Warpformer | KDD 2023 | IMTS | ✅ | ✅ | ✅ |

4. 💾 Datasets

Dataset classes are located in data/data_provider/datasets, and their dependencies can be found in data/dependencies.

11 datasets, covering regular and irregular ones, have been included in PyOmniTS, and more are coming.

  • ✅: supported
  • ❌: not supported
  • '-': not implemented
  • MTS: regularly sampled multivariate time series
  • IMTS: irregularly sampled multivariate time series
| Dataset | Type | Field | Forecasting |
| --- | --- | --- | --- |
| ECL | MTS | electricity | ✅ |
| ETTh1 | MTS | electricity | ✅ |
| ETTm1 | MTS | electricity | ✅ |
| Human Activity | IMTS | biomechanics | ✅ |
| ILI | MTS | healthcare | ✅ |
| MIMIC III | IMTS | healthcare | ✅ |
| MIMIC IV | IMTS | healthcare | ✅ |
| PhysioNet'12 | IMTS | healthcare | ✅ |
| Traffic | MTS | traffic | ✅ |
| USHCN | IMTS | weather | ✅ |
| Weather | MTS | weather | ✅ |

Datasets for classification and imputation have not been released yet.

5. 📉 Loss Functions

The following loss functions are included under loss_fns/:

| Loss Function | Task | Note |
| --- | --- | --- |
| CrossEntropyLoss | Classification | - |
| MAE | Forecasting/Imputation | - |
| ModelProvidedLoss | - | Some models prefer to calculate loss within forward(), such as GNeuralFlows. |
| MSE_Dual | Forecasting/Imputation | - |
| MSE | Forecasting/Imputation | - |
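To illustrate the ModelProvidedLoss entry, here is a minimal sketch (with invented names, not the actual PyOmniTS code) of a loss "function" that simply unwraps a loss already computed inside the model's forward(), contrasted with an ordinary loss computed from predictions and targets.

```python
# Hypothetical sketch of the ModelProvidedLoss idea: models such as
# GNeuralFlows compute their loss inside forward() and return it next to
# the predictions, so the "loss function" just unwraps the precomputed
# value instead of recomputing anything.

def mse(pred, target):
    # Ordinary loss: computed from predictions and ground-truth targets.
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def model_provided_loss(model_output, target=None):
    # Deferred loss: the model already computed it during forward(),
    # so the target argument is accepted only for interface uniformity.
    return model_output["loss"]

# What a model with an internal loss might return from forward():
output = {"pred": [1.0, 2.0], "loss": 0.25}
print(model_provided_loss(output))   # 0.25
print(mse([1.0, 2.0], [1.0, 1.0]))   # 0.5
```

Giving both kinds of loss the same call signature is what lets the training loop stay unchanged whether a model brings its own loss or relies on a standard one.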

6. 🚧 Roadmap

PyOmniTS is continuously evolving:

  • More tutorials.
  • Classification support in core components.
  • Imputation support in core components.
  • Optional Python package management via uv.

Yet Another Code Framework?

We encountered the following problems when using existing ones:

  • Argument & return value chaos for models' forward():

    Different models usually take varying numbers and shapes of arguments, especially models from different domains.
    The training logic must be changed to support each of these differences.

  • Return value chaos for datasets' __getitem__():

    Datasets can return varying numbers of tensors in different shapes, which must be aligned one by one with the arguments of models' forward().
    Changes to training logic are also needed to support these differences.

  • Argument & return value chaos for loss functions' forward():

    Loss functions take different types of tensors as input, which must be aligned with the return values of models' forward().

  • Overwhelming dependencies:

    Some existing pipelines are built on heavyweight high-level packages, which reduces the flexibility of code modification.
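The first three problems share one common fix: route data through named fields instead of positional arguments. Here is a minimal plain-Python sketch of that idea, with an invented helper name; it is not PyOmniTS's actual implementation, just the underlying pattern.

```python
import inspect

# Hypothetical sketch of signature-based dispatch: the dataset emits a
# dict of named fields, and the trainer passes each model only the fields
# its forward() declares, so neither side needs to know about the other.

def call_with_matching_args(fn, batch: dict):
    """Call fn with only the batch keys its signature accepts."""
    params = inspect.signature(fn).parameters
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return fn(**batch)  # a forward(**kwargs) accepts everything
    return fn(**{k: v for k, v in batch.items() if k in params})

# Two "models" with incompatible signatures:
def regular_forward(x):
    return sum(x) / len(x)  # ignores timestamps entirely

def irregular_forward(x, timestamps):
    return sum(v * t for v, t in zip(x, timestamps))

# One "dataset" batch serves both, with no changes to the caller:
batch = {"x": [1.0, 2.0, 3.0], "timestamps": [0.0, 0.5, 1.0]}
print(call_with_matching_args(regular_forward, batch))    # 2.0
print(call_with_matching_args(irregular_forward, batch))  # 4.0
```

Because the dispatch key is the argument name rather than the argument position, adding a new model or dataset means agreeing on field names once, never editing the shared training loop.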

Contributors

Ladbaby

πŸ’» πŸ›

Acknowledgement
