Time Series Large Models are foundation models designed specifically for time series analysis. The IoTDB team develops Timer, an independently developed time series foundation model based on the Transformer architecture and pre-trained on massive multi-domain time series data; it supports downstream tasks such as time series forecasting, anomaly detection, and time series imputation. The team's AINode platform also supports integrating cutting-edge time series foundation models from industry, giving users a diverse choice of models. Unlike traditional time series analysis techniques, these large models have general-purpose feature extraction capabilities and can serve a wide range of analytical tasks through zero-shot analysis, fine-tuning, and other services.
All technical achievements in time series large models covered by this document (both the team's self-developed models and leading industry directions) have been published at top international machine-learning conferences; see the appendix for details.
Timer (a non-built-in model) not only demonstrates excellent few-shot generalization and multi-task adaptability, but also acquires a rich knowledge base through pre-training, giving it general-purpose capabilities for diverse downstream tasks. Its characteristics are as follows:
Timer-XL extends and upgrades Timer's network architecture, achieving comprehensive breakthroughs along multiple dimensions:
Timer-Sundial is a family of generative foundation models focused on time series forecasting. The base version has 128 million parameters and was pre-trained on 1 trillion time points. Its core characteristics are:
Chronos-2 is a universal time series foundation model developed by the Amazon Web Services (AWS) research team, evolving the discrete-token modeling paradigm of the original Chronos. It supports both zero-shot univariate forecasting and forecasting with covariates. Its main characteristics include:
Time Series Large Models adapt to real time series data from many different domains and scenarios, and perform well across a variety of tasks. The following shows their actual performance on different datasets:
Time Series Forecasting:
Leveraging the forecasting capability of Time Series Large Models, future trends of a time series can be predicted accurately. In the figure below, the blue curve is the predicted trend and the red curve is the actual trend; the two closely coincide.
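To make the forecasting interface concrete, here is a minimal sketch in which a seasonal-naive baseline stands in for a foundation model's `forecast(context) -> horizon` behavior. The function name and interface are illustrative only; they are not the actual Timer or Sundial API.

```python
# Minimal sketch: a seasonal-naive baseline standing in for a time series
# foundation model's forecast interface. Illustrative only, not the real API.

def seasonal_naive_forecast(context, horizon, season=24):
    """Repeat the last observed season to predict the next `horizon` points."""
    if len(context) < season:
        season = len(context)            # fall back to repeating what we have
    last_season = context[-season:]
    return [last_season[i % season] for i in range(horizon)]

history = [10, 12, 15, 13] * 6           # 24 points with period 4
pred = seasonal_naive_forecast(history, horizon=8, season=4)
print(pred)                              # [10, 12, 15, 13, 10, 12, 15, 13]
```

A real foundation model replaces the "repeat the last season" rule with learned temporal representations, but consumes and produces the same shape of data: a context window in, a forecast horizon out.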
Data Imputation:
Using a Time Series Large Model, missing data segments are filled through predictive imputation: the model forecasts over each gap from the surrounding context.
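The predictive-imputation idea can be sketched as follows: locate each run of missing values, then fill it with forecasts produced from the data preceding the gap. A trivial persistence forecaster stands in for the large model here; the helper names are hypothetical, not part of any IoTDB API.

```python
# Sketch of predictive imputation: fill each run of None values by
# forecasting forward from the observed context before the gap.
# Assumes the series starts with at least one observed value.

def impute_with_forecast(series, forecaster):
    """Replace None runs using forecaster(context, gap_length) -> predictions."""
    filled = list(series)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            start = i
            while i < len(filled) and filled[i] is None:
                i += 1                   # advance to the end of the gap
            context = [v for v in filled[:start] if v is not None]
            filled[start:i] = forecaster(context, i - start)
        else:
            i += 1
    return filled

# A trivial persistence forecaster stands in for the large model.
persistence = lambda ctx, h: [ctx[-1]] * h
print(impute_with_forecast([1.0, 2.0, None, None, 5.0], persistence))
# [1.0, 2.0, 2.0, 2.0, 5.0]
```

Swapping `persistence` for a foundation model's forecast call turns this naive fill into model-based imputation without changing the gap-handling logic.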
Anomaly Detection:
Using a Time Series Large Model, outliers that deviate significantly from the normal trend can be identified accurately.
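Prediction-based anomaly detection follows a common pattern: compare each observation against what the recent context suggests it should be, and flag large deviations. The sketch below uses a rolling mean and standard deviation as a crude stand-in for a foundation model's one-step forecast; the threshold rule and function name are assumptions for illustration.

```python
import statistics

def detect_anomalies(series, window=5, k=3.0):
    """Flag indices whose value deviates more than k standard deviations
    from the mean of the preceding window (a crude stand-in for comparing
    against a foundation model's one-step-ahead forecast)."""
    anomalies = []
    for i in range(window, len(series)):
        ctx = series[i - window:i]
        mu = statistics.fmean(ctx)
        sigma = statistics.pstdev(ctx) or 1e-9   # avoid zero-division on flat context
        if abs(series[i] - mu) > k * sigma:
            anomalies.append(i)
    return anomalies

data = [1, 1, 1, 1, 1, 1, 9, 1, 1, 1]
print(detect_anomalies(data, window=5, k=3.0))   # [6]
```

Replacing the rolling-mean "expected value" with a learned model's forecast is what lets a foundation model catch anomalies that simple statistics miss, such as points that break a seasonal pattern while staying inside the global value range.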
```
IoTDB> show cluster
+------+----------+-------+---------------+------------+--------------+-----------+
|NodeID|  NodeType| Status|InternalAddress|InternalPort|       Version|  BuildInfo|
+------+----------+-------+---------------+------------+--------------+-----------+
|     0|ConfigNode|Running|      127.0.0.1|       10710|       2.0.5.1|    069354f|
|     1|  DataNode|Running|      127.0.0.1|       10730|       2.0.5.1|    069354f|
|     2|    AINode|Running|      127.0.0.1|       10810|       2.0.5.1|069354f-dev|
+------+----------+-------+---------------+------------+--------------+-----------+
Total line number = 3
It costs 0.140s
```
In a networked environment, AINode automatically pulls the Timer-XL, Sundial, and Chronos-2 models on its first startup.
Note:
- The AINode installation package does not include model weight files.
- The automatic pull feature depends on the deployment environment having HuggingFace network access capability.
- AINode supports manual upload of model weight files. For specific operation methods, refer to Importing Weight Files.
Check that the models are available:
```
IoTDB> show models
+---------------------+---------+--------+--------+
|              ModelId|ModelType|Category|   State|
+---------------------+---------+--------+--------+
|                arima|   sktime| builtin|  active|
|          holtwinters|   sktime| builtin|  active|
|exponential_smoothing|   sktime| builtin|  active|
|     naive_forecaster|   sktime| builtin|  active|
|       stl_forecaster|   sktime| builtin|  active|
|         gaussian_hmm|   sktime| builtin|  active|
|              gmm_hmm|   sktime| builtin|  active|
|                stray|   sktime| builtin|  active|
|             timer_xl|    timer| builtin|  active|
|              sundial|  sundial| builtin|  active|
|             chronos2|       t5| builtin|  active|
+---------------------+---------+--------+--------+
```
[1] Timer: Generative Pre-trained Transformers Are Large Time Series Models, Yong Liu, Haoran Zhang, Chenyu Li, Xiangdong Huang, Jianmin Wang, Mingsheng Long.
[2] Timer-XL: Long-Context Transformers for Unified Time Series Forecasting, Yong Liu, Guo Qin, Xiangdong Huang, Jianmin Wang, Mingsheng Long.
[3] Sundial: A Family of Highly Capable Time Series Foundation Models, Yong Liu, Guo Qin, Zhiyuan Shi, Zhi Chen, Caiyin Yang, Xiangdong Huang, Jianmin Wang, Mingsheng Long, ICML 2025 spotlight.
[4] Chronos-2: From Univariate to Universal Forecasting, Abdul Fatir Ansari, Oleksandr Shchur, Jaris Küken, Andreas Auer, Boran Han, Pedro Mercado, Syama Sundar Rangapuram, Huibin Shen, Lorenzo Stella, Xiyuan Zhang, Mononito Goswami, Shubham Kapoor, Danielle C. Maddix, Pablo Guerron, Tony Hu, Junming Yin, Nick Erickson, Prateek Mutalik Desai, Hao Wang, Huzefa Rangwala, George Karypis, Yuyang Wang, Michael Bohlke-Schneider, arXiv:2510.15821.