BravenNow
Timer-S1: A Billion-Scale Time Series Foundation Model with Serial Scaling
USA | technology | ✓ Verified - arxiv.org


#Timer-S1 #foundation model #time series #serial scaling #billion-scale #forecasting #AI #sequential data

📌 Key Takeaways

  • Timer-S1 is a Mixture-of-Experts (MoE) time series foundation model with 8.3B total parameters, of which 0.75B are activated per token (see the parameter-accounting sketch after this list), and a context length of 11.5K.
  • It applies Serial Scaling across three dimensions: model architecture, dataset, and training pipeline.
  • Its Serial-Token Prediction objective improves long-term forecasts while avoiding costly rolling-style inference and the error accumulation of standard next-token prediction.
  • On the GIFT-Eval leaderboard, Timer-S1 attains the best MASE and CRPS scores among pre-trained models.
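
The gap between total and activated parameters comes from sparse expert routing: only a few experts run for each token. The snippet below is a minimal accounting sketch under hypothetical numbers; the expert count, expert size, shared size, and top-k are illustrative, not Timer-S1's published configuration.

```python
def moe_param_counts(n_experts: int, params_per_expert: int,
                     shared_params: int, top_k: int) -> tuple[int, int]:
    """Parameter accounting for a sparse Mixture-of-Experts stack.

    'Total' counts every expert; 'activated' counts only the top_k experts
    the router selects for one token, plus the always-on shared layers.
    """
    total = shared_params + n_experts * params_per_expert
    activated = shared_params + top_k * params_per_expert
    return total, activated

# Illustrative numbers only; the real Timer-S1 expert layout is not given here.
# They are chosen so the total lands near the reported 8.3B, with the
# activated count coming out in the same ballpark as the reported 0.75B.
total, activated = moe_param_counts(
    n_experts=32,
    params_per_expert=250_000_000,
    shared_params=300_000_000,
    top_k=2,
)
print(f"total ≈ {total / 1e9:.1f}B, activated ≈ {activated / 1e9:.2f}B per token")
```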

📖 Full Retelling

arXiv:2603.04791v1 Announce Type: new

Abstract: We introduce Timer-S1, a strong Mixture-of-Experts (MoE) time series foundation model with 8.3B total parameters, 0.75B activated parameters for each token, and a context length of 11.5K. To overcome the scalability bottleneck in existing pre-trained time series foundation models, we perform Serial Scaling in three dimensions: model architecture, dataset, and training pipeline. Timer-S1 integrates sparse TimeMoE blocks and generic TimeSTP blocks for Serial-Token Prediction, a generic training objective that adheres to the serial nature of forecasting.
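
The contrast between Serial-Token Prediction and rolling next-token inference is the core of the retelling above. The sketch below illustrates the general idea only: `predict_next` and `predict_block` are hypothetical interfaces, not Timer-S1's API, and the point is simply that rolling decoding feeds its own predictions back into the context while block prediction does not.

```python
from typing import Protocol, Sequence

class Forecaster(Protocol):
    def predict_next(self, context: Sequence[float]) -> float: ...
    def predict_block(self, context: Sequence[float], horizon: int) -> list[float]: ...

def rolling_forecast(model: Forecaster, context: Sequence[float], horizon: int) -> list[float]:
    """Standard next-token prediction: one model call per future step.

    Each prediction is appended to the context, so an early error is
    consumed by every later step (error accumulation).
    """
    ctx = list(context)
    preds: list[float] = []
    for _ in range(horizon):
        nxt = model.predict_next(ctx)  # hypothetical single-step call
        preds.append(nxt)
        ctx.append(nxt)                # prediction re-enters the input
    return preds

def block_forecast(model: Forecaster, context: Sequence[float], horizon: int) -> list[float]:
    """Serial/multi-token style: the whole horizon comes from one pass,
    so no intermediate prediction is fed back as input.
    """
    return model.predict_block(context, horizon)  # hypothetical multi-step call
```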

🏷️ Themes

AI Scaling, Time Series

📚 Related People & Topics

Artificial intelligence (intelligence of machines)

**Artificial Intelligence (AI)** is a specialized field of computer science dedicated to the development and study of computational systems capable of performing tasks typically associated with human intelligence. These tasks include learning, reasoning, problem-solving...

Entity Intersection Graph

Connections for Artificial intelligence:

🏢 OpenAI 14 shared
🌐 Reinforcement learning 4 shared
🏢 Anthropic 4 shared
🌐 Large language model 3 shared
🏢 Nvidia 3 shared

Mentioned Entities

Artificial intelligence (intelligence of machines)

Original Source
Computer Science > Artificial Intelligence
arXiv:2603.04791 [Submitted on 5 Mar 2026]

Title: Timer-S1: A Billion-Scale Time Series Foundation Model with Serial Scaling
Authors: Yong Liu, Xingjian Su, Shiyu Wang, Haoran Zhang, Haixuan Liu, Yuxuan Wang, Zhou Ye, Yang Xiang, Jianmin Wang, Mingsheng Long

Abstract: We introduce Timer-S1, a strong Mixture-of-Experts time series foundation model with 8.3B total parameters, 0.75B activated parameters for each token, and a context length of 11.5K. To overcome the scalability bottleneck in existing pre-trained time series foundation models, we perform Serial Scaling in three dimensions: model architecture, dataset, and training pipeline. Timer-S1 integrates sparse TimeMoE blocks and generic TimeSTP blocks for Serial-Token Prediction, a generic training objective that adheres to the serial nature of forecasting. The proposed paradigm introduces serial computations to improve long-term predictions while avoiding costly rolling-style inference and pronounced error accumulation in the standard next-token prediction. Pursuing a high-quality and unbiased training dataset, we curate TimeBench, a corpus with one trillion time points, and apply meticulous data augmentation to mitigate predictive bias. We further pioneer a post-training stage, including continued pre-training and long-context extension, to enhance short-term and long-context performance. Evaluated on the large-scale GIFT-Eval leaderboard, Timer-S1 achieves state-of-the-art forecasting performance, attaining the best MASE and CRPS scores as a pre-trained model. Timer-S1 will be released to facilitate further research.

Subjects: Artificial Intelligence (cs.AI)
Cite as: arXiv:2603.04791 [cs.AI] (arXiv:2603.04791v1 for this version)
DOI: https://doi.org/10.48550/arXiv.2603.04791
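
The abstract reports the best MASE and CRPS scores on GIFT-Eval. For readers unfamiliar with the former, here is a minimal sketch of the standard mean absolute scaled error for a single series; the function and the toy numbers are illustrative, not taken from the paper or the benchmark harness.

```python
import numpy as np

def mase(y_true, y_pred, y_train, season: int = 1) -> float:
    """Mean Absolute Scaled Error for a single series.

    Scales the forecast MAE by the in-sample MAE of a seasonal naive
    forecast (lag = season); values below 1 beat the naive baseline.
    """
    y_true, y_pred, y_train = map(np.asarray, (y_true, y_pred, y_train))
    forecast_mae = np.mean(np.abs(y_true - y_pred))
    naive_mae = np.mean(np.abs(y_train[season:] - y_train[:-season]))
    return float(forecast_mae / naive_mae)

# Toy example: 12 observed points, a 3-step forecast.
history = np.arange(12, dtype=float)
future = np.array([12.0, 13.0, 14.0])
prediction = np.array([11.5, 13.2, 14.4])
print(mase(future, prediction, history))  # ~0.37, i.e. better than naive
```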

Source

arxiv.org
