LLM Reasoning relevance: 6/10

Baguan-TS: A Sequence-Native In-Context Learning Model for Time Series Forecasting with Covariates

Linxiao Yang, Xue Jiang, Gezheng Xu, Tian Zhou, Min Yang, ZhaoYang Zhu, Linyuan Geng, Zhipeng Zeng, Qiming Chen, Xinyue Gu, Rong Jin, Liang Sun
arXiv: 2603.17439v1 Published: 2026-03-18 Updated: 2026-03-18

AI Summary

Baguan-TS leverages a 3D Transformer and in-context learning to improve time series forecasting with covariates.

Key Contributions

  • Proposes the Baguan-TS model, unifying sequence representation learning with in-context learning
  • Introduces a target-space retrieval-based local calibration method, improving training stability and accuracy
  • Proposes a context-overfitting strategy to mitigate output oversmoothing
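The target-space retrieval calibration above can be illustrated with a minimal sketch. All names here (`retrieval_calibrate`, `past_preds`, `past_targets`) are illustrative assumptions, not the paper's API: the idea, per the abstract, is that retrieval happens in target space rather than over hand-crafted features, so we find the k past forecasts closest to the current one and apply their mean residual as a local correction.

```python
import numpy as np

def retrieval_calibrate(pred, past_preds, past_targets, k=5):
    """Hedged sketch of feature-agnostic, target-space retrieval calibration:
    retrieve the k past forecasts nearest to the current forecast (distance
    measured in the target/forecast space itself, not in a feature space)
    and shift the forecast by their mean residual."""
    dists = np.linalg.norm(past_preds - pred, axis=1)  # distances in target space
    nearest = np.argsort(dists)[:k]                    # indices of k closest past forecasts
    correction = (past_targets[nearest] - past_preds[nearest]).mean(axis=0)
    return pred + correction

# Toy check: if every past target exceeded its forecast by exactly 1.0,
# the calibrated forecast is shifted up by exactly 1.0.
rng = np.random.default_rng(0)
past_preds = rng.normal(size=(20, 4))
past_targets = past_preds + 1.0
pred = rng.normal(size=4)
calibrated = retrieval_calibrate(pred, past_preds, past_targets)
print(np.allclose(calibrated, pred + 1.0))  # True
```

In the paper's setting this correction would be applied to the model's raw forecasts; the sketch only shows why a purely target-space neighborhood suffices to estimate a local bias.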

Methodology

Builds a 3D Transformer that attends jointly over the temporal, variable, and context axes. Retrieval-based calibration and a context-overfitting strategy are used to optimize the model.
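One common way to attend over three axes of a tensor is factorized (axial) attention: apply self-attention along each axis in turn. The sketch below assumes this factorization; the paper does not specify whether its 3D Transformer factorizes or attends jointly, and the projections are identity maps for brevity (a real model learns Wq, Wk, Wv).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def axial_attention(x, axis):
    """Single-head self-attention mixing positions along one axis of x.
    x has shape (..., d); tokens along `axis` attend to each other."""
    x_m = np.moveaxis(x, axis, -2)                 # (..., n_axis, d)
    d = x_m.shape[-1]
    q, k, v = x_m, x_m, x_m                        # identity projections (sketch only)
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(d)
    out = softmax(scores, -1) @ v                  # weighted mix along the axis
    return np.moveaxis(out, -2, axis)

# Toy input: (context examples, variables, time steps, model dim)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3, 8, 16))
for ax in (2, 1, 0):                               # temporal, variable, context axes
    x = x + axial_attention(x, ax)                 # residual connection per axis
print(x.shape)
```

Each pass mixes information along exactly one axis while the shape is preserved, so stacking the three passes lets every position see its history (temporal), its co-observed covariates (variable), and the other in-context examples (context).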

Original Abstract

Transformers enable in-context learning (ICL) for rapid, gradient-free adaptation in time series forecasting, yet most ICL-style approaches rely on tabularized, hand-crafted features, while end-to-end sequence models lack inference-time adaptation. We bridge this gap with a unified framework, Baguan-TS, which integrates raw-sequence representation learning with ICL, instantiated by a 3D Transformer that attends jointly over temporal, variable, and context axes. To make this high-capacity model practical, we tackle two key hurdles: (i) calibration and training stability, improved with a feature-agnostic, target-space retrieval-based local calibration; and (ii) output oversmoothing, mitigated via a context-overfitting strategy. On a public benchmark with covariates, Baguan-TS consistently outperforms established baselines, achieving the highest win rate and significant reductions in both point and probabilistic forecasting metrics. Further evaluations across diverse real-world energy datasets demonstrate its robustness, yielding substantial improvements.

Tags

Time Series Forecasting · Transformer · In-Context Learning · Covariates

arXiv Categories

cs.LG cs.AI