Scaleformer: Iterative Multi-scale Refining Transformers for Time Series Forecasting. (arXiv:2206.04038v2 [cs.LG] UPDATED)
cs.LG updates on arXiv.org
The performance of time series forecasting has recently been greatly improved
by the introduction of transformers. In this paper, we propose a general
multi-scale framework that can be applied to state-of-the-art transformer-based
time series forecasting models, including Autoformer and Informer. By
iteratively refining a forecasted time series at multiple scales with shared
weights, architecture adaptations, and a specially designed normalization
scheme, we are able to achieve significant performance improvements with
minimal additional computational overhead. Via detailed ablation studies, we
demonstrate the …
Tags: arxiv, cs.LG, forecasting, iterative, multi-scale, time series, time series forecasting, transformers
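The coarse-to-fine refinement loop the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the downsampling operator, the zero-initialized coarsest prior, the normalization scheme, and the `forecaster(history, prior, horizon)` interface (a stand-in for the shared-weight transformer) are all assumptions.

```python
import numpy as np

def moving_avg_downsample(x, factor):
    # Average-pool the series by `factor` (a common choice; the paper's
    # exact downsampling operator is an assumption here).
    n = len(x) // factor * factor
    return x[:n].reshape(-1, factor).mean(axis=1)

def upsample(x, factor):
    # Nearest-neighbour upsampling back to a finer scale.
    return np.repeat(x, factor)

def multiscale_refine(history, horizon, forecaster, scales=(8, 4, 2, 1)):
    """Refine a forecast from the coarsest to the finest scale, reusing
    one shared `forecaster(history, prior_forecast, horizon)` callable
    at every scale (hypothetical interface)."""
    prev, prev_scale = None, None
    for s in scales:  # coarse -> fine
        hor_s = horizon // s
        hist_s = moving_avg_downsample(history, s)
        if prev is None:
            prior = np.zeros(hor_s)          # no forecast yet at the coarsest scale
        else:
            prior = upsample(prev, prev_scale // s)[:hor_s]
        # Normalize with statistics of the current-scale history -- a crude
        # stand-in for the paper's specially designed normalization scheme.
        mu, sigma = hist_s.mean(), hist_s.std() + 1e-8
        out = forecaster((hist_s - mu) / sigma, (prior - mu) / sigma, hor_s)
        prev, prev_scale = out * sigma + mu, s
    return prev

# Toy "model": persistence of the last observed (normalized) value.
persistence = lambda h, p, n: np.full(n, h[-1])
forecast = multiscale_refine(np.arange(64.0), 16, persistence)
```

With the persistence stand-in, each pass simply carries the last value of the downsampled history forward; a learned model would instead correct the upsampled prior at each scale, which is where the claimed accuracy gains come from.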