Aug. 26, 2022, 1:11 a.m. | Amin Shabani, Amir Abdi, Lili Meng, Tristan Sylvain

cs.LG updates on arXiv.org

The performance of time series forecasting has recently been greatly improved
by the introduction of transformers. In this paper, we propose a general
multi-scale framework that can be applied to state-of-the-art transformer-based
time series forecasting models, including Autoformer and Informer. By
iteratively refining a forecasted time series at multiple scales with shared
weights, introducing architecture adaptations, and applying a
specially-designed normalization scheme, we are able to achieve significant
performance improvements with minimal additional computational overhead. Via
detailed ablation studies, we demonstrate the …
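
The paper's exact architecture is not shown in this excerpt, so the following is a minimal PyTorch sketch of the coarse-to-fine idea the abstract describes: downsample the history, forecast at the coarsest scale, then repeatedly upsample and refine the forecast with a single shared module, normalizing inputs at each scale. The names multiscale_refine and toy_forecaster, the pooling/interpolation choices, and the mean-centering normalization are illustrative assumptions, not the paper's method (the paper uses transformer forecasters such as Autoformer or Informer and its own normalization scheme).

    import torch
    import torch.nn.functional as F

    def multiscale_refine(history, forecaster, scales=(16, 4, 1), horizon=24):
        # history: (batch, length) tensor of past observations.
        # forecaster: a single module shared across all scales (hypothetical
        # signature: forecaster(normalized_history, upsampled_prev) -> forecast).
        batch = history.size(0)
        prev = torch.zeros(batch, max(horizon // scales[0], 1))
        for s in scales:
            # Downsample the history to the current scale by average pooling.
            hist_s = (F.avg_pool1d(history.unsqueeze(1), kernel_size=s).squeeze(1)
                      if s > 1 else history)
            h = max(horizon // s, 1)
            # Upsample the coarser forecast so it can be refined at this scale.
            prev = F.interpolate(prev.unsqueeze(1), size=h,
                                 mode="linear", align_corners=False).squeeze(1)
            # Per-series mean-centering: a simple stand-in for the paper's
            # specially-designed normalization scheme (assumption).
            mu = hist_s.mean(dim=1, keepdim=True)
            prev = forecaster(hist_s - mu, prev - mu) + mu
        return prev

    def toy_forecaster(hist, prev):
        # Placeholder for a shared transformer forecaster: damped persistence
        # of the last observation, used here only to make the sketch runnable.
        return prev + 0.5 * (hist[:, -1:] - prev)

    history = torch.randn(8, 96).cumsum(dim=1)   # batch of 8 random-walk series
    forecast = multiscale_refine(history, toy_forecaster)
    print(forecast.shape)                        # torch.Size([8, 24])

The design intuition behind this data flow: coarse scales capture low-frequency structure cheaply, so each finer pass only needs to correct residual detail, which is how a single shared model can improve accuracy with little extra computational overhead.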
