Web: http://arxiv.org/abs/2201.12740

June 17, 2022, 1:11 a.m. | Tian Zhou, Ziqing Ma, Qingsong Wen, Xue Wang, Liang Sun, Rong Jin

cs.LG updates on arXiv.org

Although Transformer-based methods have significantly improved
state-of-the-art results for long-term series forecasting, they are not only
computationally expensive but, more importantly, unable to capture the
global view of the time series (e.g., the overall trend). To address these
problems, we propose to combine the Transformer with the seasonal-trend
decomposition method, in which the decomposition method captures the global
profile of the time series while Transformers capture more detailed
structures. To further enhance the performance of the Transformer for
long-term prediction, we exploit the fact …
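The seasonal-trend decomposition mentioned above can be illustrated with a minimal sketch: the trend is a centered moving average of the series, and the seasonal part is the residual. This is only an illustrative reimplementation of the general idea, not the paper's actual module; the function name and kernel size are assumptions.

```python
import numpy as np

def series_decomp(x, kernel_size=25):
    """Split a 1-D series into (seasonal, trend) parts.

    The trend is a centered moving average (edges replicate-padded so the
    output keeps the input length); the seasonal part is the residual.
    """
    pad = (kernel_size - 1) // 2
    # replicate-pad the boundaries so the moving average stays length-n
    padded = np.concatenate([
        np.full(pad, x[0]),
        x,
        np.full(kernel_size - 1 - pad, x[-1]),
    ])
    kernel = np.ones(kernel_size) / kernel_size
    trend = np.convolve(padded, kernel, mode="valid")
    seasonal = x - trend
    return seasonal, trend

# Example: a slow ramp (trend) plus a fast oscillation (seasonality)
t = np.arange(200, dtype=float)
x = 0.05 * t + np.sin(2 * np.pi * t / 10)
seasonal, trend = series_decomp(x)
```

By construction `seasonal + trend` reconstructs the input exactly, and the moving average smooths out the fast oscillation so `trend` tracks the ramp, which is the "global profile" the abstract refers to.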

arxiv forecasting lg long-term transformer
