Jan. 10, 2022, 2:10 a.m. | Haixu Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long

cs.LG updates on arXiv.org

Extending the forecasting horizon is a critical demand for real applications,
such as extreme-weather early warning and long-term energy consumption
planning. This paper studies the long-term forecasting problem of time series.
Prior Transformer-based models adopt various self-attention mechanisms to
discover long-range dependencies. However, the intricate temporal patterns of
the long-term future prevent the model from finding reliable dependencies.
Moreover, Transformers have to adopt sparse versions of point-wise
self-attention to remain efficient on long series, resulting in an
information utilization bottleneck. …
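To make the bottleneck the abstract points to concrete, here is a minimal
NumPy sketch (an illustration, not the paper's code; all names and sizes are
hypothetical) of point-wise scaled dot-product self-attention. The attention
matrix is L x L, so time and memory grow quadratically with the series
length L, which is why long-series Transformers resort to sparse
approximations of it.

import numpy as np

def self_attention(x, wq, wk, wv):
    # x: (L, d) embedded series; wq, wk, wv: (d, d) projection weights
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(x.shape[1])          # (L, L) matrix: quadratic in L
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                              # point-wise aggregation

rng = np.random.default_rng(0)
L, d = 96, 16                                       # hypothetical sizes
x = rng.standard_normal((L, d))
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(x, wq, wk, wv)                 # shape (96, 16)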

Tags: arxiv, correlation, forecasting, transformers
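The "correlation" tag hints at where the truncated abstract is headed:
discovering dependencies at the series level rather than point by point. As a
hedged sketch under that assumption (the function and series below are
illustrative, not the paper's implementation), the Wiener-Khinchin theorem
lets the autocorrelation over all lags be computed in O(L log L) with an FFT,
cheaper than the O(L^2) point-wise attention above.

import numpy as np

def autocorrelation(x):
    # Wiener-Khinchin: ACF = inverse FFT of the power spectrum, O(L log L).
    x = x - x.mean()
    f = np.fft.rfft(x, n=2 * len(x))                # zero-pad to avoid circular wrap
    acf = np.fft.irfft(f * np.conj(f))[:len(x)]
    return acf / acf[0]                             # normalize so lag 0 == 1

t = np.arange(192)
series = np.sin(2 * np.pi * t / 24)                 # hypothetical series, period 24
series += 0.1 * np.random.default_rng(1).standard_normal(192)
acf = autocorrelation(series)
lag = 12 + int(np.argmax(acf[12:96]))               # skip the trivial short-lag peak
print(lag)                                          # dominant lag: 24, the period

The dominant autocorrelation lag recovers the series' period, which is the
kind of series-level dependency a point-wise sparse attention can miss.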
