CLMFormer: Mitigating Data Redundancy to Revitalize Transformer-based Long-Term Time Series Forecasting System
March 5, 2024, 2:44 p.m. | Mingjie Li, Rui Liu, Guangsi Shi, Mingfei Han, Changling Li, Lina Yao, Xiaojun Chang, Ling Chen
cs.LG updates on arXiv.org
Abstract: Long-term time-series forecasting (LTSF) plays a crucial role in many practical applications. The Transformer and its variants have become the de facto backbone for LTSF, offering exceptional capability in processing long sequence data. However, existing Transformer-based models, such as Fedformer and Informer, often achieve their best validation performance after just a few epochs, indicating potential underutilization of the Transformer's capacity. One of the reasons contributing to this overfitting is the data redundancy arising from …
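The abstract is truncated here, so the paper's exact characterization of the redundancy is not shown. One common source of redundancy in LTSF training, and the kind this line plausibly refers to, is rolling-window sampling with stride 1, where adjacent training samples share nearly all of their input points. The sketch below is a hypothetical illustration of that effect only (the function `rolling_windows` and all parameter names are assumptions, not the paper's method):

```python
import numpy as np

def rolling_windows(series, lookback, horizon, stride=1):
    """Slice a 1-D series into (input, target) pairs with a sliding window.

    With stride=1 (the usual LTSF setup), consecutive samples share
    lookback - 1 of their lookback input points, so the resulting
    training set is highly redundant.
    """
    samples = []
    last_start = len(series) - lookback - horizon
    for start in range(0, last_start + 1, stride):
        x = series[start : start + lookback]
        y = series[start + lookback : start + lookback + horizon]
        samples.append((x, y))
    return samples

# Toy series; lookback/horizon values mirror common LTSF benchmarks.
series = np.arange(1000, dtype=float)
pairs = rolling_windows(series, lookback=96, horizon=24)
x0, _ = pairs[0]
x1, _ = pairs[1]
overlap = np.intersect1d(x0, x1).size / x0.size
print(f"{len(pairs)} samples; adjacent inputs overlap by {overlap:.1%}")
# -> 881 samples; adjacent inputs overlap by 99.0%
```

Under these assumed settings, nearly every training sample is a one-step shift of its neighbor, which is consistent with models fitting the training distribution within a few epochs.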