April 25, 2024, 7:42 p.m. | Aobo Liang, Xingguo Jiang, Yan Sun, Chang Lu

cs.LG updates on arXiv.org

arXiv:2404.15772v1 Announce Type: new
Abstract: Long-term time series forecasting (LTSF) provides longer insights into future trends and patterns. In recent years, deep learning models especially Transformers have achieved advanced performance in LTSF tasks. However, the quadratic complexity of Transformers rises the challenge of balancing computaional efficiency and predicting performance. Recently, a new state space model (SSM) named Mamba is proposed. With the selective capability on input data and the hardware-aware parallel computing algorithm, Mamba can well capture long-term dependencies while …
