March 25, 2024, 4:42 a.m. | Yitian Zhang, Liheng Ma, Soumyasundar Pal, Yingxue Zhang, Mark Coates

cs.LG updates on arXiv.org

arXiv:2311.04147v2 Announce Type: replace
Abstract: The performance of transformers for time-series forecasting has improved significantly. Recent architectures learn complex temporal patterns by segmenting a time-series into patches and using the patches as tokens. The patch size controls the ability of transformers to learn the temporal patterns at different frequencies: shorter patches are effective for learning localized, high-frequency patterns, whereas mining long-term seasonalities and trends requires longer patches. Inspired by this observation, we propose a novel framework, Multi-resolution Time-Series Transformer (MTST), …

