Multi-resolution Time-Series Transformer for Long-term Forecasting
March 25, 2024, 4:42 a.m. | Yitian Zhang, Liheng Ma, Soumyasundar Pal, Yingxue Zhang, Mark Coates
cs.LG updates on arXiv.org
Abstract: The performance of transformers for time-series forecasting has improved significantly. Recent architectures learn complex temporal patterns by segmenting a time-series into patches and using the patches as tokens. The patch size controls the ability of transformers to learn the temporal patterns at different frequencies: shorter patches are effective for learning localized, high-frequency patterns, whereas mining long-term seasonalities and trends requires longer patches. Inspired by this observation, we propose a novel framework, Multi-resolution Time-Series Transformer (MTST), …
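To make the patching idea concrete, here is a minimal sketch (not the authors' code) of multi-resolution patch tokenization: the same series is segmented with several patch sizes, and each patch is linearly projected into one token, yielding a token sequence per resolution. The class name, the patch sizes (8, 32, 96), and the embedding width are illustrative assumptions, not values from the paper.

import torch
import torch.nn as nn

class MultiResolutionPatchEmbed(nn.Module):
    """Hypothetical helper: tokenize a univariate series at several patch sizes."""
    def __init__(self, patch_sizes=(8, 32, 96), d_model=128):
        super().__init__()
        self.patch_sizes = patch_sizes
        # One linear projection per resolution: a patch of length p -> one d_model token.
        self.projections = nn.ModuleList(
            nn.Linear(p, d_model) for p in patch_sizes
        )

    def forward(self, x):
        # x: (batch, seq_len); assumes seq_len is divisible by every patch size.
        tokens = []
        for p, proj in zip(self.patch_sizes, self.projections):
            # Non-overlapping patches: (batch, seq_len // p, p)
            patches = x.unfold(dimension=1, size=p, step=p)
            tokens.append(proj(patches))  # (batch, seq_len // p, d_model)
        return tokens  # one token sequence per resolution

x = torch.randn(4, 384)  # batch of 4 series, length 384
for p, t in zip((8, 32, 96), MultiResolutionPatchEmbed()(x)):
    print(f"patch size {p}: token sequence shape {tuple(t.shape)}")

Small patch sizes produce many short tokens that track localized, high-frequency behavior; large patch sizes produce few long tokens that summarize trends and seasonalities, which is the trade-off the abstract describes MTST exploiting across parallel resolutions.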