ChunkFormer: Learning Long Time Series with Multi-stage Chunked Transformer. (arXiv:2112.15087v1 [cs.LG])
Jan. 3, 2022, 2:10 a.m. | Yue Ju, Alka Isac, Yimin Nie
cs.LG updates on arXiv.org (arxiv.org)
The analysis of long sequence data remains challenging in many real-world
applications. We propose a novel architecture, ChunkFormer, that improves on
the existing Transformer framework to address the challenges of modeling long
time series. Original Transformer-based models adopt an attention mechanism
that discovers global information along a sequence to leverage contextual data.
In long sequential data, however, local information such as seasonality and
fluctuations is trapped within short sub-sequences. In addition, the original
Transformer consumes more resources by carrying the entire …
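The excerpt above motivates restricting attention to local chunks rather than the full sequence. The paper's exact multi-stage mechanism is not shown here, so the following is only a minimal sketch of within-chunk self-attention (single head, no learned projections, names like `chunked_self_attention` are illustrative, not from the paper): each fixed-size chunk attends only to itself, which captures short-range structure such as seasonality and cuts the cost of attention from O(L²) in the sequence length L to O(L · chunk_size).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def chunked_self_attention(x, chunk_size):
    """Toy within-chunk self-attention (illustrative, not the paper's model).

    x          : (L, d) sequence of L feature vectors of dimension d
    chunk_size : tokens attend only to others in the same chunk
    """
    L, d = x.shape
    assert L % chunk_size == 0, "sequence length must be divisible by chunk_size"
    out = np.empty_like(x)
    for start in range(0, L, chunk_size):
        c = x[start:start + chunk_size]        # one local chunk, (chunk_size, d)
        scores = c @ c.T / np.sqrt(d)          # scaled dot-product scores within the chunk
        out[start:start + chunk_size] = softmax(scores) @ c
    return out
```

A full global attention over the same input would compute an (L, L) score matrix; here each chunk only ever materializes a (chunk_size, chunk_size) matrix, which is what makes chunking attractive for long series.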