Long-Range Transformers for Dynamic Spatiotemporal Forecasting. (arXiv:2109.12218v2 [cs.LG] UPDATED)
May 23, 2022, 1:11 a.m. | Jake Grigsby, Zhe Wang, Yanjun Qi
stat.ML updates on arXiv.org arxiv.org
Multivariate Time Series Forecasting focuses on the prediction of future
values based on historical context. State-of-the-art sequence-to-sequence
models rely on neural attention between timesteps, which allows for temporal
learning but fails to consider distinct spatial relationships between
variables. In contrast, methods based on graph neural networks explicitly model
variable relationships. However, these methods often rely on predefined graphs
and perform separate spatial and temporal updates without establishing direct
connections between each variable at every timestep. This paper addresses these
problems …
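The abstract contrasts attention across timesteps (temporal only) with attending over every variable at every timestep jointly. A minimal sketch of that idea: flatten an N-variable, T-timestep series into N*T tokens and run plain scaled dot-product attention over them. This is illustrative only, based on the abstract's description, not the paper's actual model or code; all names and dimensions here are made up.

```python
import numpy as np

def attention(q, k, v):
    # Scaled dot-product attention over a set of tokens.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy multivariate series: N variables, T timesteps, d-dim embeddings.
rng = np.random.default_rng(0)
N, T, d = 3, 4, 8
series = rng.normal(size=(N, T, d))

# Temporal-only attention would attend across timesteps within each variable.
# Flattening to N*T tokens instead lets every variable at every timestep
# attend to every other, modeling spatial and temporal relations jointly.
tokens = series.reshape(N * T, d)
out = attention(tokens, tokens, tokens)
print(out.shape)  # (12, 8)
```

The cost is attention over N*T tokens rather than T, which is why the title emphasizes long-range (efficient) Transformers.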
More from arxiv.org / stat.ML updates on arXiv.org
Estimation Sample Complexity of a Class of Nonlinear Continuous-time Systems
2 days, 20 hours ago
arxiv.org
Jobs in AI, ML, Big Data
Data Architect
@ University of Texas at Austin | Austin, TX
Data ETL Engineer
@ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
Senior Business Intelligence Developer / Analyst
@ Transamerica | Work From Home, USA
Data Analyst (All Levels)
@ Noblis | Bethesda, MD, United States