May 23, 2022, 1:11 a.m. | Jake Grigsby, Zhe Wang, Yanjun Qi

stat.ML updates on arXiv.org

Multivariate Time Series Forecasting focuses on predicting future values from
historical context. State-of-the-art sequence-to-sequence models rely on neural
attention between timesteps, which allows for temporal learning but fails to
consider distinct spatial relationships between variables. In contrast, methods
based on graph neural networks explicitly model variable relationships. However,
these methods often rely on predefined graphs and perform separate spatial and
temporal updates without establishing direct connections between each variable
at every timestep. This paper addresses these problems …
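
The distinction the abstract draws, attention over whole timesteps versus direct connections between every variable at every timestep, can be illustrated with a short sketch in PyTorch. This is a minimal illustration rather than the authors' implementation; the tensor shapes, the linear embeddings, and the shared torch.nn.MultiheadAttention layer are assumptions made here for clarity.

    # Minimal sketch (assumed shapes and layers, not the paper's implementation)
    # contrasting temporal-only attention with flattened spatiotemporal attention.
    import torch
    import torch.nn as nn

    B, T, N, D = 4, 12, 5, 32          # batch, timesteps, variables, model dim
    series = torch.randn(B, T, N)      # multivariate history

    attn = nn.MultiheadAttention(embed_dim=D, num_heads=4, batch_first=True)

    # (1) Temporal attention: each token is one timestep carrying all N variables,
    #     so attention scores relate timesteps but never individual variables.
    temporal_embed = nn.Linear(N, D)
    temporal_tokens = temporal_embed(series)               # (B, T, D)
    temporal_out, _ = attn(temporal_tokens, temporal_tokens, temporal_tokens)

    # (2) Spatiotemporal attention: one token per (timestep, variable) pair, giving
    #     a direct attention connection between each variable at every timestep.
    #     In practice these tokens would also need time and variable embeddings.
    value_embed = nn.Linear(1, D)
    st_tokens = value_embed(series.reshape(B, T * N, 1))   # (B, T*N, D)
    st_out, _ = attn(st_tokens, st_tokens, st_tokens)

    print(temporal_out.shape, st_out.shape)  # (4, 12, 32) and (4, 60, 32)

The trade-off the sketch makes visible is sequence length: flattening yields T*N tokens instead of T, which is why joint spatiotemporal attention is usually paired with efficient long-sequence attention mechanisms.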

arxiv forecasting transformers
