Aug. 31, 2022, 1:10 a.m. | Espen Haugsdal, Erlend Aune, Massimiliano Ruocco

cs.LG updates on arXiv.org

Time series forecasting is an important problem with many real-world
applications. Ensembles of deep neural networks have recently achieved
impressive forecasting accuracy, but such large ensembles are impractical in
many real-world settings. Transformer models have been successfully applied to
a diverse set of challenging problems. We propose a novel adaptation of the
original Transformer architecture focused on the task of time series
forecasting, called Persistence Initialization. The model is initialized as a
naive persistence model by using a multiplicative …
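
The abstract is cut off after "multiplicative", so the exact mechanism is not spelled out here. Below is a minimal PyTorch sketch of one way a multiplicative factor can realize persistence initialization: a learnable scalar gate, initialized to zero, scales the wrapped network's contribution on a residual path, so the untrained model's forecast equals the last observed value. The class name `PersistenceInitialized`, the `horizon` parameter, and the scalar-gate formulation are illustrative assumptions, not necessarily the paper's exact design.

```python
import torch
import torch.nn as nn


class PersistenceInitialized(nn.Module):
    """Wrap a forecasting network so that, at initialization, it reduces
    to a naive persistence forecast (repeat the last observed value).

    Hypothetical sketch: a zero-initialized multiplicative gate silences
    the wrapped network at the start of training; as the gate grows, the
    network's contribution is blended into the persistence baseline.
    """

    def __init__(self, model: nn.Module, horizon: int):
        super().__init__()
        self.model = model          # maps (batch, length) -> (batch, horizon)
        self.horizon = horizon
        # Learnable scalar gate, zero at init -> pure persistence output.
        self.gate = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length) history window.
        # Persistence forecast: repeat the last observed value over the horizon.
        persistence = x[:, -1:].expand(-1, self.horizon)
        return persistence + self.gate * self.model(x)


# Example: gate a small MLP forecaster. Before any training step,
# the output is exactly the persistence forecast.
mlp = nn.Sequential(nn.Linear(24, 64), nn.ReLU(), nn.Linear(64, 12))
model = PersistenceInitialized(mlp, horizon=12)
history = torch.randn(8, 24)
assert torch.allclose(model(history), history[:, -1:].expand(-1, 12))
```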
