Feb. 19, 2024, 5:43 a.m. | Guillermo Iglesias, Edgar Talavera, Ángel González-Prieto, Alberto Mozo, Sandra Gómez-Canaval

cs.LG updates on arXiv.org

arXiv:2206.13508v4 Announce Type: replace
Abstract: With the latest advances in Deep Learning-based generative models, it has not taken long to exploit their remarkable performance in the time series domain. Deep neural networks that work with time series depend heavily on the size and consistency of the datasets used for training. Such data are not usually abundant in the real world, where they tend to be scarce and often subject to constraints that must be guaranteed. Therefore, an effective …
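By way of illustration (not taken from the paper), the sketch below shows two classic time-series augmentation transforms, jittering and magnitude scaling, of the kind such surveys typically cover alongside generative approaches; the function names, parameters, and noise levels are assumptions chosen for this example.

```python
import numpy as np

def jitter(series: np.ndarray, sigma: float = 0.03) -> np.ndarray:
    """Add i.i.d. Gaussian noise to each time step (classic 'jittering')."""
    return series + np.random.normal(loc=0.0, scale=sigma, size=series.shape)

def magnitude_scale(series: np.ndarray, sigma: float = 0.1) -> np.ndarray:
    """Multiply each channel of the series by a random factor close to 1."""
    factor = np.random.normal(loc=1.0, scale=sigma, size=(1, series.shape[-1]))
    return series * factor

# Example: augment a univariate series of length 100 (shape: time x channels)
x = np.sin(np.linspace(0, 4 * np.pi, 100)).reshape(100, 1)
x_aug = magnitude_scale(jitter(x))
```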
