March 6, 2024, 5:44 a.m. | Pierre Brugiere, Gabriel Turinici

stat.ML updates on arXiv.org

arXiv:2403.02523v1 Announce Type: cross
Abstract: Transformer models have been used extensively, with good results, in a wide range of machine learning applications, including Large Language Models and image generation. Here, we inquire into the applicability of this approach to financial time series. We first describe the dataset construction for two prototypical situations: a mean-reverting synthetic Ornstein-Uhlenbeck process on the one hand, and real S&P 500 data on the other. Then, we present in detail the proposed Transformer architecture and …
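The abstract mentions constructing a synthetic dataset from a mean-reverting Ornstein-Uhlenbeck process. As a rough illustration (not the authors' actual pipeline), the sketch below simulates an OU path with a standard Euler-Maruyama discretization and slices it into fixed-length windows; all parameter values and the windowing scheme are assumptions for illustration only.

```python
import numpy as np

def simulate_ou(theta=1.0, mu=0.0, sigma=0.2, x0=0.0, dt=1/252, n_steps=1000, seed=0):
    """Simulate a mean-reverting Ornstein-Uhlenbeck path via Euler-Maruyama:
    X_{t+dt} = X_t + theta * (mu - X_t) * dt + sigma * sqrt(dt) * N(0, 1).
    Parameter values here are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

# Slice the path into overlapping windows to form (context, target) pairs,
# one plausible way to feed a sequence model such as a Transformer.
path = simulate_ou()
window = 64
samples = np.stack([path[i:i + window + 1] for i in range(len(path) - window)])
contexts, targets = samples[:, :-1], samples[:, -1]
```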

