July 8, 2022, 1:11 a.m. | Tracy Qian, Andy Xie, Camille Bruckmann

cs.CL updates on arXiv.org

The explosion of novel NLP word-embedding and deep learning techniques has spurred significant effort toward potential applications. One of these directions is the financial sector. Although much work has been done on state-of-the-art models such as GPT and BERT, relatively few studies examine how well these methods perform when fine-tuned after pre-training, or how sensitive their parameters are. We investigate the performance and sensitivity of transferred neural architectures from pre-trained …
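The setup the abstract describes, fine-tuning on top of pre-trained representations and probing sensitivity to a hyperparameter, can be sketched in miniature. This is not the authors' code; it assumes a toy 1-D "embedding" standing in for frozen pre-trained features, a single logistic-regression head standing in for the fine-tuned classifier, and a learning-rate sweep as the sensitivity probe:

```python
# Library-free sketch: fine-tune only a small head on fixed "pre-trained"
# features, then sweep the learning rate to see how sensitive training is.
import math
import random

random.seed(0)

# Toy frozen embeddings: positive-sentiment examples cluster near +1,
# negative-sentiment examples near -1 (labels 1 and 0 respectively).
data = [(random.gauss(1.0, 0.5), 1) for _ in range(50)] + \
       [(random.gauss(-1.0, 0.5), 0) for _ in range(50)]

def fine_tune_head(lr, epochs=50):
    """SGD on a one-weight logistic head; the 'encoder' stays frozen."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            w += lr * (y - p) * x   # gradient ascent on the log-likelihood
            b += lr * (y - p)
    return w, b

def accuracy(w, b):
    """Fraction of examples the fine-tuned head classifies correctly."""
    correct = sum(
        ((w * x + b) > 0.0) == (y == 1) for x, y in data
    )
    return correct / len(data)

# Sensitivity sweep: same head, same data, different learning rates.
for lr in (0.001, 0.01, 0.1):
    w, b = fine_tune_head(lr)
    print(f"lr={lr}: accuracy={accuracy(w, b):.2f}")
```

A real version of this experiment would swap the toy embeddings for pooled BERT or GPT sentence representations and sweep more than one hyperparameter, but the shape of the study, frozen features, trainable head, sensitivity grid, is the same.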

analysis, arxiv, bert, financial, gpt, gpt-2, neural architectures, sentiment, sentiment analysis
