May 16, 2022, 1:11 a.m. | Julian von der Mosel, Alexander Trautsch, Steffen Herbold

cs.LG updates on arXiv.org arxiv.org

Transformers are the current state of the art in natural language processing
across many domains and are gaining traction within software engineering
research as well. Such models are pre-trained on large amounts of data, usually
from the general domain. However, we have only a limited understanding of the
validity of transformers within the software engineering domain, i.e., how well
such models understand words and sentences within a software engineering
context and how this improves the state of the art. Within this article, we
shed …

