Jan. 20, 2022, 2:10 a.m. | Veronika Arefieva, Roman Egger

cs.LG updates on arXiv.org

Bidirectional Encoder Representations from Transformers (BERT) is currently one
of the most important state-of-the-art models for natural language processing.
However, it has also been shown that for domain-specific tasks it is helpful to
pretrain BERT on a domain-specific corpus. In this paper, we present TourBERT,
a pretrained language model for tourism. We describe how TourBERT was developed
and evaluated. The evaluations show that TourBERT outperforms BERT on all
tourism-specific tasks.
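The paper itself describes how TourBERT was trained; purely as an illustration of the general technique the abstract names, the sketch below shows domain-adaptive masked-language-model pretraining of BERT with HuggingFace Transformers. The two-sentence corpus, the hyperparameters, and the `tourbert-sketch` output path are placeholders, not TourBERT's actual data or configuration.

```python
import torch
from transformers import (
    BertForMaskedLM,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Start from general-purpose BERT weights and continue pretraining
# on an (here: toy, hypothetical) in-domain tourism corpus.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

corpus = [
    "The hotel offers a stunning view of the old town and the harbour.",
    "Guided tours depart daily from the visitor centre near the cathedral.",
]
encodings = tokenizer(corpus, truncation=True, max_length=128)

class LineDataset(torch.utils.data.Dataset):
    """Wraps tokenizer output so each item is one encoded sentence."""
    def __init__(self, enc):
        self.enc = enc
    def __len__(self):
        return len(self.enc["input_ids"])
    def __getitem__(self, idx):
        return {k: v[idx] for k, v in self.enc.items()}

# The collator dynamically masks 15% of tokens for the MLM objective
# and pads each batch.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="tourbert-sketch",  # placeholder path
    num_train_epochs=1,
    per_device_train_batch_size=8,
)

trainer = Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=LineDataset(encodings),
)
trainer.train()
trainer.save_model("tourbert-sketch")
```

In practice, the same loop would run over a large in-domain corpus for many epochs; the resulting checkpoint can then be fine-tuned on downstream tourism-specific tasks like any other BERT model.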
