all AI news
VarMAE: Pre-training of Variational Masked Autoencoder for Domain-adaptive Language Understanding. (arXiv:2211.00430v1 [cs.CL])
cs.CL updates on arXiv.org
Pre-trained language models have achieved promising performance on general
benchmarks, but underperform when transferred to a specific domain. Recent works
perform pre-training from scratch or continual pre-training on domain corpora.
However, in many specific domains, the limited corpus can hardly support
learning precise representations. To address this issue, we propose a novel
Transformer-based language model named VarMAE for domain-adaptive language
understanding. Under the masked autoencoding objective, we design a context
uncertainty learning module to encode the token's context into a …
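The abstract describes masked autoencoding in which a token's context is encoded into a smooth latent distribution rather than a single point vector. Below is a minimal PyTorch sketch of that general variational masked-autoencoding recipe: an encoder produces per-token Gaussian parameters, a sample is drawn with the reparameterization trick, and a decoder reconstructs masked tokens under a KL-regularized objective. All names (ContextUncertaintyLayer, ToyVariationalMaskedAE), layer sizes, and the KL weight are illustrative assumptions, not the paper's actual VarMAE implementation.

```python
# Illustrative sketch only: a tiny variational masked autoencoder in the spirit
# of the abstract's description. Module and variable names are placeholders,
# not taken from the VarMAE paper or its released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContextUncertaintyLayer(nn.Module):
    """Map a contextual hidden state to a diagonal Gaussian latent (assumed design)."""

    def __init__(self, hidden_size: int, latent_size: int):
        super().__init__()
        self.mu = nn.Linear(hidden_size, latent_size)
        self.logvar = nn.Linear(hidden_size, latent_size)

    def forward(self, h: torch.Tensor):
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps keeps sampling differentiable.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        # KL divergence to a standard normal prior, averaged over tokens and dims.
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return z, kl


class ToyVariationalMaskedAE(nn.Module):
    """Tiny Transformer encoder + variational bottleneck + masked-token decoder."""

    def __init__(self, vocab_size=1000, hidden=64, latent=32, max_len=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.pos = nn.Embedding(max_len, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.uncertainty = ContextUncertaintyLayer(hidden, latent)
        self.decoder = nn.Linear(latent, vocab_size)  # reconstructs masked tokens from z

    def forward(self, input_ids, labels, kl_weight=0.1):
        positions = torch.arange(input_ids.size(1), device=input_ids.device)
        h = self.encoder(self.embed(input_ids) + self.pos(positions))
        z, kl = self.uncertainty(h)
        logits = self.decoder(z)
        # labels hold -100 at unmasked positions and the true token id at masked ones.
        mlm = F.cross_entropy(logits.view(-1, logits.size(-1)),
                              labels.view(-1), ignore_index=-100)
        return mlm + kl_weight * kl


# Usage on random data: mask ~15% of tokens and reconstruct them.
model = ToyVariationalMaskedAE()
ids = torch.randint(5, 1000, (2, 16))
labels = torch.full_like(ids, -100)
mask = torch.rand(ids.shape) < 0.15
mask[0, 0] = True                 # ensure at least one masked position
labels[mask] = ids[mask]
masked_ids = ids.clone()
masked_ids[mask] = 0              # pretend id 0 is the [MASK] token
loss = model(masked_ids, labels)
loss.backward()
print(float(loss))
```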