SDCL: Self-Distillation Contrastive Learning for Chinese Spell Checking. (arXiv:2210.17168v4 [cs.CL] UPDATED)
Nov. 8, 2022, 2:16 a.m. | Xiaotian Zhang, Hang Yan, Yu Sun, Xipeng Qiu
cs.CL updates on arXiv.org arxiv.org
Because Chinese homophones are easily confused, Chinese Spell Checking (CSC) has
widespread applications. Existing systems typically use BERT for text encoding.
However, CSC requires the model to account for both phonetic and graphemic
information. To adapt BERT to the CSC task, we propose a token-level
self-distillation contrastive learning method. We employ BERT to encode both
the corrupted sentence and the corresponding correct sentence. Then, we use a
contrastive learning loss to regularize the corrupted tokens' hidden states to
be closer to their counterparts in the …
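The token-level objective described above — pulling each corrupted token's hidden state toward its counterpart in the correct sentence while pushing it away from the other tokens — can be sketched as an InfoNCE-style loss. This is a minimal NumPy sketch under stated assumptions, not the authors' implementation; the function name, shapes, and temperature value are illustrative.

```python
import numpy as np

def token_contrastive_loss(h_corrupt, h_correct, temperature=0.1):
    """InfoNCE-style token-level contrastive loss (illustrative sketch).

    h_corrupt: (seq_len, hidden) hidden states of the corrupted sentence.
    h_correct: (seq_len, hidden) hidden states of the correct sentence.
    For token i, position i in h_correct is the positive; all other
    positions serve as in-sentence negatives.
    """
    # L2-normalize so the dot product is cosine similarity
    a = h_corrupt / np.linalg.norm(h_corrupt, axis=1, keepdims=True)
    b = h_correct / np.linalg.norm(h_correct, axis=1, keepdims=True)
    sim = (a @ b.T) / temperature              # (seq_len, seq_len) logits

    # log-softmax over each row, with max-subtraction for stability
    logits = sim - sim.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # negative log-likelihood of the diagonal (matched) positions
    return -np.mean(np.diag(log_probs))
```

When the corrupted token representations already match their correct counterparts, the diagonal similarities dominate and the loss is near zero; misaligned representations yield a larger loss, which is the gradient signal that regularizes BERT's encoding for CSC.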