Scheduled Multi-task Learning for Neural Chat Translation. (arXiv:2205.03766v2 [cs.CL] UPDATED)
May 11, 2022, 1:11 a.m. | Yunlong Liang, Fandong Meng, Jinan Xu, Yufeng Chen, Jie Zhou
cs.CL updates on arXiv.org
Neural Chat Translation (NCT) aims to translate conversational text into
different languages. Existing methods mainly focus on modeling the bilingual
dialogue characteristics (e.g., coherence) to improve chat translation via
multi-task learning on small-scale chat translation data. Although NCT
models have achieved impressive success, they remain far from satisfactory due
to insufficient chat translation data and overly simple joint training schemes. To
address the above issues, we propose a scheduled multi-task learning framework
for NCT. Specifically, we devise a three-stage …