CoCon: A Self-Supervised Approach for Controlled Text Generation. (arXiv:2006.03535v3 [cs.CL] UPDATED)
June 13, 2022, 1:12 a.m. | Alvin Chan, Yew-Soon Ong, Bill Pung, Aston Zhang, Jie Fu
cs.CL updates on arXiv.org (arxiv.org)
Pretrained Transformer-based language models (LMs) display remarkable natural language generation capabilities. Given their immense potential, controlling the text generation of such LMs is attracting growing attention. While several studies seek to control high-level attributes of the generated text (such as sentiment and topic), precise control over its content at the word and phrase level is still lacking. Here, we propose Content-Conditioner (CoCon) to control an LM's output text with a content input at a fine-grained level. In our …
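Although the abstract is truncated, the core mechanism it describes, steering a pretrained LM by letting its intermediate representations attend to a separate content input, can be sketched compactly. Below is a minimal, illustrative sketch in PyTorch, not the authors' implementation: the module name, dimensions, single-block setup, and omission of causal masking are all assumptions made for brevity.

import torch
import torch.nn as nn

class CoConBlock(nn.Module):
    # Hypothetical name: a single attention block inserted between the
    # lower and upper layers of a pretrained LM (e.g., GPT-2).
    def __init__(self, d_model=768, n_heads=12):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, h_text, h_content):
        # Extend the keys/values with the content representations so every
        # text position can attend to, and draw from, the content input.
        # (Causal masking among text positions is omitted for brevity.)
        kv = torch.cat([h_content, h_text], dim=1)
        attended, _ = self.attn(query=h_text, key=kv, value=kv)
        return self.norm(h_text + attended)  # residual + layer norm

# Toy usage with random tensors standing in for LM hidden states.
block = CoConBlock()
h_text = torch.randn(1, 10, 768)    # prompt representations (lower layers)
h_content = torch.randn(1, 4, 768)  # content-input representations
steered = block(h_text, h_content)  # would feed the LM's upper layers
print(steered.shape)                # torch.Size([1, 10, 768])

At generation time, the steered hidden states would replace the unmodified ones in the LM's upper layers, biasing the continuation toward the content tokens.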
Tags: arxiv, controlled text generation, text generation