Learning to Generate Text in Arbitrary Writing Styles
March 5, 2024, 2:53 p.m. | Aleem Khan, Andrew Wang, Sophia Hager, Nicholas Andrews
cs.CL updates on arXiv.org arxiv.org
Abstract: Prior work in style-controlled text generation has focused on tasks such as emulating the style of prolific literary authors, producing formal or informal text, and mitigating toxicity of generated text. Plentiful demonstrations of these styles are available, and as a result, modern language models are often able to emulate them, either via prompting or discriminative control. However, in applications such as writing assistants, it is desirable for language models to produce text in an author-specific …
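The abstract notes that when style demonstrations are plentiful, language models can emulate a style "via prompting." As a minimal illustration of that prompting approach (not the paper's method; `build_style_prompt` is a hypothetical helper, and the demonstration sentences are placeholder examples), a few-shot prompt can simply embed the target author's writing samples before the task instruction:

```python
# Minimal sketch of prompting-based style control. This is an
# illustration of the general few-shot prompting idea mentioned in
# the abstract, NOT the method proposed in the paper.
def build_style_prompt(demonstrations, task_text):
    """Assemble a few-shot prompt from style demonstrations (hypothetical helper)."""
    shots = "\n\n".join(f"Example passage:\n{d}" for d in demonstrations)
    return (
        f"Here are writing samples from a target author:\n\n{shots}\n\n"
        f"Rewrite the following text in the same style:\n{task_text}"
    )

# Placeholder demonstrations; in practice these would be real samples
# of the target author's writing.
prompt = build_style_prompt(
    ["It was a bright cold day in April.", "The clocks were striking thirteen."],
    "The meeting starts at noon.",
)
print(prompt)
```

The resulting string would be passed to a language model's completion API; the abstract's point is that this only works well when many such demonstrations exist, which motivates the author-specific setting the paper studies.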