March 5, 2024, 2:51 p.m. | Chiyu Zhang, Honglong Cai, Yuezhang Li, Yuexin Wu, Le Hou, Muhammad Abdul-Mageed

cs.CL updates on arXiv.org

arXiv:2403.01106v1 Announce Type: new
Abstract: Text Style Transfer (TST) seeks to alter the style of text while retaining its core content. Given the constraints of limited parallel datasets for TST, we propose CoTeX, a framework that leverages large language models (LLMs) alongside chain-of-thought (CoT) prompting to facilitate TST. CoTeX distills the complex rewriting and reasoning capabilities of LLMs into more streamlined models capable of working with both non-parallel and parallel data. Through experimentation across four TST datasets, CoTeX is shown …
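The abstract describes the general recipe of distilling chain-of-thought rewriting from a teacher LLM into a smaller student model. Below is a minimal sketch of how such distillation data might be constructed: the teacher is prompted to reason about which spans to change before producing the rewrite, and its full output becomes the student's training target. `llm_generate` and the prompt wording are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch (not CoTeX's code) of CoT-style distillation data construction
# for text style transfer, assuming a generic teacher-LLM completion call.

def llm_generate(prompt: str) -> str:
    """Hypothetical teacher-LLM call; swap in any real completion API."""
    raise NotImplementedError

def build_cot_prompt(source: str, target_style: str) -> str:
    # Ask the teacher to explain, step by step, what must change and why,
    # then emit the rewritten text after a fixed tag.
    return (
        f"Rewrite the text in a {target_style} style.\n"
        f"Text: {source}\n"
        "First explain, step by step, which parts must change and why. "
        "Then give the rewritten text after the tag 'Rewrite:'."
    )

def make_distillation_example(source: str, target_style: str) -> dict:
    # The teacher's reasoning plus final rewrite supervises a smaller
    # student model, so it learns the rationale and the transfer jointly.
    completion = llm_generate(build_cot_prompt(source, target_style))
    return {
        "input": f"Transfer to {target_style}: {source}",
        "target": completion,
    }
```

Because the teacher only needs source sentences and a target style label, the same construction works for non-parallel data; parallel pairs, when available, can additionally anchor the final rewrite.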

