all AI news
Towards Robust and Semantically Organised Latent Representations for Unsupervised Text Style Transfer. (arXiv:2205.02309v1 [cs.CL])
May 6, 2022, 1:10 a.m. | Sharan Narasimhan, Suvodip Dey, Maunendra Sankar Desarkar
cs.CL updates on arXiv.org arxiv.org
Recent studies show that auto-encoder-based approaches successfully perform
language generation, smooth sentence interpolation, and style transfer over
unseen attributes using unlabelled datasets in a zero-shot manner. The latent
space geometry of such models is organised well enough to perform on datasets
where the style is "coarse-grained", i.e., a small fraction of the words in a
sentence is enough to determine the overall style label. A recent study uses a
discrete token-based perturbation approach to map "similar" sentences
("similar" defined …