Web: http://arxiv.org/abs/2109.13582

May 5, 2022, 1:11 a.m. | Antoine Chaffin, Vincent Claveau, Ewa Kijak

cs.CL updates on arXiv.org

Large language models (LMs) based on Transformers make it possible to generate plausible
long texts. In this paper, we explore how this generation can be further
controlled at decoding time to satisfy certain constraints (e.g., being
non-toxic, conveying certain emotions, using a specific writing style, etc.)
without fine-tuning the LM. More precisely, we formalize constrained generation as a
tree exploration process guided by a discriminator that indicates how well the
associated sequence respects the constraint. This approach, in addition to
being easier and …
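The core idea in the abstract, discriminator-guided tree exploration at decoding time, can be sketched in miniature. The snippet below is an illustrative toy, not the paper's actual method (which explores the tree with Monte Carlo Tree Search over a Transformer LM): a hand-written bigram table stands in for the LM, a keyword check stands in for the trained discriminator, and a simple best-first expansion keeps the partial sequences whose combined LM probability and constraint score is highest.

```python
import math

# Toy next-token distribution standing in for an LM (assumption: the paper
# uses a Transformer LM; this bigram table is purely illustrative).
LM = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"calm": 0.5, "angry": 0.5},
    "a": {"calm": 0.4, "angry": 0.6},
    "calm": {"reply": 1.0},
    "angry": {"reply": 1.0},
    "reply": {"</s>": 1.0},
}


def discriminator(tokens):
    """Toy constraint score: prefer 'non-toxic' sequences (here, ones
    containing 'calm'). The real discriminator is a trained classifier."""
    return 0.9 if "calm" in tokens else 0.1


def constrained_search(start="<s>", end="</s>", width=2):
    """Expand a tree of partial sequences, keeping the `width` best ones
    ranked by LM log-probability plus the discriminator's log-score."""
    beams = [([start], 0.0)]  # (token sequence, cumulative LM log-prob)
    while any(seq[-1] != end for seq, _ in beams):
        candidates = []
        for seq, logp in beams:
            if seq[-1] == end:  # finished sequences carry over unchanged
                candidates.append((seq, logp))
                continue
            for tok, p in LM[seq[-1]].items():
                candidates.append((seq + [tok], logp + math.log(p)))
        # Rank each candidate by fluency (LM) plus constraint satisfaction.
        beams = sorted(
            candidates,
            key=lambda c: c[1] + math.log(discriminator(c[0])),
            reverse=True,
        )[:width]
    return beams[0][0]


print(constrained_search())  # prints a sequence steered toward 'calm'
```

Without the discriminator term, the LM alone would happily pick "angry" after "a" (probability 0.6 vs. 0.4); the constraint score overrides that local preference, which is exactly the kind of decoding-time control the abstract describes.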
