Control Prefixes for Parameter-Efficient Text Generation. (arXiv:2110.08329v2 [cs.CL] UPDATED)
Web: http://arxiv.org/abs/2110.08329
May 11, 2022, 1:11 a.m. | Jordan Clive, Kris Cao, Marek Rei
cs.CL updates on arXiv.org
Prefix-tuning is a powerful lightweight technique for adapting a large
pre-trained language model to a downstream application. However, it learns a
single dataset-level prompt that is shared across all examples. We extend this
idea and propose a dynamic method, Control Prefixes, which allows the
inclusion of conditional, input-dependent information, combining the benefits
of prompt tuning and controlled generation. The method incorporates
attribute-level learnable representations into different layers of a
pre-trained transformer, allowing the generated text to be …
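The abstract only sketches the mechanism, so below is a minimal, hypothetical PyTorch sketch of the core idea: a shared dataset-level prefix plus per-attribute prefixes, injected as extra key/value states at every transformer layer. All names, shapes, and hyperparameters here (ControlPrefixes, task_len, attr_len, the past-key/value layout) are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class ControlPrefixes(nn.Module):
    """Learnable prefix key/value pairs: one shared task-level prefix
    plus one prefix per attribute label, injected at every layer of a
    frozen pre-trained transformer."""

    def __init__(self, n_layers, n_heads, head_dim,
                 n_attributes, task_len=10, attr_len=3):
        super().__init__()

        # Shape: (layers, 2 [key/value], heads, prefix_len, head_dim)
        def prefix(length):
            return nn.Parameter(
                torch.randn(n_layers, 2, n_heads, length, head_dim) * 0.02)

        self.task_prefix = prefix(task_len)  # shared, dataset-level prompt
        self.attr_prefix = nn.ParameterList(
            [prefix(attr_len) for _ in range(n_attributes)])  # attribute-level

    def forward(self, attr_ids):
        """attr_ids: (batch,) integer attribute label for each example.
        Returns per-layer (key, value) tensors with each example's
        attribute prefix concatenated after the shared task prefix."""
        batch = attr_ids.shape[0]
        past = []
        for layer in range(self.task_prefix.shape[0]):
            task_kv = (self.task_prefix[layer]
                       .unsqueeze(0).expand(batch, -1, -1, -1, -1))
            attr_kv = torch.stack(
                [self.attr_prefix[i][layer] for i in attr_ids.tolist()])
            kv = torch.cat([task_kv, attr_kv], dim=3)  # along prefix length
            past.append((kv[:, 0], kv[:, 1]))          # (key, value) per layer
        return past
```

With the base model frozen, these per-layer (key, value) pairs could be fed to the transformer through a past-key/values-style interface so that self-attention attends to the prefix states; only the prefix parameters would then receive gradients, which is what makes the approach parameter-efficient.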