April 2, 2024, 7:52 p.m. | Alisa Liu, Xiaochuang Han, Yizhong Wang, Yulia Tsvetkov, Yejin Choi, Noah A. Smith

cs.CL updates on arXiv.org

arXiv:2401.08565v2 Announce Type: replace
Abstract: Despite the general capabilities of large pretrained language models, they consistently benefit from further adaptation to better achieve desired behaviors. However, tuning these models has become increasingly resource-intensive, or impossible when model weights are private. We introduce proxy-tuning, a lightweight decoding-time algorithm that operates on top of black-box LMs to achieve the same end as direct tuning, but by accessing only their predictions over the output vocabulary, not their parameters. Our method tunes a smaller …
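To make the decoding-time idea concrete: proxy-tuning steers a large base model using only its next-token predictions. Below is a minimal sketch of this logit arithmetic, assuming the standard formulation in which the base model's logits are shifted by the difference between a small tuned "expert" and its untuned "anti-expert" over a shared vocabulary; the variable names and toy logits are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn.functional as F

def proxy_tuned_logits(base: torch.Tensor,
                       expert: torch.Tensor,
                       antiexpert: torch.Tensor) -> torch.Tensor:
    """Shift the large base model's next-token logits by the difference
    between a small tuned "expert" and its untuned "anti-expert".
    All three models are assumed to share the same output vocabulary."""
    return base + (expert - antiexpert)

# Toy single-step demonstration with random logits over a tiny vocabulary.
# In practice, each tensor would come from a forward pass of the
# corresponding model on the same prefix.
torch.manual_seed(0)
vocab_size = 8
base = torch.randn(vocab_size)        # large black-box model
expert = torch.randn(vocab_size)      # small tuned model
antiexpert = torch.randn(vocab_size)  # small untuned counterpart

probs = F.softmax(proxy_tuned_logits(base, expert, antiexpert), dim=-1)
next_token = int(torch.argmax(probs))  # greedy choice for this step
print(next_token, probs[next_token].item())
```

Because this only requires the base model's output distribution at each step, it applies even when the base model's weights are private.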

