Web: http://arxiv.org/abs/2209.11055

Sept. 23, 2022, 1:16 a.m. | Lewis Tunstall, Nils Reimers, Unso Eun Seo Jo, Luke Bates, Daniel Korat, Moshe Wasserblat, Oren Pereg

cs.CL updates on arXiv.org

Recent few-shot methods, such as parameter-efficient fine-tuning (PEFT) and
pattern exploiting training (PET), have achieved impressive results in
label-scarce settings. However, they are difficult to employ since they are
subject to high variability from manually crafted prompts, and typically
require billion-parameter language models to achieve high accuracy. To address
these shortcomings, we propose SetFit (Sentence Transformer Fine-tuning), an
efficient and prompt-free framework for few-shot fine-tuning of Sentence
Transformers (ST). SetFit works by first fine-tuning a pretrained ST on a small …

Tags: arxiv, few-shot learning
