Feb. 15, 2024, 5:46 a.m. | Man Luo, Xin Xu, Yue Liu, Panupong Pasupat, Mehran Kazemi

cs.CL updates on arXiv.org

arXiv:2401.11624v3 Announce Type: replace
Abstract: Language models, especially pre-trained large language models, have showcased remarkable abilities as few-shot in-context learners (ICL), adept at adapting to new tasks with just a few demonstrations in the input context. However, the model's ability to perform ICL is sensitive to the choice of the few-shot demonstrations. Instead of using a fixed set of demonstrations, one recent development is to retrieve demonstrations tailored to each input query. The implementation of demonstration retrieval is relatively straightforward, …
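
As a rough illustration of the query-specific demonstration retrieval the abstract describes, here is a minimal sketch (not taken from the paper): for each test input it retrieves the most similar examples from a demonstration pool via TF-IDF cosine similarity and prepends them to the prompt. The demonstration pool, task, and helper names are hypothetical; real systems typically use learned dense retrievers rather than TF-IDF.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical demonstration pool: (input, label) pairs for a sentiment task.
demo_pool = [
    ("The plot was gripping from start to finish.", "positive"),
    ("I walked out halfway through the film.", "negative"),
    ("A forgettable script saved by strong acting.", "mixed"),
    ("Easily the best concert I have attended.", "positive"),
]

def retrieve_demonstrations(query: str, k: int = 2) -> list[tuple[str, str]]:
    """Return the k pool examples most similar to the query (TF-IDF cosine)."""
    texts = [x for x, _ in demo_pool]
    vectorizer = TfidfVectorizer()
    pool_vecs = vectorizer.fit_transform(texts)
    query_vec = vectorizer.transform([query])
    sims = cosine_similarity(query_vec, pool_vecs).ravel()
    top = np.argsort(sims)[::-1][:k]
    return [demo_pool[i] for i in top]

def build_prompt(query: str, k: int = 2) -> str:
    """Prepend the retrieved demonstrations to the query as an ICL prompt."""
    lines = [f"Review: {x}\nSentiment: {y}" for x, y in retrieve_demonstrations(query, k)]
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

if __name__ == "__main__":
    print(build_prompt("The acting felt wooden and the pacing dragged."))
```

The point of the sketch is the structure, not the retriever: swapping the fixed few-shot set for demonstrations selected per query is what the surveyed line of work studies.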

Tags: cs.AI, cs.CL, cs.IR, few-shot, in-context learning, language models, large language models, survey
