Is attention required for ICL? Exploring the Relationship Between Model Architecture and In-Context Learning Ability
April 3, 2024, 4:43 a.m. | Ivan Lee, Nan Jiang, Taylor Berg-Kirkpatrick
cs.LG updates on arXiv.org arxiv.org
Abstract: What is the relationship between model architecture and the ability to perform in-context learning? In this empirical study, we take the first steps toward answering this question. We evaluate thirteen model architectures capable of causal language modeling across a suite of synthetic in-context learning tasks. The selected architectures represent a broad range of paradigms, including recurrent and convolution-based neural networks, transformers, state-space-model-inspired architectures, and other emerging attention alternatives. We discover that all the …
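The abstract mentions "synthetic in-context learning tasks" without detailing them. As a rough illustration only (the function name, episode design, and padding scheme below are assumptions, not the paper's actual setup), here is a minimal sketch of one common synthetic ICL probe: in-context linear regression, where each episode shares a hidden weight vector that the model must infer from the context pairs.

```python
import numpy as np

def make_icl_regression_episode(n_examples=8, dim=4, seed=0):
    """Build one synthetic in-context linear-regression episode:
    a sequence of (x, y) pairs sharing a hidden weight vector w.
    A model reading the sequence must infer w from context to
    predict later targets. This is a generic sketch, not the
    paper's exact task construction."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=dim)                  # hidden task parameter
    xs = rng.normal(size=(n_examples, dim))   # context inputs
    ys = xs @ w                               # context targets
    # Interleave as one token sequence: x1, y1, x2, y2, ...
    # Scalar targets are zero-padded to the input dimension.
    seq = []
    for x, y in zip(xs, ys):
        seq.append(x)
        seq.append(np.concatenate([[y], np.zeros(dim - 1)]))
    return np.stack(seq), w
```

Training then minimizes the prediction error on each `y` position, so performance measures how well an architecture can do the implicit regression from its context alone.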