June 6, 2022, 1:11 a.m. | Juliette Millet, Charlotte Caucheteux, Pierre Orhan, Yves Boubenec, Alexandre Gramfort, Ewan Dunbar, Christophe Pallier, Jean-Remi King

cs.CL updates on arXiv.org

Several deep neural networks have recently been shown to generate activations
similar to those of the brain in response to the same input. These algorithms,
however, remain largely implausible: they require (1) extraordinarily large
amounts of data, (2) unobtainable supervised labels, (3) textual rather than
raw sensory input, and/or (4) implausibly large memory (e.g. thousands of
contextual words). These elements highlight the need to identify algorithms
that, under these limitations, would suffice to account for both behavioral and …
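The abstract's central premise is that a network's activations can be compared with brain responses elicited by the same stimuli. As an illustration only, the sketch below shows one standard way such a comparison is often made: a cross-validated linear encoding model that maps a layer's activations onto brain recordings and is scored by held-out correlation. The function name, array shapes, and the choice of ridge regression are assumptions for illustration, not the authors' actual method.

```python
# Minimal sketch (not the paper's code): quantify how "similar to the brain"
# a network layer's activations are via a cross-validated encoding analysis.
# Shapes and variable names are illustrative assumptions.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import KFold

def brain_similarity(model_activations, brain_responses, n_splits=5):
    """Cross-validated correlation between predicted and actual brain responses.

    model_activations : (n_samples, n_features) activations of one network layer
    brain_responses   : (n_samples, n_voxels) recordings for the same stimuli
    """
    scores = []
    cv = KFold(n_splits=n_splits, shuffle=True, random_state=0)
    for train_idx, test_idx in cv.split(model_activations):
        # Fit a linear map from activations to brain responses on the train split
        reg = RidgeCV(alphas=np.logspace(-3, 3, 7))
        reg.fit(model_activations[train_idx], brain_responses[train_idx])
        pred = reg.predict(model_activations[test_idx])
        true = brain_responses[test_idx]
        # Pearson correlation per voxel on the held-out split, averaged over voxels
        r = [np.corrcoef(pred[:, v], true[:, v])[0, 1] for v in range(true.shape[1])]
        scores.append(np.nanmean(r))
    return float(np.mean(scores))
```

In practice such a score would typically be computed per layer and per brain region; the regression choice and data shapes here are placeholders for whatever the study actually uses.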

Tags: arxiv, bio, brain, learning, processing, self-supervised learning, speech, speech processing, supervised learning
