March 23, 2024, 10:35 a.m. | /u/SunsetOneSix

Machine Learning www.reddit.com

**Paper**: [https://arxiv.org/abs/2402.02622](https://arxiv.org/abs/2402.02622)

**Code**: [https://github.com/epfml/DenseFormer](https://github.com/epfml/DenseFormer)

**Abstract**:

>The transformer architecture by Vaswani et al. (2017) is now ubiquitous across application domains, from natural language processing to speech processing and image understanding. We propose **DenseFormer**, a simple modification to the standard architecture that improves the perplexity of the model without increasing its size -- adding a few thousand parameters for large-scale models in the 100B parameters range. Our approach relies on an additional averaging step after each transformer block, which computes a weighted …
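For readers skimming the abstract, here is a minimal sketch (not the authors' code) of the kind of averaging step it describes: after each transformer block, the representation is replaced by a learned weighted average of the current block output and all earlier representations, which adds only a handful of scalar weights per block. Module and parameter names (`DepthWeightedAverage`, `alpha`) are illustrative assumptions; see the linked repository for the actual implementation.

```python
# Hedged sketch of a depth-weighted averaging step after each transformer block.
# Assumes each block maps (batch, seq, d_model) -> (batch, seq, d_model).
import torch
import torch.nn as nn

class DepthWeightedAverage(nn.Module):
    """Learned weighted average over all representations seen so far."""
    def __init__(self, depth_index: int):
        super().__init__()
        # One scalar weight per past representation (embeddings + blocks 1..depth_index).
        # Initialized so the module starts as the identity on the newest output.
        init = torch.zeros(depth_index + 1)
        init[-1] = 1.0
        self.alpha = nn.Parameter(init)

    def forward(self, history: list[torch.Tensor]) -> torch.Tensor:
        # history: [x_0 (embeddings), x_1, ..., x_i], each of shape (B, T, D)
        stacked = torch.stack(history, dim=0)   # (i+1, B, T, D)
        w = self.alpha.view(-1, 1, 1, 1)        # broadcast weights over B, T, D
        return (w * stacked).sum(dim=0)         # (B, T, D)

class DenseFormerStack(nn.Module):
    """Standard block stack with an extra averaging step after each block."""
    def __init__(self, blocks: nn.ModuleList):
        super().__init__()
        self.blocks = blocks
        self.dwa = nn.ModuleList(
            DepthWeightedAverage(i + 1) for i in range(len(blocks))
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        history = [x]
        for block, avg in zip(self.blocks, self.dwa):
            history.append(block(x))
            x = avg(history)  # only a few extra scalar parameters per block
        return x
```

The extra cost is one scalar per (block, earlier representation) pair, which is why the parameter overhead stays in the thousands even for very large models.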

