[R] DenseFormer: Enhancing Information Flow in Transformers via Depth Weighted Averaging
March 23, 2024, 10:35 a.m. | /u/SunsetOneSix
Machine Learning www.reddit.com
**Code**: [https://github.com/epfml/DenseFormer](https://github.com/epfml/DenseFormer)
**Abstract**:
>The transformer architecture by Vaswani et al. (2017) is now ubiquitous across application domains, from natural language processing to speech processing and image understanding. We propose **DenseFormer**, a simple modification to the standard architecture that improves the perplexity of the model without increasing its size -- adding a few thousand parameters for large-scale models in the 100B parameters range. Our approach relies on an additional averaging step after each transformer block, which computes a weighted …
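The averaging step the abstract describes can be sketched as follows. This is a minimal illustration based only on the abstract's description (a learned weighted average over the outputs of all preceding blocks); the function name, the weight initialization, and all other details are assumptions for this sketch, not taken from the paper or its repository.

```python
import numpy as np

def depth_weighted_average(history, weights):
    """Combine the representations produced so far into one tensor.

    history : list of arrays [x_0, ..., x_i], each of shape (batch, seq, dim),
              where x_0 is the embedded input and x_j is block j's output.
    weights : array of shape (i + 1,), the learned per-depth coefficients
              (in training these would be parameters; here they are passed in).
    """
    stacked = np.stack(history)            # (i + 1, batch, seq, dim)
    # Contract the depth axis against the weight vector.
    return np.tensordot(weights, stacked, axes=1)

# Hypothetical usage: after block i, feed the averaged tensor (rather than
# the raw block output) to block i + 1.
```

Because only a handful of scalar weights are added per block, the parameter overhead stays tiny relative to model size, which matches the abstract's claim of "a few thousand parameters" even at the 100B scale.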