[R] DenseFormer: Enhancing Information Flow in Transformers via Depth Weighted Averaging
March 23, 2024, 10:35 a.m. | /u/SunsetOneSix
Machine Learning | www.reddit.com
**Code**: [https://github.com/epfml/DenseFormer](https://github.com/epfml/DenseFormer)
**Abstract**:
>The transformer architecture by Vaswani et al. (2017) is now ubiquitous across application domains, from natural language processing to speech processing and image understanding. We propose **DenseFormer**, a simple modification to the standard architecture that improves the perplexity of the model without increasing its size -- adding a few thousand parameters for large-scale models in the 100B parameters range. Our approach relies on an additional averaging step after each transformer block, which computes a weighted …