Reusing Softmax Hardware Unit for GELU Computation in Transformers
Feb. 16, 2024, 5:43 a.m. | Christodoulos Peltekis, Kosmas Alexandridi, Giorgos Dimitrakopoulos
cs.LG updates on arXiv.org arxiv.org
Abstract: Transformers have drastically improved the performance of natural language processing (NLP) and computer vision applications. The computation of transformers involves matrix multiplications and non-linear activation functions such as softmax and GELU (Gaussian Error Linear Unit) that are accelerated directly in hardware. Currently, function evaluation is done separately for each function and rarely allows for hardware reuse. To mitigate this problem, in this work, we map the computation of GELU to a softmax operator. In this …
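The truncated abstract does not spell out the paper's exact mapping, but one well-known mathematical link makes such hardware reuse plausible: GELU is commonly approximated as x · sigmoid(1.702x), and a sigmoid is just a two-way softmax, sigmoid(z) = softmax([z, 0])[0]. The sketch below (an illustration of that relation, not the authors' circuit) evaluates GELU using only a softmax routine; the constant 1.702 is the standard coefficient of the sigmoid approximation.

```python
import numpy as np

def softmax(v):
    # Numerically stable softmax over the last axis,
    # standing in for the hardware softmax unit.
    e = np.exp(v - np.max(v, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def gelu_via_softmax(x, alpha=1.702):
    # sigmoid(z) == softmax([z, 0])[0], so the sigmoid term of the
    # GELU approximation GELU(x) ~ x * sigmoid(alpha * x) can be
    # computed by the same softmax operator.
    z = np.stack([alpha * x, np.zeros_like(x)], axis=-1)
    return x * softmax(z)[..., 0]
```

For reference, the exact GELU is 0.5·x·(1 + erf(x/√2)); over typical activation ranges the softmax-based approximation above stays within a few hundredths of that value.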