Length Extrapolation of Transformers: A Survey from the Perspective of Position Encoding
April 2, 2024, 7:52 p.m. | Liang Zhao, Xiaocheng Feng, Xiachong Feng, Bing Qin, Ting Liu
cs.CL updates on arXiv.org arxiv.org
Abstract: The Transformer has taken the field of natural language processing (NLP) by storm since its introduction. Moreover, large language models (LLMs) built upon it have captured worldwide attention due to their superior abilities. Nevertheless, all Transformer-based models, including these powerful LLMs, suffer from a preset length limit and can hardly generalize from short training sequences to longer inference ones; that is, they cannot perform length extrapolation. Hence, a plethora of methods have been proposed to enhance …
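For background on the position encodings this survey examines: the original Transformer used a fixed sinusoidal encoding, one of the baseline designs against which length-extrapolation methods are typically compared. A minimal sketch (NumPy, illustrative only; not taken from the paper):

```python
import numpy as np

def sinusoidal_position_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Fixed sinusoidal position encoding ("Attention Is All You Need").

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(seq_len)[:, None]           # shape (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]          # shape (1, d_model // 2)
    angles = positions / np.power(10000.0, dims / d_model)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions: sine
    pe[:, 1::2] = np.cos(angles)  # odd dimensions: cosine
    return pe

pe = sinusoidal_position_encoding(seq_len=8, d_model=16)
print(pe.shape)  # (8, 16)
```

Because the encoding is a fixed function of position, it can be evaluated at positions beyond those seen in training; whether a model actually extrapolates to them in practice is one of the questions the survey addresses.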