Transformers are Provably Optimal In-context Estimators for Wireless Communications
June 18, 2024, 4:50 a.m. | Vishnu Teja Kunde, Vicram Rajagopalan, Chandra Shekhara Kaushik Valmeekam, Krishna Narayanan, Srinivas Shakkottai, Dileep Kalathil, Jean-Francois Chamberland
cs.LG updates on arXiv.org
Abstract: Pre-trained transformers exhibit the capability of adapting to new tasks through in-context learning (ICL), where they efficiently utilize a limited set of prompts without explicit model optimization.
The canonical communication problem of estimating transmitted symbols from received observations can be modelled as an in-context learning problem: Received observations are essentially a noisy function of transmitted symbols, and this function can be represented by an unknown parameter whose statistics depend on an (also unknown) latent context. …
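The setup the abstract describes maps naturally onto a small numerical experiment. The sketch below is illustrative only, not the authors' code: it assumes a real scalar flat-fading model y = h·x + n with BPSK symbols and Gaussian noise, where a latent context selects the prior mean of the channel h (all parameter values are made up for the demo). The "prompt" is a handful of pilot pairs (x_i, y_i) plus a query observation y_q, and the optimal in-context estimator marginalizes over both the unknown channel and the latent context, which is the kind of benchmark the paper argues pretrained transformers approach.

```python
# Toy sketch (not the paper's code): symbol estimation as in-context estimation.
# Model (assumed): y = h * x + n, BPSK symbols x in {-1, +1}, n ~ N(0, SIGMA2),
# channel h ~ N(mu_c, TAU2) where the latent context c picks the prior mean mu_c.
import numpy as np

rng = np.random.default_rng(0)

SIGMA2 = 0.1             # noise variance (assumed for illustration)
CTX_MEANS = [0.8, -0.8]  # latent contexts: candidate prior means of h
TAU2 = 0.05              # prior variance of h within each context

def sample_prompt(n_pilots):
    """Draw a latent context, a channel realization, and n_pilots pilot pairs."""
    ctx = rng.integers(len(CTX_MEANS))
    h = CTX_MEANS[ctx] + np.sqrt(TAU2) * rng.standard_normal()
    x = rng.choice([-1.0, 1.0], size=n_pilots)
    y = h * x + np.sqrt(SIGMA2) * rng.standard_normal(n_pilots)
    return h, x, y

def log_evidence(x, y, mu):
    """log p(y | pilots x, context mean mu): Gaussian with rank-one covariance
    SIGMA2*I + TAU2*x x^T, handled via the matrix inversion/determinant lemmas."""
    n, s = len(x), x @ x
    denom = 1.0 + TAU2 * s / SIGMA2
    r = y - mu * x
    quad = (r @ r) / SIGMA2 - (TAU2 / SIGMA2**2) * (x @ r) ** 2 / denom
    logdet = n * np.log(SIGMA2) + np.log(denom)
    return -0.5 * (quad + logdet + n * np.log(2 * np.pi))

def in_context_mmse(x_p, y_p, y_q):
    """Posterior-mean estimate of the query symbol x_q in {-1, +1},
    marginalizing over the latent context and the channel."""
    post = np.zeros(2)  # unnormalized p(x_q = -1 / +1 | prompt)
    for mu in CTX_MEANS:
        # Conjugate Gaussian posterior over h given the pilots and this context.
        prec = 1.0 / TAU2 + (x_p @ x_p) / SIGMA2
        m = (mu / TAU2 + (x_p @ y_p) / SIGMA2) / prec
        v = 1.0 / prec
        w = np.exp(log_evidence(x_p, y_p, mu))  # context weight (uniform prior)
        for i, xq in enumerate((-1.0, 1.0)):
            var = SIGMA2 + v * xq**2            # predictive variance of y_q
            lik = np.exp(-0.5 * (y_q - m * xq) ** 2 / var) / np.sqrt(2 * np.pi * var)
            post[i] += w * lik
    post /= post.sum()
    return post[1] - post[0]  # E[x_q | prompt]

# Demo: the symbol error rate falls as the prompt grows, mirroring how a
# pretrained transformer adapts from more in-context examples.
for n_pilots in (1, 2, 4, 8):
    errs, trials = 0, 2000
    for _ in range(trials):
        h, x_p, y_p = sample_prompt(n_pilots)
        x_q = rng.choice([-1.0, 1.0])
        y_q = h * x_q + np.sqrt(SIGMA2) * rng.standard_normal()
        errs += np.sign(in_context_mmse(x_p, y_p, y_q)) != x_q
    print(f"pilots={n_pilots}: symbol error rate {errs / trials:.3f}")
```

Running the demo shows the error rate of the closed-form in-context estimator dropping with prompt length, with no parameter updates anywhere, which is exactly the flavor of adaptation the abstract attributes to ICL.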