RWKV: Reinventing RNNs for the Transformer Era (Paper Explained)
June 2, 2023, 10:38 p.m. | Yannic Kilcher
We take a look at RWKV, a highly scalable architecture that sits between Transformers and RNNs: it can be trained in parallel like a Transformer while running inference as a recurrent network.
Fully Connected (June 7th in SF) Promo Link: https://www.fullyconnected.com/?promo=ynnc
OUTLINE:
0:00 - Introduction
1:50 - Fully Connected In-Person Conference in SF June 7th
3:00 - Transformers vs RNNs
8:00 - RWKV: Best of both worlds
12:30 - LSTMs
17:15 - Evolution of RWKV's Linear Attention
30:40 - RWKV's Layer Structure
49:15 - Time-Parallel vs Sequence Mode
53:55 - Experimental Results & Limitations
58:00 …
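The outline's sections on RWKV's linear attention and its time-parallel vs. sequence modes refer to the paper's WKV recurrence, in which attention-like weighted averages are computed with a per-channel exponential decay `w` and a current-token bonus `u` instead of pairwise query-key scores. A minimal NumPy sketch of the sequence (recurrent) mode, with illustrative variable names and none of the numerical-stability tricks of the real implementation:

```python
import numpy as np

def wkv_recurrent(k, v, w, u):
    """Sketch of RWKV's WKV recurrence (sequence mode).

    k, v : (T, C) arrays of keys and values per token and channel.
    w, u : (C,) learned per-channel decay and current-token bonus.
    Maintains a running numerator `a` (decayed weighted sum of values)
    and denominator `b` (decayed sum of weights), so each step is O(C)
    regardless of sequence length.
    """
    T, C = k.shape
    a = np.zeros(C)
    b = np.zeros(C)
    out = np.zeros((T, C))
    for t in range(T):
        e = np.exp(u + k[t])               # extra weight for the current token
        out[t] = (a + e * v[t]) / (b + e)  # weighted average over the past + now
        # decay the history by exp(-w), then absorb token t into the state
        a = np.exp(-w) * a + np.exp(k[t]) * v[t]
        b = np.exp(-w) * b + np.exp(k[t])
    return out
```

Unrolling the recurrence reproduces the paper's closed form, where token `i` at step `t` carries weight `exp(-(t-1-i)·w + k_i)`; during training the same quantity can be computed for all `t` in parallel, which is the "time-parallel mode" discussed in the video.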