[R] RWKV-2 430M release (a parallelizable RNN with transformer-level LM performance, and without using attention)
Web: https://www.reddit.com/r/MachineLearning/comments/veem7o/r_rwkv2_430m_release_a_parallelizable_rnn_with/
June 17, 2022, 2:35 p.m. | /u/bo_peng
Previous post (RWKV-v2-RNN intro): [https://www.reddit.com/r/MachineLearning/comments/umq908/r_rwkvv2rnn_a_parallelizable_rnn_with/](https://www.reddit.com/r/MachineLearning/comments/umq908/r_rwkvv2rnn_a_parallelizable_rnn_with/)
I have finished training an RWKV-2 430M (L24-D1024) on the Pile. This confirms that a pure RNN without attention can reach transformer-level LM performance:
[Evaluation results (image)](https://preview.redd.it/6756ax5wz6691.png?width=992&format=png&auto=webp&s=70d5b52fb43fca1a7d304832f6cbd082bfe3f9c5)
The maths behind RWKV-2:
[The RWKV-2 formula (image)](https://preview.redd.it/17eniof007691.png?width=662&format=png&auto=webp&s=f37ed4dd14409269952b421d18a315b8cd343e21)
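Roughly, the time-mixing in the formula replaces attention with a per-channel, exponentially decayed weighted average of past values, gated by a sigmoid "receptance", which can be evaluated step by step as an RNN. The sketch below is a simplified NumPy illustration of that recurrence, not the repo's actual code; the decay `w`, the current-token bonus `u`, and the omitted clamping/projection details are all simplifications.

```python
import numpy as np

def time_mix_step(k_t, v_t, r_t, state, w, u):
    """One recurrent step of an RWKV-2-style time-mix (simplified sketch).

    k_t, v_t : per-channel key / value vectors for the current token
    r_t      : per-channel receptance (pre-sigmoid)
    state    : (a, b) running decayed sums of exp(k)*v and exp(k)
    w        : per-channel decay in (0, 1)
    u        : extra weight given to the current token
    """
    a, b = state
    ek = np.exp(k_t)                           # positive weight for the current token
    out = (a + u * ek * v_t) / (b + u * ek)    # decayed, weighted average of values
    out = out * (1.0 / (1.0 + np.exp(-r_t)))   # gate by sigmoid(receptance)
    new_state = (w * a + ek * v_t, w * b + ek) # decay old sums, add the current token
    return out, new_state

# Tiny usage example with random per-channel projections.
D = 8
rng = np.random.default_rng(0)
w = np.full(D, 0.9)
u = np.full(D, 1.5)
state = (np.zeros(D), np.zeros(D))
for _ in range(5):
    k, v, r = rng.standard_normal((3, D))
    y, state = time_mix_step(k, v, r, state, w, u)
```

Because the weight on token i at time t is just exp(k_i) scaled by a fixed per-channel decay w^(t-i), the same sums can also be formed for every position at once during training (e.g. with a cumulative or convolution-style trick), which is what makes this RNN parallelizable.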
You can download the params & fine-tuning code here:
[https://github.com/BlinkDL/RWKV-v2-RNN-Pile](https://github.com/BlinkDL/RWKV-v2-RNN-Pile)
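The repo's own scripts are the reference for running or fine-tuning the model; the snippet below is only a hypothetical sanity check of the downloaded weights with plain PyTorch (the filename is a placeholder, and it assumes the .pth file is a plain state dict).

```python
import torch

# Hypothetical filename; substitute the actual checkpoint name from the repo's README.
ckpt = torch.load("RWKV-v2-430M.pth", map_location="cpu")

# List parameter names and shapes, and count the total parameters.
total = 0
for name, tensor in ckpt.items():
    total += tensor.numel()
    print(f"{name}: {tuple(tensor.shape)}")
print(f"total parameters: {total / 1e6:.1f}M")
```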
Now I am training an RWKV-2 1.5B (L24-D2048), which is expected to finish in 2 months :)
[https://wandb.ai/blinkdl/RWKV-v2-RNN-Pile](https://wandb.ai/blinkdl/RWKV-v2-RNN-Pile)