[D] Is there MoE implemented for less than 1B total parameters?
April 16, 2024, 9:48 a.m. | /u/FrozenWolf-Cyber
Machine Learning | www.reddit.com
1) Which MoE variant is currently the most widely used and best performing? (Is it sparse MoE? See the sketch after this list for what I mean by that.)
2) How does DeepSpeed's MoE fit into this picture? How well does it perform relative to others? Do any recent performant architectures adopt it?
3) Has anyone tried sparse MoE for an encoder-decoder model, say Flan-T5? Does it work similarly well? If so, why hasn't it been as popular as the decoder-only variants?
4) Has anyone tried MoE for …
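For context on question 1, here is a minimal sketch of what I mean by sparse MoE: a top-k routed layer in the style popularized by Switch Transformer and Mixtral, where each token is processed by only its top-k experts instead of all of them. This is an illustrative PyTorch sketch, not code from any particular library; the class name SparseMoE and the hyperparameters are made up for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Top-k routed mixture-of-experts FFN layer (illustrative sketch)."""

    def __init__(self, d_model: int, d_ff: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten to one token per row for routing
        tokens = x.reshape(-1, x.size(-1))
        logits = self.router(tokens)                    # (n_tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # keep only the top-k experts
        weights = F.softmax(weights, dim=-1)            # renormalize over chosen experts
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            token_ids, slot = (idx == e).nonzero(as_tuple=True)  # tokens routed to expert e
            if token_ids.numel() == 0:
                continue
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])
        return out.reshape_as(x)

# Usage: a small layer that would fit a sub-1B-parameter model.
moe = SparseMoE(d_model=256, d_ff=512, n_experts=8, top_k=2)
y = moe(torch.randn(2, 16, 256))  # -> (2, 16, 256)
```

The point of the sparsity: each token pays the compute cost of only top_k experts, so total parameter count scales with n_experts while per-token FLOPs stay roughly constant.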