Feb. 19, 2024, 4:41 p.m. | /u/Kaldnite

Machine Learning www.reddit.com

I've been doing some reading about Mixture of Experts (MoE) models, and how they penalise the model during training to ensure that the distribution of activations is balanced across all X experts.
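(For reference, the penalty I mean is roughly the auxiliary load-balancing loss used in Switch-Transformer-style MoE training. Below is a minimal sketch of that idea; the names like `load_balancing_loss` and `num_experts` are just illustrative, not from any specific library.)

```python
# Minimal sketch of an MoE load-balancing penalty (Switch-Transformer-style
# auxiliary loss). Names here are illustrative, not from a particular library.
import torch
import torch.nn.functional as F

def load_balancing_loss(router_logits: torch.Tensor, num_experts: int) -> torch.Tensor:
    """router_logits: [num_tokens, num_experts] raw scores from the gating network."""
    probs = F.softmax(router_logits, dim=-1)              # router probabilities per token
    expert_choice = probs.argmax(dim=-1)                   # top-1 expert picked for each token
    # f_i: fraction of tokens actually dispatched to expert i
    dispatch_fraction = F.one_hot(expert_choice, num_experts).float().mean(dim=0)
    # P_i: mean router probability assigned to expert i
    mean_prob = probs.mean(dim=0)
    # This is minimised when both quantities are uniform (1 / num_experts each),
    # which pushes the router toward using all experts roughly equally.
    return num_experts * torch.sum(dispatch_fraction * mean_prob)

# Toy usage: 8 tokens, 4 experts
logits = torch.randn(8, 4)
aux = load_balancing_loss(logits, num_experts=4)
# total_loss = task_loss + aux_weight * aux   (aux_weight is a small coefficient, e.g. ~0.01)
```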

Now that being said, is it reasonable to say that it isn't a case of "this is the Maths expert, and that's the Science expert", but rather a black box of optimised sub-models trained on lots of training data to target different dimensions of the input query?

I'm viewing it more as a …

