Web: http://arxiv.org/abs/2009.10622

May 13, 2022, 1:10 a.m. | TrungTin Nguyen, Hien D Nguyen, Faicel Chamroukhi, Geoffrey J McLachlan

stat.ML updates on arXiv.org

Mixture-of-experts (MoE) models are a popular framework for modeling
heterogeneity in data, for both regression and classification problems in
statistics and machine learning, due to their flexibility and the abundance of
available statistical estimation and model choice tools. Such flexibility comes
from allowing the mixture weights (or gating functions) in the MoE model to
depend on the explanatory variables, along with the experts (or component
densities). This permits the modeling of data arising from more complex data
generating processes when …
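To make the covariate-dependent gating concrete, here is a minimal sketch of a Gaussian mixture-of-experts conditional density with softmax gating. All parameter names and the Gaussian-expert choice are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def moe_density(y, x, gate_W, gate_b, expert_coefs, expert_sigmas):
    """Conditional density p(y | x) for a Gaussian mixture-of-experts.

    The mixture weights (gating functions) are a softmax in x, so they
    depend on the explanatory variables; each expert is a Gaussian
    linear-regression density. Parameter shapes (K experts, d covariates):
      gate_W (K, d), gate_b (K,), expert_coefs (K, d), expert_sigmas (K,).
    """
    # Softmax gating: pi_k(x) proportional to exp(w_k . x + b_k)
    logits = gate_W @ x + gate_b
    logits = logits - logits.max()          # subtract max for numerical stability
    gates = np.exp(logits) / np.exp(logits).sum()

    # Gaussian experts: N(y | beta_k . x, sigma_k^2)
    means = expert_coefs @ x
    dens = np.exp(-0.5 * ((y - means) / expert_sigmas) ** 2) / (
        np.sqrt(2.0 * np.pi) * expert_sigmas
    )

    # Mixture density: sum_k pi_k(x) * N(y | beta_k . x, sigma_k^2)
    return float(gates @ dens)
```

With a single expert and zero coefficients this collapses to the standard normal density, which gives a quick sanity check of the implementation.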

