Web: http://arxiv.org/abs/2104.02640

May 13, 2022, 1:11 a.m. | TrungTin Nguyen, Hien Duy Nguyen, Faicel Chamroukhi, Florence Forbes

cs.LG updates on arXiv.org

Mixtures of experts (MoE) are a popular class of statistical and machine
learning models that have gained attention over the years due to their
flexibility and efficiency. In this work, we consider Gaussian-gated localized
MoE (GLoME) and block-diagonal covariance localized MoE (BLoME) regression
models to represent nonlinear relationships in heterogeneous data with potential
hidden graph-structured interactions between high-dimensional predictors. These
models pose difficult statistical estimation and model selection questions,
both from a computational and a theoretical perspective. This paper is devoted …
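
As a rough guide to the model class discussed above, here is a minimal, hypothetical Python sketch (not the authors' implementation; all parameter names are illustrative) of how the conditional density of a Gaussian-gated localized MoE is commonly written: the gating weights are normalized Gaussian densities over the predictors, and each expert is a Gaussian linear regression of the response on the predictors. The BLoME variant mentioned in the abstract additionally constrains covariance matrices to be block-diagonal, which is how graph-structured interactions between predictors are assumed to enter the model.

    # Hypothetical sketch of a Gaussian-gated localized MoE (GLoME-style)
    # conditional density; parameter names are illustrative, not the paper's.
    import numpy as np
    from scipy.stats import multivariate_normal

    def glome_conditional_density(y, x, pis, gate_means, gate_covs,
                                  expert_intercepts, expert_slopes, expert_covs):
        """Return s(y | x) = sum_k g_k(x) * N(y; a_k + B_k x, Sigma_k),
        where g_k(x) is proportional to pi_k * N(x; c_k, Gamma_k)."""
        K = len(pis)
        # Gaussian gating weights, normalized over the K experts.
        gate_scores = np.array([
            pis[k] * multivariate_normal.pdf(x, mean=gate_means[k], cov=gate_covs[k])
            for k in range(K)
        ])
        gates = gate_scores / gate_scores.sum()
        # Each expert is a Gaussian linear regression of y on x.
        expert_densities = np.array([
            multivariate_normal.pdf(
                y,
                mean=expert_intercepts[k] + expert_slopes[k] @ x,
                cov=expert_covs[k],
            )
            for k in range(K)
        ])
        return float(gates @ expert_densities)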

Tags: arxiv, math, mixture of experts, model, models, model selection
