Oct. 28, 2022, 1:14 a.m. | Szymon Nowakowski, Piotr Pokarowski, Wojciech Rejchel

stat.ML updates on arXiv.org

Sparse modelling or model selection with categorical data is challenging even
for a moderate number of variables, because roughly one parameter is needed to
encode each category or level. The Group Lasso is a well-known, efficient
algorithm for selecting continuous or categorical variables, but all estimates
related to a selected factor usually differ. As a result, the fitted model may not
be sparse, which makes it difficult to interpret. To obtain a sparse
solution of the Group Lasso we propose the …
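A minimal sketch of the behaviour described above, assuming a squared-error loss with the standard Group Lasso penalty fitted by proximal gradient descent (ISTA) on a one-hot-encoded factor; the function and variable names are illustrative and not taken from the paper. It shows that selection happens at the level of the whole factor, and that when a factor is kept, its dummy coefficients are all nonzero and mutually distinct, which is the lack of sparsity the abstract refers to.

    import numpy as np

    def group_lasso_prox_gd(X, y, groups, lam, n_iter=500):
        """Linear regression with a Group Lasso penalty, fitted by proximal
        gradient descent (ISTA). `groups` is a list of column-index arrays,
        one per variable/factor; `lam` is the penalty strength. Illustrative
        sketch only, not the method of the paper."""
        n, p = X.shape
        beta = np.zeros(p)
        # step size = 1 / Lipschitz constant of the smooth part (1/(2n))||y - X beta||^2
        step = n / np.linalg.norm(X, 2) ** 2
        for _ in range(n_iter):
            grad = X.T @ (X @ beta - y) / n          # gradient of the squared-error term
            z = beta - step * grad
            # block soft-thresholding: shrink each group's coefficients jointly
            for g in groups:
                w = np.sqrt(len(g))                  # usual group weight sqrt(p_g)
                norm_g = np.linalg.norm(z[g])
                if norm_g <= step * lam * w:
                    z[g] = 0.0                       # the whole factor is dropped
                else:
                    z[g] *= 1.0 - step * lam * w / norm_g
            beta = z
        return beta

    # Toy data: one 4-level factor (one-hot, 4 dummies) and one continuous variable.
    rng = np.random.default_rng(0)
    n = 200
    levels = rng.integers(0, 4, size=n)
    X_cat = np.eye(4)[levels]                        # one-hot encoding of the factor
    X_num = rng.normal(size=(n, 1))
    X = np.hstack([X_cat, X_num])
    # Levels 0 and 1 share the same true effect, level 3 has none.
    y = X_cat @ np.array([1.0, 1.0, -1.0, 0.0]) + 0.5 * X_num[:, 0] + 0.1 * rng.normal(size=n)

    beta_hat = group_lasso_prox_gd(X, y, groups=[np.arange(4), np.array([4])], lam=0.05)
    print(np.round(beta_hat, 3))   # selected factor: 4 distinct nonzero dummy estimates

In the toy fit, levels 0 and 1 have the same true effect, yet their estimated dummy coefficients differ; merging such levels into a genuinely sparse, interpretable fit is the problem the (truncated) abstract goes on to address.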

