Improving Group Lasso for high-dimensional categorical data. (arXiv:2210.14021v2 [stat.ME] UPDATED)
Oct. 28, 2022, 1:14 a.m. | Szymon Nowakowski, Piotr Pokarowski, Wojciech Rejchel
stat.ML updates on arXiv.org arxiv.org
Sparse modelling or model selection with categorical data is challenging even
for a moderate number of variables, because roughly one parameter is needed to
encode each category or level. The Group Lasso is a well-known, efficient
algorithm for selecting continuous or categorical variables, but all estimates
related to a selected factor usually differ. Therefore, a fitted model may not
be sparse, which makes model interpretation difficult. To obtain a sparse
solution of the Group Lasso we propose the …
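As background for the abstract's point (this is an illustration of the standard Group Lasso, not the authors' proposed method), group selection works through a proximal step that shrinks or zeroes out all coefficients of a factor jointly. A minimal NumPy sketch of that group soft-thresholding operator, applied to dummy-encoded factors:

```python
import numpy as np

def group_soft_threshold(beta, lam):
    """Proximal operator of the group-lasso penalty lam * ||beta||_2.

    The whole group is set to zero when its Euclidean norm is at most
    lam; otherwise every coefficient in the group is shrunk by the same
    multiplicative factor.
    """
    norm = np.linalg.norm(beta)
    if norm <= lam:
        return np.zeros_like(beta)
    return (1.0 - lam / norm) * beta

# Two hypothetical factors, each dummy-encoded with 3 levels
# (one coefficient per level).
weak_group = np.array([0.2, -0.1, 0.15])
strong_group = np.array([2.0, -1.5, 0.5])

print(group_soft_threshold(weak_group, lam=0.5))    # entire factor removed
print(group_soft_threshold(strong_group, lam=0.5))  # factor kept, shrunk
```

Note that when a factor survives selection, its per-level coefficients remain distinct (here they are all scaled by the same factor), which is exactly why a Group Lasso fit is not sparse *within* a selected factor — the problem the abstract sets out to address.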
More from arxiv.org / stat.ML updates on arXiv.org
Learning linear dynamical systems under convex constraints
3 days, 7 hours ago
arxiv.org
Inverse Unscented Kalman Filter
4 days, 7 hours ago
arxiv.org