Web: http://arxiv.org/abs/2206.08598

June 20, 2022, 1:10 a.m. | Pascal Mattia Esser, Frank Nielsen

cs.LG updates on arXiv.org

A common way to learn and analyze statistical models is to consider
operations in the model parameter space. But what happens if we optimize in the
parameter space and there is no one-to-one mapping between the parameter space
and the underlying statistical model space? Such cases frequently occur for
hierarchical models, which include statistical mixtures or stochastic neural
networks, and these models are said to be singular. Singular models reveal
several important and well-studied problems in machine learning like the …
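The non-identifiability the abstract refers to can be seen in a toy example (not from the paper): in a two-component univariate Gaussian mixture, swapping the components, or collapsing them onto the same mean, produces distinct parameter vectors that describe the exact same distribution. The helper `mixture_pdf` below is a hypothetical illustration, a minimal sketch assuming NumPy and SciPy are available.

```python
# Minimal sketch (not from the paper) of why mixture models are singular:
# distinct points in parameter space can map to the same statistical model,
# so the parameter-to-model mapping is not one-to-one.
import numpy as np
from scipy.stats import norm

def mixture_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture evaluated at points x."""
    return sum(w * norm.pdf(x, loc=m, scale=s)
               for w, m, s in zip(weights, means, stds))

x = np.linspace(-5.0, 5.0, 201)

# Parameterization A: components (weight 0.3, mean -1) and (weight 0.7, mean 2).
pdf_a = mixture_pdf(x, weights=[0.3, 0.7], means=[-1.0, 2.0], stds=[1.0, 1.0])

# Parameterization B: same components in swapped order -- a different point in
# parameter space, yet the resulting density is identical (label switching).
pdf_b = mixture_pdf(x, weights=[0.7, 0.3], means=[2.0, -1.0], stds=[1.0, 1.0])

# Parameterizations C and C': equal means with arbitrary weight splits -- a whole
# continuum of parameters collapsing onto one effectively single-component model.
pdf_c  = mixture_pdf(x, weights=[0.5, 0.5], means=[0.0, 0.0], stds=[1.0, 1.0])
pdf_c2 = mixture_pdf(x, weights=[0.9, 0.1], means=[0.0, 0.0], stds=[1.0, 1.0])

print(np.allclose(pdf_a, pdf_b))   # True: swapped labels, same model
print(np.allclose(pdf_c, pdf_c2))  # True: degenerate subspace of parameters
```

Both checks print `True`: many parameter settings correspond to one model, which is the kind of degeneracy that makes such mixtures singular in the sense the abstract describes.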
