Web: http://arxiv.org/abs/2206.08873

June 20, 2022, 1:11 a.m. | Pierre-Cyril Aubin-Frankowski, Anna Korba, Flavien Léger

cs.LG updates on arXiv.org

Many problems in machine learning can be formulated as optimizing a convex
functional over a space of measures. This paper studies the convergence of the
mirror descent algorithm in this infinite-dimensional setting. Defining Bregman
divergences through directional derivatives, we derive the convergence of the
scheme for relatively smooth and strongly convex pairs of functionals. Applying
our result to joint distributions and the Kullback-Leibler (KL) divergence, we
show that Sinkhorn's primal iterations for entropic optimal transport in the
continuous setting correspond …
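The abstract mentions Sinkhorn's iterations for entropic optimal transport; as a rough illustration (not taken from the paper, and in the discrete rather than continuous setting), the classical alternating-scaling form of Sinkhorn's algorithm can be sketched as:

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=200):
    """Entropic OT between discrete distributions a, b with cost matrix C.

    Alternating marginal-matching updates (Sinkhorn's algorithm);
    illustrative sketch only, with hypothetical parameter choices.
    """
    K = np.exp(-C / eps)          # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        v = b / (K.T @ u)         # scale to match column marginal b
        u = a / (K @ v)           # scale to match row marginal a
    return u[:, None] * K * v[None, :]  # transport plan

# Usage: uniform marginals on 3 points, squared-distance cost
a = np.ones(3) / 3
b = np.ones(3) / 3
C = (np.arange(3)[:, None] - np.arange(3)[None, :]) ** 2.0
P = sinkhorn(a, b, C)
```

Each half-update is a KL projection onto one marginal constraint, which is the mirror-descent viewpoint the paper develops in the continuous setting.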

