Web: http://arxiv.org/abs/2205.02910

May 9, 2022, 1:11 a.m. | Yu-Jui Huang, Yuchong Zhang

cs.LG updates on arXiv.org

This paper approaches the unsupervised learning problem by gradient descent
in the space of probability density functions. Our main result shows that along
the gradient flow induced by a distribution-dependent ordinary differential
equation (ODE), the unknown data distribution emerges as the long-time limit of
this flow of densities. That is, one can uncover the data distribution by
simulating the distribution-dependent ODE. Intriguingly, we find that the
simulation of the ODE is equivalent to the training of generative adversarial
networks (GANs). …
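The core idea above can be illustrated with a toy particle simulation. The sketch below is an assumption-laden stand-in, not the paper's actual ODE: it takes the data distribution to be a known 1D Gaussian N(2, 1), fits the current particle density with a Gaussian (its empirical mean and variance), and Euler-integrates a velocity field that depends on that fitted density. Because the velocity at each step depends on the current distribution of particles, the ODE is distribution-dependent, and the particle cloud drifts toward the data distribution as its long-time limit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: the "data" distribution is N(2, 1).
# The velocity field depends on the *current* particle distribution
# through a Gaussian fit, making the ODE distribution-dependent.
mu_data, var_data = 2.0, 1.0

# Particles initialized far from the target distribution.
x = rng.normal(-3.0, 0.5, size=5000)

dt = 0.01
for _ in range(2000):
    m, v = x.mean(), x.var() + 1e-8           # Gaussian fit of current density
    score_target = -(x - mu_data) / var_data  # grad log p_data (known here)
    score_current = -(x - m) / v              # grad log p_t (estimated)
    x += dt * (score_target - score_current)  # Euler step of the density flow

print(x.mean(), x.var())  # particle cloud approaches N(2, 1)
```

Under the Gaussian fit, the particle mean obeys dm/dt = -(m - 2) and the variance obeys ds²/dt = 2(1 - s²), so both converge to the data distribution's parameters; this mirrors, in miniature, the abstract's claim that simulating the flow uncovers the unknown distribution.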

