Web: http://arxiv.org/abs/2209.06853

Sept. 16, 2022, 1:11 a.m. | Xinwei Shen, Kani Chen, Tong Zhang

cs.LG updates on arXiv.org

Generative Adversarial Networks (GANs) have achieved great success in data
generation. However, their statistical properties are not fully understood. In
this paper, we consider the statistical behavior of the general $f$-divergence
formulation of GANs, which includes the Kullback--Leibler divergence that is
closely related to the maximum likelihood principle. We show that for
correctly specified parametric generative models, all $f$-divergence
GANs with the same discriminator class are asymptotically equivalent under
suitable regularity conditions. Moreover, with an appropriately chosen local …

