March 14, 2024, 4:42 a.m. | Marcus Häggbom, Morten Karlsmark, Joakim Andén

cs.LG updates on arXiv.org

arXiv:2403.08362v1 Announce Type: cross
Abstract: Microcanonical gradient descent is a sampling procedure for energy-based models that allows efficient sampling of high-dimensional distributions. It works by transporting samples from a high-entropy distribution, such as Gaussian white noise, to a low-energy region using gradient descent. We place this model in the framework of normalizing flows, showing how it can often overfit by losing an unnecessary amount of entropy during the descent. As a remedy, we propose a mean-field microcanonical gradient …

Subjects: cs.LG, q-fin.ST, stat.CO, stat.ML
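
The procedure described in the abstract is straightforward to sketch: draw samples from Gaussian white noise (the high-entropy initialisation) and run gradient descent on an energy function until the samples reach a low-energy region. The NumPy example below is a minimal sketch under assumptions of my own: the toy energy (squared deviation of each sample's empirical mean and variance from target values), the function names, the step size, and the iteration count are all illustrative and are not taken from the paper.

```python
import numpy as np

def energy(x, target_mean=0.0, target_var=2.0):
    """Toy per-sample energy: squared deviation of empirical mean and variance
    from target statistics. Illustrative only, not the energy used in the paper."""
    return (x.mean(axis=1) - target_mean) ** 2 + (x.var(axis=1) - target_var) ** 2

def energy_grad(x, target_mean=0.0, target_var=2.0):
    """Analytic gradient of the toy energy with respect to each sample."""
    n = x.shape[1]
    mu = x.mean(axis=1, keepdims=True)
    var = x.var(axis=1, keepdims=True)
    grad_mean = 2.0 * (mu - target_mean) / n
    grad_var = 4.0 * (var - target_var) * (x - mu) / n
    return grad_mean + grad_var

def microcanonical_gradient_descent(n_samples=64, dim=128, lr=0.1, n_steps=500, seed=0):
    """Transport Gaussian white-noise samples toward the low-energy region
    by plain gradient descent on the energy."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, dim))  # high-entropy initialisation
    for _ in range(n_steps):
        x = x - lr * energy_grad(x)            # descent step
    return x

if __name__ == "__main__":
    x = microcanonical_gradient_descent()
    print("mean energy after descent:", float(energy(x).mean()))
```

Because the descent is deterministic, the resulting samples carry less entropy than the white-noise initialisation; this entropy loss is the overfitting issue the abstract points to and the motivation for the proposed mean-field variant.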
