April 2, 2024, 7:43 p.m. | Yuan Gao, Jian Huang, Yuling Jiao, Shurong Zheng

cs.LG updates on arXiv.org

arXiv:2404.00551v1 Announce Type: cross
Abstract: Continuous normalizing flows (CNFs) are a generative method for learning probability distributions that is based on ordinary differential equations. The method has shown remarkable empirical success across various applications, including large-scale image synthesis, protein structure prediction, and molecule generation. In this work, we study the theoretical properties of CNFs with linear interpolation in learning probability distributions from a finite random sample, using a flow matching objective function. We establish non-asymptotic error bounds for the distribution …
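To make the setup concrete, below is a minimal sketch of the flow matching objective with a linear interpolation path, as commonly used for training CNFs: samples are interpolated as x_t = (1 - t) x_0 + t x_1 between noise x_0 and data x_1, and a network is regressed onto the target velocity x_1 - x_0. This is an illustrative sketch, not the paper's exact training procedure; the names `VelocityNet` and `flow_matching_loss` and all hyperparameters are assumptions.

```python
# Sketch of conditional flow matching with linear interpolation (illustrative,
# not the authors' implementation; architecture and hyperparameters are assumed).
import torch
import torch.nn as nn

class VelocityNet(nn.Module):
    """Small MLP that predicts the velocity field v_theta(x_t, t)."""
    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x_t, t], dim=-1))

def flow_matching_loss(model: nn.Module, x1: torch.Tensor) -> torch.Tensor:
    """Flow matching loss with the linear path x_t = (1 - t) x0 + t x1,
    whose conditional target velocity is x1 - x0."""
    x0 = torch.randn_like(x1)               # base (noise) sample
    t = torch.rand(x1.shape[0], 1)          # time drawn uniformly on [0, 1]
    x_t = (1.0 - t) * x0 + t * x1           # linear interpolation
    target = x1 - x0                        # target velocity along the path
    return ((model(x_t, t) - target) ** 2).mean()

if __name__ == "__main__":
    # Fit the velocity field on a finite random sample (stand-in data here).
    data = torch.randn(512, 2) * 0.5 + 2.0
    model = VelocityNet(dim=2)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(200):
        opt.zero_grad()
        loss = flow_matching_loss(model, data)
        loss.backward()
        opt.step()
```

After training, new samples would be drawn by integrating the learned ODE dx/dt = v_theta(x, t) from t = 0 to t = 1 starting at Gaussian noise; the paper's analysis concerns how well the resulting distribution approximates the data distribution given only a finite sample.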

