Generalization in diffusion models arises from geometry-adaptive harmonic representations
March 19, 2024, 4:45 a.m. | Zahra Kadkhodaie, Florentin Guth, Eero P. Simoncelli, Stéphane Mallat
cs.LG updates on arXiv.org
Abstract: Deep neural networks (DNNs) trained for image denoising are able to generate high-quality samples with score-based reverse diffusion algorithms. These impressive capabilities seem to imply an escape from the curse of dimensionality, but recent reports of memorization of the training set raise the question of whether these networks are learning the "true" continuous density of the data. Here, we show that two DNNs trained on non-overlapping subsets of a dataset learn nearly the same score …
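The link between a trained denoiser and score-based sampling, which the abstract relies on, can be sketched in a few lines. This is an illustrative toy, not the paper's exact algorithm: it uses Miyasawa's identity (the score of the noisy density equals the denoiser residual divided by the noise variance) together with standard annealed Langevin dynamics, and replaces the trained DNN with the closed-form MMSE denoiser for a 1-D Gaussian "dataset" so the example is self-contained. All variable names and schedule constants are assumptions for the sketch.

```python
import numpy as np

mu, sigma2 = 2.0, 0.5  # toy data distribution: x ~ N(mu, sigma2)

def denoiser(y, noise_var):
    """MMSE denoiser for y = x + n with x ~ N(mu, sigma2), n ~ N(0, noise_var).
    Stands in for a trained DNN denoiser in this toy setting."""
    return mu + sigma2 / (sigma2 + noise_var) * (y - mu)

def score(y, noise_var):
    """Miyasawa's identity: grad_y log p(y) = (denoiser(y) - y) / noise_var."""
    return (denoiser(y, noise_var) - y) / noise_var

rng = np.random.default_rng(0)
y = rng.normal(0.0, 3.0, size=2000)         # initial samples, far from the data
for nv in np.geomspace(4.0, 1e-3, 30):      # anneal the noise level downward
    eps = 0.2 * nv                          # step size shrinks with the noise
    for _ in range(30):                     # Langevin steps at this noise level
        y = y + eps * score(y, nv) + np.sqrt(2 * eps) * rng.normal(size=y.shape)

# The samples should now approximate the data distribution N(mu, sigma2).
print(y.mean(), y.var())
```

With the denoiser swapped for a DNN trained on images, the same residual-to-score construction underlies the reverse diffusion samplers the abstract refers to.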