Lossy Compression with Gaussian Diffusion. (arXiv:2206.08889v1 [stat.ML])
Web: http://arxiv.org/abs/2206.08889
June 20, 2022, 1:12 a.m. | Lucas Theis, Tim Salimans, Matthew D. Hoffman, Fabian Mentzer
stat.ML updates on arXiv.org
We describe a novel lossy compression approach called DiffC which is based on
unconditional diffusion generative models. Unlike modern compression schemes
which rely on transform coding and quantization to restrict the transmitted
information, DiffC relies on the efficient communication of pixels corrupted by
Gaussian noise. We implement a proof of concept and find that it works
surprisingly well despite the lack of an encoder transform, outperforming the
state-of-the-art generative compression method HiFiC on ImageNet 64x64. DiffC
only uses a single …
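The core idea sketched in the abstract is that, instead of quantizing transform coefficients, DiffC communicates a Gaussian-noise-corrupted version of the image, i.e. a sample from the forward (noising) process of a diffusion model. A minimal NumPy sketch of that forward process is shown below; the linear variance schedule and image shape are illustrative assumptions, not the authors' configuration, and the actual channel-coding step of DiffC is not shown.

```python
import numpy as np

def forward_diffuse(x, t, betas, rng):
    """Sample z_t ~ q(z_t | x) from a diffusion forward process:
    z_t = sqrt(alpha_bar_t) * x + sqrt(1 - alpha_bar_t) * eps,
    where alpha_bar_t = prod_{s<=t} (1 - beta_s)."""
    alpha_bar = np.cumprod(1.0 - betas)[t]
    eps = rng.standard_normal(x.shape)  # Gaussian corruption noise
    return np.sqrt(alpha_bar) * x + np.sqrt(1.0 - alpha_bar) * eps

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 1000)         # illustrative linear schedule
x = rng.uniform(-1.0, 1.0, size=(64, 64, 3))  # stand-in for a normalized image
z = forward_diffuse(x, t=500, betas=betas, rng=rng)
```

At small `t` the sample `z` is close to the image; at large `t` it approaches pure Gaussian noise, which is what makes the corrupted pixels cheap to communicate.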