Approaching Rate-Distortion Limits in Neural Compression with Lattice Transform Coding
March 13, 2024, 4:42 a.m. | Eric Lei, Hamed Hassani, Shirin Saeedi Bidokhti
cs.LG updates on arXiv.org
Abstract: Neural compression has brought tremendous progress in designing lossy compressors with good rate-distortion (RD) performance at low complexity. Thus far, neural compression design has involved transforming the source into a latent vector, which is then rounded to integers and entropy coded. While this approach has been shown to be optimal in a one-shot sense on certain sources, we show that it is highly sub-optimal on i.i.d. sequences, and in fact always recovers scalar quantization of the …
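The pipeline the abstract describes (transform the source into a latent vector, round each coordinate to the nearest integer, entropy-code the result) amounts to scalar quantization in the latent space, i.e., quantizing onto the integer lattice Z^n. Below is a minimal Python sketch contrasting that rounding step with a generic lattice quantizer, using the 2-D hexagonal (A2) lattice purely for illustration; the function names, the generator matrix, and the local-search nearest-point routine are assumptions made for this sketch, not the paper's Lattice Transform Coding implementation.

import numpy as np

def scalar_quantize(latent: np.ndarray) -> np.ndarray:
    """Standard neural-compression quantizer: round each latent
    coordinate to the nearest integer (quantize onto Z^n)."""
    return np.round(latent)

def lattice_quantize(x: np.ndarray, G: np.ndarray, search: int = 1) -> np.ndarray:
    """Nearest-point quantizer for the lattice {z @ G : z integer},
    where the rows of G are the lattice basis vectors. Uses rounding
    in lattice coordinates plus a small local search over integer
    offsets (a simple heuristic, adequate for this 2-D toy example)."""
    u = np.linalg.solve(G.T, x)      # lattice coordinates of x (u @ G == x)
    base = np.round(u)
    best, best_d = None, np.inf
    for da in range(-search, search + 1):
        for db in range(-search, search + 1):
            cand = (base + np.array([da, db])) @ G
            d = float(np.sum((x - cand) ** 2))
            if d < best_d:
                best, best_d = cand, d
    return best

# Hexagonal (A2) lattice generator, rescaled to unit cell volume so the
# comparison with Z^2 happens at (roughly) the same rate.
G_hex = np.array([[1.0, 0.0],
                  [0.5, np.sqrt(3.0) / 2.0]])
G_hex /= np.sqrt(np.abs(np.linalg.det(G_hex)))   # det was sqrt(3)/2

rng = np.random.default_rng(0)
x = rng.normal(size=2)               # stand-in for a 2-D latent vector
print("input        :", x)
print("scalar (Z^2) :", scalar_quantize(x))
print("lattice (A2) :", lattice_quantize(x, G_hex))

The intuition, consistent with the abstract's claim, is that coordinate-wise rounding fixes the quantization cells to hypercubes, whereas a lattice quantizer can use better-shaped cells (hexagons in 2-D), which matters for RD performance on i.i.d. sequences.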
Jobs in AI, ML, Big Data
Founding AI Engineer, Agents @ Occam AI | New York
AI Engineer Intern, Agents @ Occam AI | US
AI Research Scientist @ Vara | Berlin, Germany and Remote
Data Architect @ University of Texas at Austin | Austin, TX
Data ETL Engineer @ University of Texas at Austin | Austin, TX
Sr. Software Development Manager, AWS Neuron Machine Learning Distributed Training @ Amazon.com | Cupertino, California, USA