Multi-Scale Architectures Matter: On the Adversarial Robustness of Flow-based Lossless Compression. (arXiv:2208.12716v1 [cs.CV])
Aug. 29, 2022, 1:14 a.m. | Yi-chong Xia, Bin Chen, Yan Feng, Tian-shuo Ge
cs.CV updates on arXiv.org arxiv.org
As a probabilistic modeling technique, the flow-based model has demonstrated remarkable potential in the field of lossless compression \cite{idf,idf++,lbb,ivpf,iflow}. Compared with other deep generative models (e.g., autoregressive models, VAEs) \cite{bitswap,hilloc,pixelcnn++,pixelsnail} that explicitly model the data distribution probabilities, flow-based models perform better due to their excellent probability density estimation and satisfactory inference speed. In flow-based models, the multi-scale architecture provides a shortcut from the shallow layers to the output layer, which significantly reduces the computational complexity and avoids performance degradation when adding more …
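The multi-scale shortcut mentioned above can be illustrated with a toy sketch. In RealNVP/Glow-style flows, each scale factors out half of the variables and routes them straight to the output, so deeper layers operate on progressively fewer dimensions. The code below is a minimal illustration of that splitting scheme, not the paper's implementation; all names are hypothetical.

```python
# Toy sketch of the multi-scale "split" used in flow-based models
# (RealNVP/Glow style): at each scale, half the variables are factored
# out early (the shortcut to the output layer), so deeper scales touch
# fewer dimensions. Illustrative only; names are not from the paper.

def multiscale_forward(x, num_scales):
    """Route x through `num_scales` scales, splitting off half the
    dimensions at each scale. Returns the output latents and the total
    number of dimensions the scale-wise layers had to process."""
    factored_out = []      # latents sent directly to the output
    h = list(x)
    dims_processed = 0     # proxy for computational cost
    for _ in range(num_scales - 1):
        dims_processed += len(h)
        half = len(h) // 2
        factored_out.extend(h[:half])  # early exit: shortcut to output
        h = h[half:]                   # deeper layers see only the rest
    dims_processed += len(h)           # final scale processes what's left
    factored_out.extend(h)
    return factored_out, dims_processed

z, cost = multiscale_forward(list(range(16)), num_scales=4)
# Without splitting, 4 scales over 16 dims would touch 4 * 16 = 64 dims;
# with multi-scale splits the work shrinks per scale: 16 + 8 + 4 + 2 = 30.
print(cost)    # 30
print(len(z))  # 16: every input dimension still reaches the output,
               # which is what makes the transform usable for lossless coding
```

Because the mapping remains bijective (no dimension is discarded, only rerouted), the factored-out latents can still be entropy-coded and inverted exactly, which is the property lossless compression relies on.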