Finding the Task-Optimal Low-Bit Sub-Distribution in Deep Neural Networks. (arXiv:2112.15139v2 [cs.CV] UPDATED)
Jan. 4, 2022, 9:10 p.m. | Runpei Dong, Zhanhong Tan, Mengdi Wu, Linfeng Zhang, Kaisheng Ma
cs.CV updates on arXiv.org arxiv.org
Quantized neural networks typically have smaller memory footprints and
lower computational complexity, which is crucial for efficient deployment.
However, quantization inevitably causes the weight distribution to diverge from that of the
original network, which generally degrades performance. Massive efforts
have been made to tackle this issue, but most existing approaches lack
statistical considerations and rely on several manual configurations. In this
paper, we present an adaptive-mapping quantization method to learn an optimal
latent sub-distribution that is inherent within models and smoothly
approximated …
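The distribution divergence the abstract refers to can be illustrated with a minimal sketch of plain symmetric uniform quantization (this is a generic baseline for illustration, not the paper's adaptive-mapping method; the function name and bit widths are assumptions):

```python
# Illustrative sketch only: symmetric uniform quantization of a weight
# tensor, showing how coarser bit widths push the quantized distribution
# further from the original. NOT the adaptive-mapping method of the paper.
import numpy as np

def uniform_quantize(w, bits=4):
    """Quantize `w` symmetrically to `bits` bits, then de-quantize."""
    qmax = 2 ** (bits - 1) - 1           # e.g. 7 for 4-bit
    scale = np.abs(w).max() / qmax       # map the max magnitude onto qmax
    q = np.clip(np.round(w / scale), -qmax, qmax)
    return q * scale                     # back to the real-valued domain

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=10_000)    # toy Gaussian-distributed weights

# Mean squared quantization error grows as the bit width shrinks,
# i.e. the quantized distribution diverges more from the original.
err4 = np.mean((w - uniform_quantize(w, bits=4)) ** 2)
err2 = np.mean((w - uniform_quantize(w, bits=2)) ** 2)
print(err2 > err4)  # → True: 2-bit diverges more than 4-bit
```

Methods like the one proposed here aim to reduce exactly this divergence by learning the mapping instead of fixing it by hand.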