May 23, 2022, 1:12 a.m. | Maxwell Horton, Yanzi Jin, Ali Farhadi, Mohammad Rastegari

cs.CV updates on arXiv.org

We present a computationally efficient method for compressing a trained
neural network without using real data. We break the problem of data-free
network compression into independent layer-wise compressions and show how to
efficiently generate layer-wise training data using only the pretrained
network. We use this data to compress each layer of the pretrained network
independently, and we show how to precondition the network to improve the
accuracy of our layer-wise compression method. We present results for
layer-wise compression using quantization …
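
The layer-wise idea lends itself to a compact illustration. The sketch below, assuming PyTorch and a purely sequential model, generates synthetic inputs in place of real data, collects each linear layer's input/output pair from the frozen pretrained network, and replaces the layer's weights with a simple uniformly quantized approximation whose reconstruction error is reported. The Gaussian synthetic inputs, the quantize_weight helper, and per-channel symmetric quantization are illustrative assumptions, not the authors' exact method.

```python
# Hypothetical sketch: layer-wise, data-free quantization of a pretrained network.
# Synthetic Gaussian inputs and simple per-channel min/max quantization are
# illustrative assumptions, not the method described in the paper.
import torch
import torch.nn as nn

def quantize_weight(w: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """Uniformly quantize a weight tensor per output channel (symmetric)."""
    qmax = 2 ** (num_bits - 1) - 1
    w_flat = w.reshape(w.shape[0], -1)
    scale = w_flat.abs().max(dim=1, keepdim=True).values.clamp(min=1e-8) / qmax
    q = torch.round(w_flat / scale).clamp(-qmax, qmax)
    return (q * scale).reshape(w.shape)

def compress_layerwise(model: nn.Sequential, num_samples: int = 256, num_bits: int = 8):
    """Compress each linear layer independently using only synthetic data.

    Synthetic inputs are propagated through the frozen pretrained layers to
    obtain layer-wise training data; each nn.Linear layer's weights are then
    replaced with a quantized approximation and its reconstruction error on
    that data is reported.
    """
    model.eval()
    x = torch.randn(num_samples, model[0].in_features)  # synthetic "data-free" inputs
    with torch.no_grad():
        for layer in model:
            if isinstance(layer, nn.Linear):
                target = layer(x)                      # layer-wise training target
                layer.weight.copy_(quantize_weight(layer.weight, num_bits))
                err = (layer(x) - target).pow(2).mean()
                print(f"{layer}: reconstruction MSE = {err.item():.6f}")
            x = layer(x)                               # inputs for the next layer

if __name__ == "__main__":
    net = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))
    compress_layerwise(net)
```

Because the per-layer problems are independent, they could in principle be solved in parallel; the sequential loop above simply reuses each layer's output as the next layer's synthetic input for brevity.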

arxiv cnn compression cv data free