Mind the Gap in Distilling StyleGANs. (arXiv:2208.08840v1 [cs.CV])
Aug. 19, 2022, 1:12 a.m. | Guodong Xu, Yuenan Hou, Ziwei Liu, Chen Change Loy
cs.CV updates on arXiv.org arxiv.org
The StyleGAN family is one of the most popular Generative Adversarial Networks
(GANs) for unconditional generation. Despite its impressive performance, its
high storage and computation demands impede deployment on
resource-constrained devices. This paper provides a comprehensive study of
distilling from the popular StyleGAN-like architecture. Our key insight is that
the main challenge of StyleGAN distillation lies in the output discrepancy
issue, where the teacher and student models yield different outputs given the
same input latent code. Standard knowledge distillation …
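The output-discrepancy issue can be illustrated with a toy sketch: below, `teacher` and `student` are hypothetical stand-in "generators" (simple element-wise scalings, not real StyleGAN networks), and the distillation loss is a plain MSE between their outputs for the same latent code. Actual StyleGAN distillation compares generated images, typically with perceptual losses rather than raw MSE; all names here are illustrative assumptions.

```python
# Toy sketch of output-matching distillation between two "generators".
# Both map the same latent code z to an output; because their weights
# differ, the outputs diverge -- the output-discrepancy issue.

def teacher(z):
    # hypothetical large teacher generator: fixed scaling of the latent
    return [2.0 * v for v in z]

def student(z):
    # hypothetical compressed student with slightly different weights
    return [1.5 * v for v in z]

def distill_loss(z):
    # mean squared error between teacher and student outputs
    t, s = teacher(z), student(z)
    return sum((a - b) ** 2 for a, b in zip(t, s)) / len(t)

z = [1.0, -2.0, 0.5]          # shared input latent code
print(distill_loss(z))        # nonzero: same z, different outputs
```

Minimizing such a loss over many sampled latent codes is the basic knowledge-distillation objective; the paper's point is that for StyleGANs this naive matching is complicated by how far apart the two models' outputs start.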