Data Upcycling Knowledge Distillation for Image Super-Resolution
April 5, 2024, 4:45 a.m. | Yun Zhang, Wei Li, Simiao Li, Hanting Chen, Zhijun Tu, Wenjia Wang, Bingyi Jing, Shaohui Lin, Jie Hu
cs.CV updates on arXiv.org
Abstract: Knowledge distillation (KD) compresses deep neural networks by transferring task-related knowledge from cumbersome pre-trained teacher models to compact student models. However, current KD methods for super-resolution (SR) networks overlook the nature of the SR task: the outputs of the teacher model are noisy approximations of the ground-truth distribution of high-quality images (GT), which obscures the teacher model's knowledge and limits the effectiveness of KD. To utilize the teacher model beyond the GT upper bound, we present …
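The excerpt cuts off before describing the proposed method, but the setup it critiques is conventional output-space KD for SR, where a compact student is trained to match both the ground-truth HR image and a frozen teacher's prediction. Below is a minimal sketch of that baseline, assuming PyTorch; the L1 losses and the weighting factor `alpha` are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def kd_sr_step(student, teacher, lr_img, hr_img, alpha=0.5):
    """One training step of conventional output-space KD for super-resolution.

    The student is supervised by the ground-truth HR image and, additionally,
    by the teacher's SR output -- which, as the abstract notes, is itself only
    a noisy approximation of the GT distribution.
    """
    with torch.no_grad():
        teacher_sr = teacher(lr_img)              # frozen teacher prediction
    student_sr = student(lr_img)
    loss_gt = F.l1_loss(student_sr, hr_img)       # reconstruction vs. ground truth
    loss_kd = F.l1_loss(student_sr, teacher_sr)   # imitation of teacher output
    return alpha * loss_gt + (1 - alpha) * loss_kd
```

Because `loss_kd` pulls the student toward the teacher's noisy outputs, the teacher effectively caps student quality at (or below) its own, which is the limitation the abstract's "beyond the GT upper bound" framing targets.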