MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution
April 16, 2024, 4:48 a.m. | Yuxuan Jiang, Chen Feng, Fan Zhang, David Bull
cs.CV updates on arXiv.org
Abstract: Knowledge distillation (KD) has emerged as a promising technique in deep learning, typically employed to enhance a compact student network by learning from its high-performance but more complex teacher variant. When applied to image super-resolution, most KD approaches are modified versions of methods developed for other computer vision tasks, based on training strategies with a single teacher and simple loss functions. In this paper, we propose a novel Multi-Teacher Knowledge …
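The abstract is cut off before the paper's MTKD method is described, so the following is not the authors' approach: it is a minimal sketch of the generic idea of multi-teacher distillation for super-resolution, assuming PyTorch, an L1 reconstruction loss (common in SR), and a simple averaging of teacher outputs as the fusion step. The function name, the loss weighting alpha, and the averaging strategy are all illustrative assumptions.

# A minimal multi-teacher KD sketch for super-resolution. This is NOT the
# paper's MTKD method (the abstract is truncated); the averaging fusion and
# the loss weights are illustrative assumptions.
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_sr, teacher_srs, hr_target, alpha=0.5):
    """Combine a reconstruction loss against the ground-truth HR image
    with a distillation loss against an aggregate of teacher outputs."""
    # Reconstruction term: L1 between student output and ground truth.
    recon = F.l1_loss(student_sr, hr_target)
    # Distillation term: L1 between student output and the mean teacher
    # prediction. Averaging is one simple way to fuse multiple teachers.
    teacher_agg = torch.stack(teacher_srs, dim=0).mean(dim=0)
    distill = F.l1_loss(student_sr, teacher_agg)
    return (1 - alpha) * recon + alpha * distill

# Usage with dummy tensors standing in for the model outputs:
if __name__ == "__main__":
    b, c, h, w = 2, 3, 64, 64
    student_sr = torch.rand(b, c, h, w, requires_grad=True)
    teacher_srs = [torch.rand(b, c, h, w) for _ in range(3)]  # 3 teachers
    hr_target = torch.rand(b, c, h, w)
    loss = multi_teacher_kd_loss(student_sr, teacher_srs, hr_target)
    loss.backward()
    print(f"combined loss: {loss.item():.4f}")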