Web: http://arxiv.org/abs/2206.08224

June 17, 2022, 1:13 a.m. | Panpan Zou, Yinglei Teng, Tao Niu

cs.CV updates on arXiv.org

Online knowledge distillation conducts knowledge transfer among all student
models to alleviate the reliance on pre-trained models. However, existing
online methods rely heavily on prediction distributions and neglect further
exploration of representational knowledge. In this paper, we propose a novel
Multi-scale Feature Extraction and Fusion method (MFEF) for online knowledge
distillation, which comprises three key components, Multi-scale Feature
Extraction, Dual-attention, and Feature Fusion, to generate more informative
feature maps for distillation. The multi-scale feature extraction, exploiting
divide-and-concatenate …
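Since the abstract is truncated here, the paper's exact design is not given. As a rough illustration only, below is a minimal PyTorch sketch of what a divide-and-concatenate multi-scale extraction block could look like; the module name, the use of dilated convolutions per branch, and all parameters are assumptions made for this sketch, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MultiScaleExtraction(nn.Module):
    """Hypothetical divide-and-concatenate block: the input feature map is
    split along the channel axis, each chunk is convolved at a different
    receptive field, and the results are concatenated back together."""

    def __init__(self, channels: int, num_splits: int = 4):
        super().__init__()
        assert channels % num_splits == 0
        split = channels // num_splits
        # One 3x3 conv per chunk; increasing dilation gives each branch a
        # different effective scale (an assumption, not the paper's spec).
        self.branches = nn.ModuleList(
            nn.Conv2d(split, split, kernel_size=3, padding=d, dilation=d)
            for d in range(1, num_splits + 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        chunks = torch.chunk(x, len(self.branches), dim=1)   # divide
        outs = [branch(c) for branch, c in zip(self.branches, chunks)]
        return torch.cat(outs, dim=1)                        # concatenate

# Usage: a feature map from a student backbone, e.g. (N, 64, 32, 32)
feats = torch.randn(2, 64, 32, 32)
print(MultiScaleExtraction(64)(feats).shape)  # torch.Size([2, 64, 32, 32])
```

Splitting before convolving keeps the parameter count low while letting each branch see a different scale; the concatenated output preserves the input shape, so such a block could slot between a backbone stage and a distillation head.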

