Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation. (arXiv:2206.08224v1 [cs.CV])
cs.CV updates on arXiv.org
Online knowledge distillation conducts knowledge transfer among all student
models to alleviate the reliance on pre-trained models. However, existing
online methods rely heavily on prediction distributions and neglect deeper
exploration of representational knowledge. In this paper, we
propose a novel Multi-scale Feature Extraction and Fusion method (MFEF) for
online knowledge distillation, which comprises three key components:
Multi-scale Feature Extraction, Dual-attention, and Feature Fusion, which
together generate more informative feature maps for distillation. The
multi-scale feature extraction exploits divide-and-concatenate …
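The abstract only names the components, so as an illustration, here is a minimal sketch of what a multi-scale "extract and concatenate" step can look like: pool a feature map at several scales, upsample each pooled map back, and concatenate along the channel axis. All function names and the choice of average pooling are hypothetical stand-ins for intuition, not the paper's actual MFEF operations.

```python
import numpy as np

def avg_pool2d(x, k):
    # Average-pool a (C, H, W) feature map with a k x k window, stride k.
    c, h, w = x.shape
    return x[:, :h - h % k, :w - w % k].reshape(c, h // k, k, w // k, k).mean(axis=(2, 4))

def upsample2d(x, k):
    # Nearest-neighbour upsampling by an integer factor k.
    return x.repeat(k, axis=1).repeat(k, axis=2)

def multi_scale_concat(x, scales=(1, 2, 4)):
    # Extract the map at several scales, restore spatial size, and
    # concatenate along the channel axis. This is an illustrative
    # stand-in for a divide-and-concatenate style operation.
    feats = [upsample2d(avg_pool2d(x, s), s) if s > 1 else x for s in scales]
    return np.concatenate(feats, axis=0)

x = np.random.rand(8, 16, 16)   # toy feature map: 8 channels, 16x16
y = multi_scale_concat(x)
print(y.shape)                  # (24, 16, 16): 3 scales x 8 channels
```

The concatenated map carries both fine detail (scale 1) and coarser context (scales 2 and 4), which is the general intuition behind feeding richer multi-scale features into the distillation loss.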
arxiv cv distillation extraction feature fusion knowledge scale