June 17, 2022, 1:13 a.m. | Panpan Zou, Yinglei Teng, Tao Niu

cs.CV updates on arXiv.org

Online knowledge distillation conducts knowledge transfer among all student
models, alleviating the reliance on pre-trained models. However, existing
online methods rely heavily on prediction distributions and neglect further
exploration of representational knowledge. In this paper, we propose a novel
Multi-scale Feature Extraction and Fusion method (MFEF) for online knowledge
distillation, which comprises three key components: Multi-scale Feature
Extraction, Dual-attention, and Feature Fusion, to generate more informative
feature maps for distillation. The multi-scale feature extraction exploiting
divide-and-concatenate …
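
The abstract names the three components but not their internals, so the following is only a minimal PyTorch sketch of how such pieces could fit together, not the authors' implementation: the group-wise divide-and-concatenate split, the CBAM-style channel-then-spatial dual attention, and the 1x1 fusion convolution are all assumptions, as are the module names and channel sizes.

```python
# Illustrative sketch only; all design details below are assumptions
# based on the component names in the MFEF abstract.
import torch
import torch.nn as nn


class MultiScaleExtraction(nn.Module):
    """Divide-and-concatenate: split channels into groups, convolve each
    group with a hierarchical residual so the receptive field grows, then
    concatenate the multi-scale outputs (assumed Res2Net-like design)."""

    def __init__(self, channels: int, groups: int = 4):
        super().__init__()
        assert channels % groups == 0
        self.groups = groups
        width = channels // groups
        self.convs = nn.ModuleList(
            nn.Conv2d(width, width, 3, padding=1) for _ in range(groups - 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        splits = torch.chunk(x, self.groups, dim=1)
        outs, prev = [splits[0]], splits[0]
        for conv, s in zip(self.convs, splits[1:]):
            prev = conv(s + prev)  # feed each group the previous group's output
            outs.append(prev)
        return torch.cat(outs, dim=1)


class DualAttention(nn.Module):
    """Channel attention followed by spatial attention (CBAM-style guess;
    the paper's actual attention design is not given in the abstract)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Channel attention from globally average-pooled features.
        w = torch.sigmoid(self.channel_mlp(x.mean(dim=(2, 3)))).view(b, c, 1, 1)
        x = x * w
        # Spatial attention from per-pixel mean/max channel statistics.
        stats = torch.cat([x.mean(1, keepdim=True), x.amax(1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial_conv(stats))


class FeatureFusion(nn.Module):
    """Fuse per-student feature maps into one map for distillation."""

    def __init__(self, channels: int, num_students: int):
        super().__init__()
        self.fuse = nn.Conv2d(channels * num_students, channels, kernel_size=1)

    def forward(self, feats: list[torch.Tensor]) -> torch.Tensor:
        return self.fuse(torch.cat(feats, dim=1))


if __name__ == "__main__":
    # Toy check: fuse feature maps from two hypothetical student branches.
    feats = [torch.randn(2, 64, 32, 32) for _ in range(2)]
    block = nn.Sequential(MultiScaleExtraction(64), DualAttention(64))
    fused = FeatureFusion(64, num_students=2)([block(f) for f in feats])
    print(fused.shape)  # torch.Size([2, 64, 32, 32])
```

The fused map would then serve as the representational distillation target for each student, alongside the usual prediction-level losses the abstract says existing online methods rely on.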

Tags: arxiv, cv, distillation, extraction, feature, fusion, knowledge, scale
