Web: http://arxiv.org/abs/2201.11097

Jan. 27, 2022, 2:10 a.m. | Qizhen Lan, Qing Tian

cs.CV updates on arXiv.org

In recent years, knowledge distillation (KD) has been widely used as an
effective way to derive efficient models. By imitating a large teacher
model, a lightweight student model can achieve comparable performance with
greater efficiency. However, most existing knowledge distillation methods
focus on classification tasks; only a limited number of studies have applied
knowledge distillation to object detection, especially in time-sensitive
autonomous driving scenarios. We propose the Adaptive Instance Distillation
(AID) method to selectively impart knowledge from the teacher …
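The truncated abstract does not spell out how AID weights individual instances, so the sketch below is background only: the standard temperature-scaled distillation loss of Hinton et al. (2015), followed by a hypothetical per-instance weighting in the spirit of "selectively imparting knowledge." The function names, the `instance_weights` tensor, and the temperature value are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Standard distillation loss (Hinton et al., 2015): KL divergence
    between temperature-softened teacher and student distributions,
    scaled by T^2 so gradient magnitudes stay comparable across T."""
    log_p_s = F.log_softmax(student_logits / temperature, dim=-1)
    p_t = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * temperature ** 2

def weighted_kd_loss(student_logits, teacher_logits, instance_weights,
                     temperature=4.0):
    """Hypothetical instance-weighted variant: compute the KL term per
    instance, then reweight before averaging. How AID actually derives
    its weights is not recoverable from the truncated abstract."""
    log_p_s = F.log_softmax(student_logits / temperature, dim=-1)
    p_t = F.softmax(teacher_logits / temperature, dim=-1)
    per_instance = F.kl_div(log_p_s, p_t, reduction="none").sum(dim=-1)
    return (instance_weights * per_instance).mean() * temperature ** 2

# Example shapes: a batch of 8 detections scored over 20 classes.
s, t = torch.randn(8, 20), torch.randn(8, 20)
w = torch.ones(8)  # uniform weights reduce to plain distillation
loss = weighted_kd_loss(s, t, w)
```

In a detection setting, `student_logits` and `teacher_logits` would be per-anchor (or per-proposal) classification outputs, so an instance-level weight lets harder or more informative detections contribute more to the distillation loss.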

Tags: arxiv, autonomous, autonomous driving, cv, detection, distillation
