Web: http://arxiv.org/abs/2108.09183

Jan. 31, 2022, 2:10 a.m. | Andrey Sheka, Victor Samun

cs.CV updates on arXiv.org

We propose a response-based method of knowledge distillation (KD) for the
head pose estimation problem. A student model trained with the proposed KD
achieves better results than the teacher model, which is atypical for
response-based methods. Our method consists of two stages. In the first stage,
we train a base neural network (NN) that has one regression head and four
regression-via-classification (RvC) heads. We build a convolutional ensemble
over the base NN using offsets of face bounding …
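
The abstract describes a student with one regression head and four RvC heads,
trained by matching the teacher's outputs (response-based KD). Below is a
minimal PyTorch sketch of that setup, not the authors' code: the stand-in
backbone, the 66-bin discretization, the angle range starting at -99 degrees,
and the L1/KL loss forms are illustrative assumptions, as the truncated
abstract does not specify them.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HeadPoseNet(nn.Module):
    # Hypothetical head-pose network: one direct regression head plus four
    # regression-via-classification (RvC) heads over a shared backbone.
    def __init__(self, backbone_dim=512, num_bins=66):
        super().__init__()
        # Stand-in backbone; the paper's architecture is not given here.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, backbone_dim), nn.ReLU(),
        )
        # Regression head: yaw, pitch, roll as continuous values.
        self.reg_head = nn.Linear(backbone_dim, 3)
        # Four RvC heads: each classifies every angle into discrete bins.
        self.rvc_heads = nn.ModuleList(
            nn.Linear(backbone_dim, 3 * num_bins) for _ in range(4)
        )
        self.num_bins = num_bins

    def forward(self, x):
        feat = self.backbone(x)
        reg = self.reg_head(feat)                     # (B, 3)
        rvc = [h(feat).view(-1, 3, self.num_bins)     # (B, 3, num_bins)
               for h in self.rvc_heads]
        return reg, rvc

def rvc_to_angle(logits, angle_min=-99.0, bin_width=3.0):
    # Recover a continuous angle from RvC logits as the expected bin
    # center under the predicted distribution (soft-argmax).
    probs = logits.softmax(dim=-1)
    centers = angle_min + bin_width * (
        torch.arange(logits.shape[-1], dtype=logits.dtype,
                     device=logits.device) + 0.5)
    return (probs * centers).sum(dim=-1)              # (B, 3)

def response_kd_loss(student_out, teacher_out, alpha=1.0, tau=2.0):
    # Response-based KD: the student matches only the teacher's outputs,
    # not intermediate features. Here the regression heads are matched
    # with L1 and the RvC heads with temperature-softened KL divergence;
    # both loss choices are assumptions for illustration.
    s_reg, s_rvc = student_out
    t_reg, t_rvc = teacher_out
    loss = F.l1_loss(s_reg, t_reg)
    for s, t in zip(s_rvc, t_rvc):
        loss = loss + alpha * tau * tau * F.kl_div(
            F.log_softmax(s / tau, dim=-1),
            F.softmax(t / tau, dim=-1),
            reduction="batchmean")
    return loss

In this sketch a teacher (e.g. the convolutional ensemble) and the student
would both be HeadPoseNet-style models; the student is trained on
response_kd_loss(student(x), teacher(x)) with the teacher frozen.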

arxiv cv distillation knowledge
