MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition. (arXiv:2208.05768v1 [cs.CV])
Aug. 12, 2022, 1:11 a.m. | Chuanguang Yang, Zhulin An, Helong Zhou, Linhang Cai, Xiang Zhi, Jiwen Wu, Yongjun Xu, Qian Zhang
cs.CV updates on arXiv.org
Unlike conventional Knowledge Distillation (KD), Self-KD allows a network
to learn knowledge from itself without guidance from extra networks. This
paper proposes to perform Self-KD from image Mixture (MixSKD), which integrates
these two techniques into a unified framework. MixSKD mutually distills feature
maps and probability distributions between a random pair of original images
and their mixup images. It thereby guides the network to learn cross-image
knowledge by modelling supervisory signals from mixup images.
Moreover, …
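The core idea above can be sketched in a few lines: blend two images with a Beta-distributed coefficient (standard Mixup), then distill the network's prediction on the mixed image toward the same coefficient's mixture of its predictions on the two originals. This is a minimal NumPy sketch under that assumption; the function names and the exact loss form are illustrative, not taken from the paper's code.

```python
import numpy as np

def mixup(x_a, x_b, alpha=1.0, rng=None):
    """Blend two inputs with a Beta(alpha, alpha)-distributed coefficient."""
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    return lam * x_a + (1.0 - lam) * x_b, lam

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete probability distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def cross_image_distillation_loss(probs_a, probs_b, probs_mix, lam):
    """Distill the prediction on the mixup image toward the lam-weighted
    mixture of the predictions on the two original images."""
    target = lam * probs_a + (1.0 - lam) * probs_b
    return kl_divergence(target, probs_mix)
```

In the full method this target would be applied to both probability distributions and intermediate feature maps; the sketch only shows the probability side to make the cross-image supervisory signal concrete.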