Aug. 29, 2022, 1:14 a.m. | Junjie Hu, Chenyou Fan, Mete Ozay, Hualie Jiang, Tin Lun Lam

cs.CV updates on arXiv.org

We study data-free knowledge distillation (KD) for monocular depth estimation
(MDE), which learns a lightweight network for real-world depth perception by
compressing a trained expert model under the teacher-student framework,
without access to training data from the target domain. Owing to the essential
difference between dense regression and image recognition, previous data-free
KD methods are not applicable to MDE. To strengthen applicability in the real
world, we seek to apply KD using out-of-distribution simulated images. …
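The teacher-student setup described above can be sketched as follows. This is a minimal illustration only: the L1 objective, the array shapes, and the use of random arrays as stand-ins for simulated images are assumptions, not the paper's exact formulation.

```python
import numpy as np

def distillation_loss(student_depth, teacher_depth):
    """Mean absolute error between student and teacher dense depth maps.
    L1 is a common choice for regression distillation; the paper's
    actual objective may differ."""
    return np.abs(student_depth - teacher_depth).mean()

# Hypothetical example: teacher predictions on simulated (out-of-
# distribution) images stand in for unavailable real-domain data.
rng = np.random.default_rng(0)
teacher_pred = rng.random((1, 1, 64, 64))            # dense depth map
student_pred = teacher_pred + 0.01 * rng.random((1, 1, 64, 64))

loss = distillation_loss(student_pred, teacher_pred)
```

The student is trained to regress the teacher's dense outputs rather than class logits, which is why recognition-oriented data-free KD techniques do not transfer directly to this setting.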

