Disentangle and Remerge: Interventional Knowledge Distillation for Few-Shot Object Detection from A Conditional Causal Perspective. (arXiv:2208.12681v1 [cs.CV])
Aug. 29, 2022, 1:14 a.m. | Jiangmeng Li, Yanan Zhang, Wenwen Qiang, Lingyu Si, Chengbo Jiao, Xiaohui Hu, Changwen Zheng, Fuchun Sun
cs.CV updates on arXiv.org arxiv.org
Few-shot learning models learn representations from limited human
annotations, and this paradigm has proven practical across various tasks,
e.g., image classification, object detection, etc. However, few-shot object
detection methods suffer from an intrinsic defect: the limited training data
prevents the model from sufficiently exploring semantic information. To
tackle this, we introduce knowledge distillation to the few-shot object
detection learning paradigm. We further run a motivating experiment, which
demonstrates that in the process of knowledge distillation the empirical
error …
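For context, knowledge distillation in its standard form trains a student model to match a teacher's temperature-softened output distribution; the paper builds its interventional variant on top of this idea. The sketch below shows only the vanilla distillation loss in NumPy, not the authors' conditional-causal method; the temperature value and logit shapes are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 as is conventional in distillation.
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's soft predictions
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

# A student that exactly matches the teacher incurs zero loss;
# any mismatch gives a positive penalty.
t = np.array([[2.0, 0.5, -1.0]])
print(kd_loss(t, t))                      # → 0.0
print(kd_loss(np.zeros((1, 3)), t) > 0)   # → True
```

In a detection setting this loss would typically be applied per region proposal over the classification logits, alongside the usual detection losses.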