JEDI: Joint Expert Distillation in a Semi-Supervised Multi-Dataset Student-Teacher Scenario for Video Action Recognition. (arXiv:2308.04934v1 [cs.CV])
cs.LG updates on arXiv.org arxiv.org
We propose JEDI, a multi-dataset semi-supervised learning method that
efficiently combines knowledge from multiple experts, each trained on a
different dataset, to train and improve individual per-dataset student
models. Our approach addresses two important problems in current machine
learning research: generalization across datasets and the limitations of
supervised training caused by the scarcity of labeled data. We start with an
arbitrary number of experts, each pretrained on its own specific dataset,
which form the initial set of student …
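The abstract describes distilling knowledge from several pretrained experts into per-dataset students. A minimal sketch of one plausible aggregation step, assuming the experts' softened class predictions on an unlabeled clip are averaged into a joint pseudo-label (the paper's exact combination scheme is not given in this excerpt, so this averaging is an illustrative assumption):

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def joint_pseudo_label(expert_logits):
    """Average the experts' softmax outputs into one soft pseudo-label
    that a student model could be trained against.
    (Hypothetical aggregation; JEDI's actual scheme may differ.)"""
    probs = [softmax(logits) for logits in expert_logits]
    n = len(probs)
    num_classes = len(probs[0])
    return [sum(p[c] for p in probs) / n for c in range(num_classes)]

# Two hypothetical experts scoring the same unlabeled video clip
# over three action classes:
expert_a = [2.0, 0.5, 0.1]
expert_b = [1.5, 1.0, 0.2]
pseudo = joint_pseudo_label([expert_a, expert_b])
```

The resulting `pseudo` vector is a valid probability distribution, so it can serve directly as a soft target in a cross-entropy or KL-divergence distillation loss for the student.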