Jan. 27, 2022, 2:10 a.m. | Hanqiu Deng, Xingyu Li

cs.CV updates on arXiv.org

Knowledge distillation (KD) achieves promising results on the challenging
problem of unsupervised anomaly detection (AD). The representation discrepancy
of anomalies in the teacher-student (T-S) model provides essential evidence for
AD. However, using similar or identical architectures to build the teacher and
student models in previous studies hinders the diversity of anomalous
representations. To tackle this problem, we propose a novel T-S model
consisting of a teacher encoder and a student decoder, and accordingly
introduce a simple yet effective "reverse distillation" paradigm. …
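The abstract describes the paradigm only at a high level, so below is a minimal PyTorch sketch of one plausible instantiation: a frozen pretrained teacher encoder, a trainable student decoder that regenerates the teacher's multi-scale features from the deepest embedding, and a cosine-distance map as the anomaly evidence. The ResNet-18 backbone, the single-conv bottleneck (standing in for the paper's one-class embedding module), the transposed-conv decoder, and the hyperparameters are all illustrative assumptions, not the authors' exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18

class TeacherEncoder(nn.Module):
    """Frozen, ImageNet-pretrained encoder: image -> multi-scale features."""
    def __init__(self):
        super().__init__()
        net = resnet18(weights="IMAGENET1K_V1")  # downloads pretrained weights
        self.stem = nn.Sequential(net.conv1, net.bn1, net.relu, net.maxpool)
        self.layer1, self.layer2, self.layer3 = net.layer1, net.layer2, net.layer3
        for p in self.parameters():
            p.requires_grad = False

    def forward(self, x):
        x = self.stem(x)
        f1 = self.layer1(x)   # (B,  64, H/4,  W/4)
        f2 = self.layer2(f1)  # (B, 128, H/8,  W/8)
        f3 = self.layer3(f2)  # (B, 256, H/16, W/16)
        return [f1, f2, f3]

class StudentDecoder(nn.Module):
    """Trainable decoder: deepest teacher embedding -> reconstructed features."""
    def __init__(self):
        super().__init__()
        # Stand-in for the paper's one-class bottleneck embedding (assumption).
        self.bottleneck = nn.Sequential(
            nn.Conv2d(256, 256, 3, padding=1), nn.BatchNorm2d(256), nn.ReLU())
        self.up2 = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 2, stride=2), nn.BatchNorm2d(128), nn.ReLU())
        self.up1 = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 2, stride=2), nn.BatchNorm2d(64), nn.ReLU())

    def forward(self, f3):
        g3 = self.bottleneck(f3)  # same shape as teacher f3
        g2 = self.up2(g3)         # same shape as teacher f2
        g1 = self.up1(g2)         # same shape as teacher f1
        return [g1, g2, g3]

def anomaly_map(t_feats, s_feats, size):
    """Per-pixel discrepancy: 1 - cosine similarity at each scale, summed."""
    maps = []
    for f, g in zip(t_feats, s_feats):
        d = 1.0 - F.cosine_similarity(f, g, dim=1)  # (B, h, w)
        maps.append(F.interpolate(d.unsqueeze(1), size=size,
                                  mode="bilinear", align_corners=False))
    return torch.stack(maps).sum(0)                 # (B, 1, H, W)

# One training step on normal-only data (random tensors as stand-ins).
teacher, student = TeacherEncoder().eval(), StudentDecoder()
opt = torch.optim.Adam(student.parameters(), lr=5e-3)
x = torch.randn(4, 3, 256, 256)
t_feats = teacher(x)
s_feats = student(t_feats[-1])  # decode from the deepest embedding
loss = sum((1 - F.cosine_similarity(f, g, dim=1)).mean()
           for f, g in zip(t_feats, s_feats))
opt.zero_grad(); loss.backward(); opt.step()
print(anomaly_map(t_feats, s_feats, x.shape[-2:]).shape)  # (4, 1, 256, 256)
```

Training on normal images alone drives the student to match the teacher's features everywhere it has seen; at test time, anomalous regions are exactly where the reconstruction fails, so the discrepancy map serves as the per-pixel anomaly score.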

anomaly detection, arxiv, cv, detection, distillation, embedding
