Anomaly Detection via Reverse Distillation from One-Class Embedding. (arXiv:2201.10703v1 [cs.CV])
Jan. 27, 2022, 2:10 a.m. | Hanqiu Deng, Xingyu Li
cs.CV updates on arXiv.org arxiv.org
Knowledge distillation (KD) achieves promising results on the challenging
problem of unsupervised anomaly detection (AD). The representation discrepancy
of anomalies in the teacher-student (T-S) model provides essential evidence for
AD. However, previous studies build the teacher and student with similar or
identical architectures, which hinders the diversity of anomalous
representations. To tackle this problem, we propose a novel T-S model
consisting of a teacher encoder and a student decoder, and accordingly
introduce a simple yet effective "reverse distillation" paradigm. …
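The core scoring idea the abstract describes — anomalies show up as a discrepancy between teacher and student representations — can be sketched as a per-location cosine-distance map. This is a minimal numpy illustration, not the authors' implementation; the function name and the single-scale setup are assumptions (the paper works with multi-scale deep features).

```python
import numpy as np

def anomaly_map(teacher_feats, student_feats, eps=1e-8):
    """Per-location anomaly score from a T-S feature pair.

    teacher_feats, student_feats: arrays of shape (C, H, W).
    Returns an (H, W) map where each entry is 1 minus the cosine
    similarity of the C-dim teacher and student vectors at that
    location -- large values indicate a representation discrepancy,
    i.e. a likely anomaly.
    """
    c = teacher_feats.shape[0]
    t = teacher_feats.reshape(c, -1)   # (C, H*W)
    s = student_feats.reshape(c, -1)
    num = (t * s).sum(axis=0)
    den = np.linalg.norm(t, axis=0) * np.linalg.norm(s, axis=0) + eps
    return (1.0 - num / den).reshape(teacher_feats.shape[1:])
```

On normal regions a well-trained student matches the teacher, so the score is near 0; where the student fails to reproduce the teacher's embedding, the score grows (up to 2 for exactly opposite feature vectors).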