Advancing Pre-trained Teacher: Towards Robust Feature Discrepancy for Anomaly Detection
May 6, 2024, 4:45 a.m. | Canhui Tang, Sanping Zhou, Yizhe Li, Yonghao Dong, Le Wang
cs.CV updates on arXiv.org arxiv.org
Abstract: With the wide application of knowledge distillation between an ImageNet pre-trained teacher model and a learnable student model, industrial anomaly detection has seen significant progress in the past few years. The success of knowledge distillation mainly relies on maintaining a feature discrepancy between the teacher and student models, under the assumptions that: (1) the teacher model can jointly represent two different distributions for the normal and abnormal patterns, while (2) the …
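The teacher-student paradigm the abstract describes scores anomalies by the feature discrepancy between a frozen pre-trained teacher and a student trained only on normal data. A minimal sketch of that idea, with a hypothetical nonlinear "teacher" (a fixed random tanh projection standing in for a pre-trained network) and a limited-capacity linear "student" fit by least squares — all names and shapes here are illustrative assumptions, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen "teacher": a fixed random nonlinear feature extractor
# (stands in for an ImageNet pre-trained backbone).
W_teacher = rng.standard_normal((8, 4))

def teacher(x):
    return np.tanh(x @ W_teacher)

# "Student": a lower-capacity linear map, fit to mimic the teacher's
# features on NORMAL data only (here, standard-normal samples).
X_normal = rng.standard_normal((200, 8))
W_student, *_ = np.linalg.lstsq(X_normal, teacher(X_normal), rcond=None)

def student(x):
    return x @ W_student

def anomaly_score(x):
    # Feature discrepancy: small where the student has learned to match the
    # teacher (normal data), larger for out-of-distribution inputs.
    return np.linalg.norm(teacher(x) - student(x), axis=-1)
```

Because the student only matches the teacher on the normal distribution, inputs far from it (e.g. much larger magnitude, where the teacher's tanh saturates but the linear student extrapolates) yield a higher discrepancy and hence a higher anomaly score.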