Inducing Neural Collapse in Imbalanced Learning: Do We Really Need a Learnable Classifier at the End of Deep Neural Network? (arXiv:2203.09081v3 [cs.LG] UPDATED)
Oct. 13, 2022, 1:17 a.m. | Yibo Yang, Shixiang Chen, Xiangtai Li, Liang Xie, Zhouchen Lin, Dacheng Tao
cs.CV updates on arXiv.org
Modern deep neural networks for classification usually jointly learn a
backbone for representation and a linear classifier that outputs the logit of
each class. A recent study has shown a phenomenon called neural collapse, in
which the within-class means of features and the classifier vectors converge to the
vertices of a simplex equiangular tight frame (ETF) at the terminal phase of
training on a balanced dataset. Since the ETF geometric structure maximally
separates the pair-wise angles of all classes in the …
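The simplex ETF mentioned in the abstract has a standard closed-form construction. The sketch below (NumPy; the function name, dimensions, and seed are illustrative, not taken from the paper) builds such a set of unit-norm class vectors whose pairwise cosine similarity is -1/(K-1), the maximally separated equiangular configuration.

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Construct a (feat_dim x num_classes) simplex equiangular tight frame.

    Standard construction: M = sqrt(K/(K-1)) * U (I_K - (1/K) 1 1^T),
    where U has orthonormal columns. The columns of M have unit norm and
    pairwise cosine similarity -1/(K-1).
    """
    assert feat_dim >= num_classes - 1, "need feat_dim >= K-1 for a simplex ETF"
    rng = np.random.default_rng(seed)
    # Random matrix with orthonormal columns via (reduced) QR decomposition.
    U, _ = np.linalg.qr(rng.standard_normal((feat_dim, num_classes)))
    K = num_classes
    M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)
    return M

if __name__ == "__main__":
    M = simplex_etf(num_classes=10, feat_dim=512)
    G = M.T @ M
    print(np.allclose(np.diag(G), 1.0))       # unit-norm class vectors
    print(np.allclose(G[0, 1], -1.0 / 9.0))   # pairwise cosine -1/(K-1)
```

Under neural collapse, a learned linear classifier converges toward this geometry on balanced data, which is what motivates the paper's question of whether the classifier needs to be learnable at all.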