Inducing Neural Collapse to a Fixed Hierarchy-Aware Frame for Reducing Mistake Severity. (arXiv:2303.05689v2 [cs.CV] UPDATED)
cs.CV updates on arXiv.org
There is a recently discovered and intriguing phenomenon called Neural
Collapse: at the terminal phase of training a deep neural network for
classification, the within-class penultimate feature means and the associated
classifier vectors of all flat classes collapse to the vertices of a simplex
Equiangular Tight Frame (ETF). Recent work has tried to exploit this phenomenon
by fixing the related classifier weights to a pre-computed ETF to induce neural
collapse and maximize the separation of the learned features when training …
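The simplex ETF geometry mentioned above can be made concrete with a short sketch. For K classes in a d-dimensional feature space (d >= K), a standard construction is E = sqrt(K/(K-1)) * U (I_K - (1/K) 1 1^T), where U has orthonormal columns; the names below (`simplex_etf`, `num_classes`, `feat_dim`) are illustrative, not from the paper, and this is a minimal sketch of the general construction, not the authors' implementation.

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Return a (feat_dim, num_classes) simplex Equiangular Tight Frame.

    Each column is a unit-norm classifier vector; any two distinct
    columns have cosine similarity exactly -1/(num_classes - 1),
    the maximally separated configuration on the sphere.
    """
    assert feat_dim >= num_classes, "need feat_dim >= num_classes"
    rng = np.random.default_rng(seed)
    # Orthonormal columns U via QR of a random Gaussian matrix.
    U, _ = np.linalg.qr(rng.standard_normal((feat_dim, num_classes)))
    K = num_classes
    # Center the identity to place vertices at a regular simplex,
    # then rescale so every column has unit norm.
    P = np.eye(K) - np.ones((K, K)) / K
    return np.sqrt(K / (K - 1)) * U @ P

E = simplex_etf(num_classes=10, feat_dim=64)
cos = E.T @ E  # Gram matrix: 1 on the diagonal, -1/(K-1) off-diagonal
```

Fixing the classifier weights to such a pre-computed frame (and training only the backbone) is the "induce neural collapse" trick the abstract refers to; the hierarchy-aware variant in this paper replaces the plain ETF with a frame whose geometry reflects the class hierarchy.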