Aug. 10, 2023, 4:50 a.m. | Tong Liang, Jim Davis

cs.CV updates on arXiv.org

There is a recently discovered and intriguing phenomenon called Neural
Collapse: at the terminal phase of training a deep neural network for
classification, the within-class penultimate feature means and the associated
classifier vectors of all flat classes collapse to the vertices of a simplex
Equiangular Tight Frame (ETF). Recent work has tried to exploit this phenomenon
by fixing the related classifier weights to a pre-computed ETF to induce neural
collapse and maximize the separation of the learned features when training …
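As a concrete illustration of the structure the abstract describes, below is a minimal sketch (not from the paper) that constructs a K-class simplex Equiangular Tight Frame with NumPy and then fixes a linear classifier's weights to that pre-computed ETF in PyTorch so only the backbone features are trained. The dimension names (K, d) and the helper `simplex_etf` are illustrative assumptions.

```python
# Sketch: build a simplex ETF and use it as a fixed (non-trainable) classifier.
import numpy as np
import torch
import torch.nn as nn

def simplex_etf(K: int, d: int) -> np.ndarray:
    """Return a (K, d) matrix whose rows are simplex-ETF vertices:
    unit-norm vectors with pairwise cosine similarity -1/(K-1)."""
    assert d >= K - 1, "need at least K-1 feature dimensions"
    # Random d x K matrix with orthonormal columns.
    U, _ = np.linalg.qr(np.random.randn(d, K))
    # Center and rescale: M = sqrt(K/(K-1)) * (I_K - (1/K) 1 1^T) U^T.
    M = np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K) @ U.T
    return M  # shape (K, d)

K, d = 10, 512                      # e.g. 10 classes, 512-dim penultimate features
M = simplex_etf(K, d)
cos = M @ M.T                       # diagonal ~ 1, off-diagonal ~ -1/(K-1)

# Fix the classifier weights to the pre-computed ETF (no gradient updates),
# so training pulls the learned features toward these maximally separated directions.
classifier = nn.Linear(d, K, bias=False)
with torch.no_grad():
    classifier.weight.copy_(torch.tensor(M, dtype=torch.float32))
classifier.weight.requires_grad_(False)
```

Verifying that the off-diagonal entries of `cos` equal -1/(K-1) confirms the vertices are equiangular and maximally separated, which is the property the fixed-ETF training schemes rely on.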

