Are All Losses Created Equal: A Neural Collapse Perspective. (arXiv:2210.02192v2 [cs.LG] UPDATED)
Oct. 11, 2022, 1:15 a.m. | Jinxin Zhou, Chong You, Xiao Li, Kangning Liu, Sheng Liu, Qing Qu, Zhihui Zhu
cs.LG updates on arXiv.org arxiv.org
While cross entropy (CE) is the loss most commonly used to train deep neural
networks for classification tasks, many alternative losses have been developed
to obtain better empirical performance. Which of them is best to use remains
unclear, because multiple factors appear to affect the answer, such as the
properties of the dataset and the choice of network architecture. This paper
studies the choice of loss function by examining the last-layer features of …
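The "neural collapse" perspective mentioned in the title refers to a phenomenon in which last-layer features of a trained classifier concentrate tightly around their class means. As a rough illustration (not code from the paper), the sketch below computes a simple collapse ratio: total within-class feature variability divided by between-class variability, where a value near zero indicates collapsed features. The function name and the synthetic features are illustrative assumptions.

```python
import numpy as np

def collapse_ratio(features, labels):
    """Ratio of within-class to between-class variability of features.

    A ratio near 0 indicates within-class variability collapse, the
    property commonly associated with neural collapse.
    """
    global_mean = features.mean(axis=0)
    within = 0.0
    between = 0.0
    for c in np.unique(labels):
        fc = features[labels == c]          # features of class c
        mu_c = fc.mean(axis=0)              # class mean
        within += ((fc - mu_c) ** 2).sum()  # scatter around class mean
        between += len(fc) * ((mu_c - global_mean) ** 2).sum()
    return within / between

# Perfectly collapsed toy features: every sample sits on its class mean.
feats = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
labels = np.array([0, 0, 1, 1])
print(collapse_ratio(feats, labels))  # → 0.0
```

With noisy features the ratio rises above zero, so tracking it during training is one way to observe features drifting toward the collapsed configuration.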