Web: http://arxiv.org/abs/2202.08978

June 17, 2022, 1:11 a.m. | Leslie N. Smith

cs.LG updates on arXiv.org

The cross-entropy softmax loss is the primary loss function used to train
deep neural networks. On the other hand, the focal loss function has been
demonstrated to provide improved performance when there is an imbalance in the
number of training samples in each class, such as in long-tailed datasets. In
this paper, we introduce a novel cyclical focal loss and demonstrate that it is
a more universal loss function than cross-entropy softmax loss or focal loss.
We describe the intuition …
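
For context, below is a minimal PyTorch sketch of the standard focal loss that the abstract builds on (Lin et al.'s formulation, FL(p_t) = -(1 - p_t)^gamma * log(p_t)); the function name and the gamma default are illustrative choices, not taken from the paper. Setting gamma = 0 recovers the ordinary cross-entropy softmax loss, which is why the focal loss is often described as a generalization of it. The cyclical focal loss introduced in the paper is not reproduced here, since the excerpt does not give its formulation.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0):
    """Standard focal loss: FL(p_t) = -(1 - p_t)^gamma * log(p_t).

    With gamma = 0 this reduces to the usual cross-entropy softmax loss;
    larger gamma down-weights well-classified (high p_t) examples, which
    helps when the per-class sample counts are heavily imbalanced.
    """
    log_probs = F.log_softmax(logits, dim=-1)                      # log-probability of every class
    log_pt = log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)  # log p_t of the true class
    pt = log_pt.exp()                                              # p_t of the true class
    return (-(1.0 - pt) ** gamma * log_pt).mean()

# Illustrative usage on random data (shapes only; not from the paper):
logits = torch.randn(8, 10)                 # batch of 8, 10 classes
targets = torch.randint(0, 10, (8,))
print(focal_loss(logits, targets, gamma=2.0))
```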

Tags: arxiv, cv, loss
