March 15, 2024, 4:42 a.m. | Brandon Morgan, Dean Hougen

cs.LG updates on arXiv.org

arXiv:2403.08793v1 Announce Type: cross
Abstract: For classification, neural networks typically learn by minimizing cross-entropy, but are evaluated and compared using accuracy. This disparity suggests neural loss function search (NLFS), the search for a drop-in replacement loss function for cross-entropy in neural networks. We apply NLFS to image classification with convolutional neural networks. We propose a new search space for NLFS that encourages more diverse loss functions to be explored, and a surrogate function that accurately transfers to large-scale convolutional neural networks. …
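To make the "drop-in replacement" idea concrete, here is a minimal sketch (not the paper's search space or surrogate function): any differentiable function with the same (logits, targets) -> scalar signature as cross-entropy can be swapped in during training and searched over. The candidate loss below is a hypothetical example composed from simple primitives, purely to illustrate the interface.

```python
import torch
import torch.nn.functional as F

def cross_entropy_loss(logits, targets):
    # Standard baseline loss that NLFS aims to replace.
    return F.cross_entropy(logits, targets)

def candidate_loss(logits, targets):
    # Hypothetical searched candidate (illustrative only): a squared-error-style
    # loss on softmax probabilities. It shares cross-entropy's signature, so it
    # can be dropped into an existing training loop unchanged.
    probs = torch.softmax(logits, dim=-1)
    one_hot = F.one_hot(targets, num_classes=logits.shape[-1]).float()
    return ((probs - one_hot) ** 2).sum(dim=-1).mean()

if __name__ == "__main__":
    torch.manual_seed(0)
    logits = torch.randn(4, 10, requires_grad=True)   # batch of 4, 10 classes
    targets = torch.randint(0, 10, (4,))
    for loss_fn in (cross_entropy_loss, candidate_loss):
        loss = loss_fn(logits, targets)
        loss.backward()                                # both are differentiable
        print(loss_fn.__name__, float(loss))
        logits.grad = None
```

In an NLFS setting, many such candidate functions would be generated and evaluated; the searched loss is only kept if it trains classifiers to higher accuracy than the cross-entropy baseline.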

