April 22, 2024, 4:42 a.m. | Shakhnaz Akhmedova (Center for Artificial Intelligence in Public Health Research, Robert Koch Institute, Berlin, Germany), Nils Körber (Center for Artificial Intelligence in Public Health Research, Robert Koch Institute, Berlin, Germany)

cs.LG updates on arXiv.org

arXiv:2404.12948v1 Announce Type: cross
Abstract: Neural networks are trained by minimizing a loss function that defines the discrepancy between the predicted model output and the target value. The selection of the loss function is crucial to achieve task-specific behaviour and highly influences the capability of the model. A variety of loss functions have been proposed for a wide range of tasks affecting training and model performance. For classification tasks, the cross entropy is the de-facto standard and usually the first …
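The abstract's framing — a loss function quantifying the discrepancy between predicted output and target, with cross entropy as the de facto default for classification — maps directly onto a standard training step. The sketch below illustrates this in PyTorch; the framework choice, layer sizes, and dummy data are assumptions for illustration and are not taken from the paper.

import torch
import torch.nn as nn

# Toy classifier: 4 input features -> 3 classes (shapes are illustrative only)
model = nn.Linear(4, 3)
criterion = nn.CrossEntropyLoss()  # the usual default loss for classification
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Dummy batch: 8 samples with integer class targets
x = torch.randn(8, 4)
y = torch.randint(0, 3, (8,))

# One training step: the loss measures the discrepancy between the
# predicted logits and the target classes, and its gradient drives the update.
logits = model(x)
loss = criterion(logits, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(loss.item())

Swapping criterion for a different loss (e.g. a focal or label-smoothing variant) changes the training signal without touching the rest of the loop, which is the design point the abstract is getting at: the choice of loss function shapes task-specific behaviour and model capability.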


AI Engineer Intern, Agents @ Occam AI | US

AI Research Scientist @ Vara | Berlin, Germany and Remote

Data Architect @ University of Texas at Austin | Austin, TX

Data ETL Engineer @ University of Texas at Austin | Austin, TX

Lead GNSS Data Scientist @ Lurra Systems | Melbourne

Lead Data Modeler @ Sherwin-Williams | Cleveland, OH, United States