April 17, 2023, 8:02 p.m. | Anqi Mao, Mehryar Mohri, Yutao Zhong

cs.LG updates on arXiv.org

Cross-entropy is a widely used loss function in applications. It coincides
with the logistic loss applied to the outputs of a neural network when the
softmax is used. But what guarantees can we rely on when using cross-entropy
as a surrogate loss? We present a theoretical analysis of a broad family of
losses, comp-sum losses, that includes cross-entropy (or logistic loss),
generalized cross-entropy, the mean absolute error, and other
cross-entropy-like loss functions. We give the first $H$-consistency bounds for
these …
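To make the connection between cross-entropy and the logistic loss on softmax outputs concrete, here is a minimal NumPy sketch (not part of the paper's text), assuming a single example with a hypothetical score vector `scores` and true label index `y`:

```python
import numpy as np

def cross_entropy(scores, y):
    """Multiclass cross-entropy: negative log-softmax of the true-label score."""
    log_probs = scores - np.log(np.sum(np.exp(scores)))  # log-softmax
    return -log_probs[y]

def logistic_form(scores, y):
    """The same loss rewritten in logistic/comp-sum form:
    log(1 + sum_{y' != y} exp(scores[y'] - scores[y]))."""
    diffs = np.delete(scores, y) - scores[y]
    return np.log1p(np.sum(np.exp(diffs)))

scores = np.array([2.0, -1.0, 0.5])  # hypothetical network outputs
y = 0
assert np.isclose(cross_entropy(scores, y), logistic_form(scores, y))
```

The second form makes the composed-sum structure visible: a function (here, log) applied to a sum of transformed score differences, which is the shape of the comp-sum family the abstract refers to.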

