Cross-Entropy Loss Functions: Theoretical Analysis and Applications. (arXiv:2304.07288v1 [cs.LG])
cs.LG updates on arXiv.org arxiv.org
Cross-entropy is a widely used loss function in applications. It coincides
with the logistic loss applied to the outputs of a neural network, when the
softmax is used. But what guarantees can we rely on when using cross-entropy
as a surrogate loss? We present a theoretical analysis of a broad family of
losses, comp-sum losses, that includes cross-entropy (or logistic loss),
generalized cross-entropy, the mean absolute error, and other
cross-entropy-like loss functions. We give the first $H$-consistency bounds for
these …
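The identity stated at the top of the abstract, that cross-entropy over softmax outputs coincides with the logistic loss, can be checked numerically. The sketch below is an illustration of that standard identity, not code from the paper: in the two-class case with logits $(z, 0)$, the cross-entropy of the softmax at the first class equals $\log(1 + e^{-z})$.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a logit vector."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def cross_entropy(z, y):
    """Cross-entropy loss of logits z for true class index y."""
    return -np.log(softmax(z)[y])

# Binary case: with logits (z, 0) and true class 0, cross-entropy
# over the softmax reduces to the logistic loss log(1 + exp(-z)).
z = 1.3
ce = cross_entropy(np.array([z, 0.0]), 0)
logistic = np.log1p(np.exp(-z))
print(np.isclose(ce, logistic))  # True
```

The same computation with the roles of the classes swapped gives the logistic loss at margin $-z$, which is the usual binary classification picture.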