On the Rényi Cross-Entropy (arXiv:2206.14329v1 [cs.IT])
June 30, 2022, 1:10 a.m. | Ferenc Cole Thierrin, Fady Alajaji, Tamás Linder
cs.LG updates on arXiv.org
The Rényi cross-entropy measure between two distributions, a
generalization of the Shannon cross-entropy, was recently used as a loss
function for the improved design of deep learning generative adversarial
networks. In this work, we examine the properties of this measure and derive
closed-form expressions for it when one of the distributions is fixed and when
both distributions belong to the exponential family. We also analytically
determine a formula for the cross-entropy rate for stationary Gaussian
processes and for finite-alphabet Markov …
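
To make the abstract's central quantity concrete, here is a minimal Python sketch of the Rényi cross-entropy for discrete distributions, assuming the common definition H_alpha(p; q) = (1 / (1 - alpha)) * log(sum_x p(x) * q(x)^(alpha - 1)); this definition is an assumption for illustration, not necessarily the exact variant the paper adopts, and the function name is hypothetical. As alpha -> 1, L'Hôpital's rule shows it recovers the Shannon cross-entropy -sum_x p(x) * log(q(x)).

import numpy as np

def renyi_cross_entropy(p, q, alpha):
    # Rényi cross-entropy of order alpha between discrete distributions p and q,
    # under the assumed definition (1 / (1 - alpha)) * log(sum_x p(x) * q(x)**(alpha - 1)).
    # As alpha -> 1 this reduces to the Shannon cross-entropy -sum_x p(x) * log(q(x)).
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(q))  # Shannon limit
    return np.log(np.sum(p * q ** (alpha - 1))) / (1.0 - alpha)

# Sanity check: values approach the Shannon cross-entropy as alpha -> 1.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
for alpha in (0.5, 0.9, 0.99, 1.0, 2.0):
    print(alpha, renyi_cross_entropy(p, q, alpha))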
Jobs in AI, ML, Big Data
Founding AI Engineer, Agents
@ Occam AI | New York
AI Engineer Intern, Agents
@ Occam AI | US
AI Research Scientist
@ Vara | Berlin, Germany and Remote
Data Architect
@ University of Texas at Austin | Austin, TX
Data ETL Engineer
@ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne