Feb. 11, 2022, 3:32 p.m. | J. Rafid S., PhD

Towards Data Science (Medium) | towardsdatascience.com

Figure 1: Two probability distributions sampled from a normal distribution (Image by author)

It is common practice to use cross-entropy in the loss function when constructing a Generative Adversarial Network [1], even though the original formulation suggests the use of KL-divergence. This often confuses newcomers to the field. In this article we go through the concepts of entropy, cross-entropy and Kullback-Leibler divergence [2] and see why, in this setting, minimizing one can be treated as equivalent to minimizing the other.
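The equivalence the article builds towards follows from the identity KL(p ‖ q) = H(p, q) − H(p): when the target distribution p is fixed (as the data distribution is during training), minimizing cross-entropy H(p, q) minimizes the KL-divergence. A minimal NumPy sketch of this identity for discrete distributions; the example values and helper names are illustrative, not taken from the article:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in nats."""
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """Cross-entropy H(p, q) in nats."""
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q) in nats."""
    return np.sum(p * np.log(p / q))

# Two example discrete distributions over the same support (illustrative values).
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])

h_p  = entropy(p)
h_pq = cross_entropy(p, q)
kl   = kl_divergence(p, q)

print(f"H(p)          = {h_p:.4f}")
print(f"H(p, q)       = {h_pq:.4f}")
print(f"KL(p || q)    = {kl:.4f}")
print(f"H(p,q) - H(p) = {h_pq - h_p:.4f}")  # equals KL(p || q)
```

Since H(p) does not depend on q, any q that minimizes H(p, q) also minimizes KL(p ‖ q), which is why the cross-entropy loss stands in for the divergence in practice.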

The concept of entropy …

Tags: cross-entropy, deep learning, divergence, entropy, kl-divergence, machine learning
