Kullback-Leibler (KL) divergence is a measure of how different two probability distributions are. It comes from information theory, which studies how to quantify and communicate information efficiently. Information theory defines the entropy of a probability distribution as the average amount of information (for example, in bits) needed to describe an outcome drawn from that distribution: the more uncertain the outcome, the higher the entropy. The cross-entropy of two distributions P and Q is the average amount of information needed to describe an outcome drawn from P when using a code optimized for Q. Cross-entropy is smallest when Q equals P, so the lower the cross-entropy, the more similar the two distributions are.

Kullback-Leibler divergence is defined as the cross-entropy minus the entropy of the first distribution, D_KL(P || Q) = H(P, Q) - H(P). It can be interpreted as the amount of information lost when the second distribution is used to approximate the first: the higher the Kullback-Leibler divergence, the more different the two distributions are.
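
To make the relationship concrete, here is a minimal numerical sketch in Python with NumPy. The distributions p and q are made-up example values over three outcomes, assumed strictly positive so the logarithms are defined; it simply illustrates that D_KL(P || Q) = H(P, Q) - H(P) agrees with the direct form sum_x P(x) * log(P(x) / Q(x)), and is not a general-purpose implementation.

import numpy as np

# Made-up discrete distributions over the same three outcomes (all probabilities > 0).
p = np.array([0.5, 0.3, 0.2])   # distribution being approximated, P
q = np.array([0.4, 0.4, 0.2])   # approximating distribution, Q

entropy_p = -np.sum(p * np.log2(p))       # H(P): average bits to describe an outcome of P
cross_entropy = -np.sum(p * np.log2(q))   # H(P, Q): average bits when encoding P with a code built for Q
kl_from_difference = cross_entropy - entropy_p   # D_KL(P || Q) = H(P, Q) - H(P)

# Equivalent direct form of the same quantity.
kl_direct = np.sum(p * np.log2(p / q))

print(f"H(P)          = {entropy_p:.4f} bits")
print(f"H(P, Q)       = {cross_entropy:.4f} bits")
print(f"D_KL(P || Q)  = {kl_from_difference:.4f} bits (direct form: {kl_direct:.4f})")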

https://labstats.blogspot.com/2023/04/kullback-leibler-divergence.html
