Kullback–Leibler divergence

Lingling Yang
1 min read · Mar 31, 2020


  • An asymmetric measure of how one probability distribution differs from a second, reference probability distribution.
  • Applications: relative entropy in information theory, characterizing randomness in continuous time series, and measuring information gain when comparing statistical models.
  • For discrete probability distributions P and Q defined on the same probability space X, the KL divergence from Q to P is defined as (see the numeric sketch after this list):

    D_KL(P ‖ Q) = Σ_{x ∈ X} P(x) log( P(x) / Q(x) )
  • Properties: D_KL(P ‖ Q) ≥ 0, with equality if and only if P = Q (Gibbs' inequality); it is asymmetric, so in general D_KL(P ‖ Q) ≠ D_KL(Q ‖ P); and it does not satisfy the triangle inequality, so it is not a true metric.
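To make the definition concrete, here is a minimal Python sketch of the discrete formula above. The distributions p and q are made-up example values, and scipy.stats.entropy is used only as a cross-check, since with two arguments it computes the same relative entropy:

```python
import numpy as np
from scipy.stats import entropy

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as probability vectors (in nats)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over outcomes where P(x) > 0; the 0 * log(0) terms are 0 by convention.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Hypothetical example distributions over two outcomes.
p = [0.9, 0.1]
q = [0.5, 0.5]

print(kl_divergence(p, q))  # ~0.368 nats
print(kl_divergence(q, p))  # ~0.511 nats -> different value: KL divergence is asymmetric
print(entropy(p, q))        # scipy's relative entropy, matches the first value
```

Swapping the arguments illustrates the asymmetry property from the list above: D_KL(P ‖ Q) and D_KL(Q ‖ P) generally disagree.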
