Kullback–Leibler divergence

Lingling Yang
1 min read · Mar 31, 2020


  • A distribution-wise asymmetric measure of how one probability distribution differs from a second, reference probability distribution.
  • Applications: relative (Shannon) entropy in information theory, characterizing randomness in continuous time series, and measuring information gain when comparing statistical models for inference.
  • For discrete probability distributions P and Q defined on the same probability space $\mathcal{X}$, the KL divergence from Q to P is defined as (a minimal numerical sketch follows this list):

    $$D_{\mathrm{KL}}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}$$
  • Properties: $D_{\mathrm{KL}}(P \parallel Q) \ge 0$, with equality if and only if $P = Q$ almost everywhere (Gibbs' inequality); it is asymmetric, so in general $D_{\mathrm{KL}}(P \parallel Q) \ne D_{\mathrm{KL}}(Q \parallel P)$, and it does not satisfy the triangle inequality, so it is not a true metric.
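
To make the definition concrete, here is a minimal NumPy sketch of the discrete formula above; the helper name `kl_divergence` and the example distributions are my own illustrative choices, not a standard library API:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) in nats.

    p and q are probability vectors over the same finite space;
    q must be nonzero wherever p is nonzero.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with P(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Asymmetry: the divergence from Q to P differs from P to Q.
p = [0.5, 0.4, 0.1]
q = [0.6, 0.2, 0.2]
print(kl_divergence(p, q))  # D_KL(P || Q) ≈ 0.117
print(kl_divergence(q, p))  # D_KL(Q || P) ≈ 0.109 -- not equal
```

Running the two calls with the arguments swapped makes the asymmetry property above directly visible: the two values disagree even for these nearby distributions.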
