
# relative entropy

Let $p$ and $q$ be probability distributions with supports $\mathcal{X}$ and $\mathcal{Y}$ respectively, where $\mathcal{X}\subset\mathcal{Y}$. The *relative entropy* or *Kullback-Leibler distance* between $p$ and $q$ is defined as

$$D(p||q):=\sum_{x\in\mathcal{X}}p(x)\log\frac{p(x)}{q(x)}. \tag{1}$$

Although $D(p||q)$ is often called a distance, it is not a true metric: it is not symmetric, and it does not satisfy the triangle inequality. It does, however, satisfy $D(p||q)\geq 0$, with equality if and only if $p=q$. Nonnegativity follows from the computation

$$\begin{aligned}
-D(p||q) &= -\sum_{x\in\mathcal{X}}p(x)\log\frac{p(x)}{q(x)} \\
&= \sum_{x\in\mathcal{X}}p(x)\log\frac{q(x)}{p(x)} \\
&\leq \log\left(\sum_{x\in\mathcal{X}}p(x)\frac{q(x)}{p(x)}\right) \\
&= \log\left(\sum_{x\in\mathcal{X}}q(x)\right) \\
&\leq \log\left(\sum_{x\in\mathcal{Y}}q(x)\right) \\
&= 0,
\end{aligned}$$

where the first inequality is Jensen's inequality applied to the concave function $\log(x)$, and the second follows from extending the sum from the support of $p$ to the support of $q$: the added terms $q(x)$ for $x\in\mathcal{Y}\setminus\mathcal{X}$ are nonnegative, so the sum can only increase.
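Definition (1) is easy to evaluate directly. The short Python sketch below (the `kl_divergence` helper and the coin distributions are illustrative, not standard library code) computes $D(p||q)$ in nats for two distributions on $\{H,T\}$ and exhibits both the asymmetry and the nonnegativity discussed above.

```python
import math

def kl_divergence(p, q):
    """Compute D(p||q) = sum_x p(x) log(p(x)/q(x)) in nats.

    p and q are dicts mapping outcomes to probabilities; the support of p
    must be contained in the support of q (q(x) > 0 wherever p(x) > 0).
    """
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

# Example: a biased coin p versus a fair coin q.
p = {"H": 0.75, "T": 0.25}
q = {"H": 0.5, "T": 0.5}

print(kl_divergence(p, q))  # positive, since p != q
print(kl_divergence(q, p))  # differs from the above: D is not symmetric
print(kl_divergence(p, p))  # 0.0: equality holds iff p = q
```

Note the `px > 0` guard: terms with $p(x)=0$ contribute nothing to the sum (by the convention $0\log 0 = 0$), and skipping them avoids evaluating $\log 0$.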

Relative entropy also has a continuous version, which looks just as one might expect. For continuous distributions $f$ and $g$, with $\mathcal{S}$ the support of $f$, we have

$$D(f||g):=\int_{\mathcal{S}}f\log\frac{f}{g}. \tag{8}$$
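As a concrete continuous example, the relative entropy between two Gaussians $f=N(\mu_1,\sigma_1^2)$ and $g=N(\mu_2,\sigma_2^2)$ has the well-known closed form $\log(\sigma_2/\sigma_1)+\bigl(\sigma_1^2+(\mu_1-\mu_2)^2\bigr)/(2\sigma_2^2)-\tfrac{1}{2}$. The sketch below (function names are illustrative) checks this formula against a direct midpoint-rule approximation of the integral in (8).

```python
import math

def gaussian_kl(mu1, sigma1, mu2, sigma2):
    """Closed-form D(f||g) for f = N(mu1, sigma1^2), g = N(mu2, sigma2^2)."""
    return (math.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

def numerical_kl(mu1, sigma1, mu2, sigma2, lo=-20.0, hi=20.0, n=40000):
    """Midpoint-rule approximation of the integral of f log(f/g)."""
    def pdf(x, mu, sigma):
        return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = pdf(x, mu1, sigma1)
        if f > 0:  # points outside the (numerical) support of f contribute 0
            total += f * math.log(f / pdf(x, mu2, sigma2)) * dx
    return total

closed = gaussian_kl(0.0, 1.0, 1.0, 2.0)
approx = numerical_kl(0.0, 1.0, 1.0, 2.0)
print(closed, approx)  # the two values agree to several decimal places
```

The numerical integral is truncated to $[-20, 20]$; the Gaussian tails beyond that range are far below machine precision, so the truncation error is negligible here.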

## Mathematics Subject Classification

60E05, 94A17
