Abstract
We introduce a metric for probability distributions, which is bounded, information-theoretically motivated, and has a natural Bayesian interpretation. The square root of the well-known χ² distance is an asymptotic approximation to it. Moreover, it is a close relative of the capacitory discrimination and Jensen–Shannon divergence.
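The metric described above is commonly given as the square root of twice the Jensen–Shannon divergence (with natural logarithms). A minimal sketch of that quantity, assuming this standard form (the function name and exact normalization here are this sketch's choices, not quoted from the paper):

```python
import math

def endres_schindelin_distance(p, q):
    """Distance between two discrete distributions p and q, computed as
    sqrt(2 * JS(p, q)) with natural logs, i.e.
    sqrt(sum_i p_i*ln(p_i/m_i) + q_i*ln(q_i/m_i)) where m = (p + q)/2.
    A sketch based on the commonly cited form of the metric."""
    total = 0.0
    for pi, qi in zip(p, q):
        m = 0.5 * (pi + qi)
        # Terms with zero mass contribute 0 (0 * ln 0 := 0).
        if pi > 0:
            total += pi * math.log(pi / m)
        if qi > 0:
            total += qi * math.log(qi / m)
    return math.sqrt(total)
```

Under this convention the distance is bounded by sqrt(2 ln 2), attained for distributions with disjoint support, consistent with the boundedness claimed in the abstract.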
| Original language | English |
|---|---|
| Pages (from-to) | 1858-1860 |
| Number of pages | 3 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 49 |
| Issue number | 7 |
| DOIs | |
| Publication status | Published - Jul 2003 |
Keywords
- Capacitory discrimination
- χ² distance
- Jensen-Shannon divergence
- Metric
- Triangle inequality
- Discrimination
- Information
- Divergence