Abstract
We present an exact Bayesian treatment of a simple, yet sufficiently general, probability distribution model. We consider piecewise-constant distributions P(X) with a uniform (second-order) prior over the locations of discontinuity points and the assigned chances. The predictive distribution and the model complexity can be determined completely from the data in computational time that is linear in the number of degrees of freedom and quadratic in the number of possible values of X. Furthermore, exact values of the expectations of entropies and their variances can be computed with polynomial effort. The expectation of the mutual information thus becomes available as well, along with a strict upper bound on its variance. The resulting algorithm is particularly useful in experimental research areas where the number of available samples is severely limited (e.g., neurophysiology). On a simulated data set, the resulting estimates are more accurate than those obtained with a previously proposed method.
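To make the complexity claim concrete, the sketch below computes the log evidence of an M-bin piecewise-constant model over K discrete values by dynamic programming, in O(M·K²) time (linear in the number of bins, quadratic in the number of values of X). It assumes a flat Dirichlet prior on the bin probabilities and a uniform prior over the C(K-1, M-1) boundary placements; the function name `log_evidence` and these prior choices are illustrative assumptions, a minimal reading of the bin-model idea rather than the paper's exact formulation.

```python
import numpy as np
from math import lgamma, log, comb

def log_evidence(counts, M):
    """Log marginal likelihood P(data | M bins) of a piecewise-constant
    distribution over K discrete values, summing over all boundary
    placements by dynamic programming.

    Illustrative assumptions: flat Dirichlet prior on the M bin
    probabilities, uniform prior over the C(K-1, M-1) placements."""
    K = len(counts)
    N = sum(counts)
    cum = np.concatenate(([0], np.cumsum(counts)))  # cum[j] = samples among first j values

    def log_g(i, j):
        # Log contribution of one bin covering values i..j-1: each of its
        # c samples lands uniformly on one of (j - i) values, times the
        # c! factor from the Dirichlet-multinomial over bin totals.
        c = cum[j] - cum[i]
        return -c * log(j - i) + lgamma(c + 1)

    # f[m][j]: log-sum of contributions of m bins covering the first j values.
    f = np.full((M + 1, K + 1), -np.inf)
    f[0][0] = 0.0
    for m in range(1, M + 1):
        for j in range(m, K + 1):
            terms = [f[m - 1][i] + log_g(i, j) for i in range(m - 1, j)]
            f[m][j] = np.logaddexp.reduce(terms)

    # Dirichlet-multinomial normalization (M-1)!/(N+M-1)! and the uniform
    # prior over boundary placements.
    return f[M][K] + lgamma(M) - lgamma(N + M) - log(comb(K - 1, M - 1))
```

Comparing `log_evidence(counts, M)` across values of M (under a flat prior over M) then gives a posterior over model complexity, which is the sense in which the complexity "can be determined completely from the data" in the abstract.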
| Original language | English |
| --- | --- |
| Pages (from-to) | 3766-3779 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 51 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - Nov 2005 |
Keywords
- Bayesian inference
- Entropy
- Model selection
- Mutual information