Abstract
In this paper we address the problem of learning Gaussian Mixture Models (GMMs) incrementally. Unlike previous approaches, which universally assume that new data arrive in blocks representable by GMMs that are then merged with the current model estimate, our method handles the case in which novel data points arrive one by one, while requiring little additional memory. We keep only two GMMs in memory and no historical data. The current fit is updated under the assumption that the number of components is fixed; this number is increased (or reduced) once sufficient evidence for a new component is seen. Such evidence is deduced from the change relative to the oldest fit of the same complexity, termed the Historical GMM, a concept central to our method. The performance of the proposed method is demonstrated qualitatively and quantitatively on several synthetic data sets and on video sequences of faces acquired in realistic imaging conditions.
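The abstract does not give the paper's update equations, so the following is only a minimal sketch of the general idea it describes: a GMM updated one sample at a time, with a frozen snapshot playing the role of the Historical GMM against which drift is measured. The class name `OnlineGMM`, the diagonal-covariance restriction, the decaying step-size schedule, the `drift_from` score, and the threshold are all illustrative assumptions, not the authors' method.

```python
import numpy as np

class OnlineGMM:
    """One-sample-at-a-time GMM with diagonal covariances (illustrative sketch)."""

    def __init__(self, means, variances, weights):
        self.means = np.asarray(means, dtype=float)      # shape (K, D)
        self.vars = np.asarray(variances, dtype=float)   # shape (K, D)
        self.weights = np.asarray(weights, dtype=float)  # shape (K,)
        self.n_seen = 0

    def _responsibilities(self, x):
        # Posterior probability of each component given x (diagonal Gaussians).
        diff2 = (x - self.means) ** 2 / self.vars
        log_p = -0.5 * (diff2.sum(axis=1) + np.log(2.0 * np.pi * self.vars).sum(axis=1))
        p = self.weights * np.exp(log_p - log_p.max())   # stabilised
        return p / p.sum()

    def update(self, x):
        # Single stochastic-EM-style step with a decaying step size (assumed schedule).
        x = np.asarray(x, dtype=float)
        self.n_seen += 1
        lr = 1.0 / self.n_seen
        r = self._responsibilities(x)
        self.weights = (1.0 - lr) * self.weights + lr * r
        step = (lr * r / np.maximum(self.weights, 1e-12))[:, None]
        diff = x - self.means
        self.means += step * diff
        self.vars = np.maximum(self.vars + step * (diff ** 2 - self.vars), 1e-6)

    def snapshot(self):
        # Frozen copy, playing the role of the stored historical model.
        return OnlineGMM(self.means.copy(), self.vars.copy(), self.weights.copy())

    def drift_from(self, other):
        # Crude drift score: weighted mean shift in units of standard deviation.
        z = np.abs(self.means - other.means) / np.sqrt(other.vars)
        return float((self.weights[:, None] * z).sum())
```

A usage sketch, again with a hypothetical threshold standing in for the paper's evidence test for changing the number of components:

```python
rng = np.random.default_rng(0)
gmm = OnlineGMM(means=[[0.0], [5.0]], variances=[[1.0], [1.0]], weights=[0.5, 0.5])
historical = gmm.snapshot()
for x in rng.normal(8.0, 1.0, size=500):   # stream drifts toward a new mode
    gmm.update([x])
if gmm.drift_from(historical) > 2.0:       # hypothetical threshold
    print("evidence of change: consider adding/removing a component")
```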
| Original language | English |
| --- | --- |
| Title of host publication | BMVC 2005 - Proceedings of the British Machine Vision Conference 2005 |
| Publisher | British Machine Vision Association, BMVA |
| DOIs | |
| Publication status | Published - 2005 |
| Event | 2005 16th British Machine Vision Conference, BMVC 2005 - Oxford, United Kingdom (5 Sept 2005 → 8 Sept 2005) |
Conference
| Conference | 2005 16th British Machine Vision Conference, BMVC 2005 |
| --- | --- |
| Country/Territory | United Kingdom |
| City | Oxford |
| Period | 5/09/05 → 8/09/05 |