Article (Scientific journals)
A greedy EM algorithm for Gaussian mixture learning
Vlassis, Nikos; Likas, A.
2002, in Neural Processing Letters, 15 (1), p. 77-87
Peer reviewed
 

Details



Abstract :
[en] Learning a Gaussian mixture with a local algorithm like EM can be difficult because (i) the true number of mixing components is usually unknown, (ii) there is no generally accepted method for parameter initialization, and (iii) the algorithm can get trapped in one of the many local maxima of the likelihood function. In this paper we propose a greedy algorithm for learning a Gaussian mixture which tries to overcome these limitations. In particular, starting with a single component and adding components sequentially until a maximum number k is reached, the algorithm is capable of achieving solutions superior to EM with k components in terms of the likelihood of a test set. The algorithm is based on recent theoretical results on incremental mixture density estimation, and uses a combination of global and local search each time a new component is added to the mixture.
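The greedy scheme in the abstract can be sketched as follows. This is an illustrative 1-D sketch, not the authors' exact procedure: the candidate-search heuristic (percentile-based candidate means, fixed insertion weight of 0.1) and all names are assumptions; only the overall structure (start with one component, alternate a global search for a new component with local EM refinement) follows the abstract.

```python
# Hedged sketch of greedy EM for a 1-D Gaussian mixture.
# The candidate search and insertion weight are illustrative assumptions.
import numpy as np

def gauss_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def mixture_pdf(x, w, mu, var):
    # x: (n,), w/mu/var: (k,) -> mixture density at each x
    return np.sum(w[None, :] * gauss_pdf(x[:, None], mu[None, :], var[None, :]), axis=1)

def em(x, w, mu, var, iters=50):
    # Standard EM updates for a 1-D Gaussian mixture (the local search).
    for _ in range(iters):
        resp = w[None, :] * gauss_pdf(x[:, None], mu[None, :], var[None, :])
        resp /= resp.sum(axis=1, keepdims=True)
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu[None, :]) ** 2).sum(axis=0) / nk
        var = np.maximum(var, 1e-6)  # guard against variance collapse
    return w, mu, var

def greedy_em(x, k_max, iters=50):
    # Start with the single-component maximum-likelihood fit.
    w, mu, var = np.array([1.0]), np.array([x.mean()]), np.array([x.var()])
    for _ in range(1, k_max):
        # Global search (a simple stand-in): among candidate means taken
        # from data percentiles, pick the one that most improves the
        # log-likelihood when mixed in with a small weight.
        cands = np.percentile(x, np.linspace(5, 95, 19))
        base = mixture_pdf(x, w, mu, var)
        ll = [np.sum(np.log(0.9 * base + 0.1 * gauss_pdf(x, c, x.var() / 4)))
              for c in cands]
        c = cands[int(np.argmax(ll))]
        # Append the new component, then refine all parameters with EM.
        w = np.append(0.9 * w, 0.1)
        mu = np.append(mu, c)
        var = np.append(var, x.var() / 4)
        w, mu, var = em(x, w, mu, var, iters)
    return w, mu, var

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
w, mu, var = greedy_em(x, k_max=2)
print(sorted(mu))  # two means, near the true modes -3 and 3
```

Growing the mixture this way sidesteps the initialization problem: each EM run starts from an already good (k-1)-component solution plus one deliberately placed new component, rather than from a random k-component configuration.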
Disciplines :
Computer science
Identifiers :
UNILU:UL-ARTICLE-2011-743
Author, co-author :
Vlassis, Nikos ;  University of Luxembourg > Luxembourg Centre for Systems Biomedicine (LCSB)
Likas, A.
Language :
English
Title :
A greedy EM algorithm for Gaussian mixture learning
Publication date :
2002
Journal title :
Neural Processing Letters
ISSN :
1370-4621
Publisher :
Kluwer
Volume :
15
Issue :
1
Pages :
77-87
Peer reviewed :
Peer reviewed
Available on ORBilu :
since 17 November 2013

Statistics



Scopus citations® : 267 (247 without self-citations)
WoS citations : 204
