Article (Scientific journals)
Self-organizing mixture models
Verbeek, J. J.; Vlassis, Nikos; Krose, B. J. A.
2005, In Neurocomputing, 63, p. 99-123
Peer Reviewed verified by ORBi
 

Details



Keywords :
self-organizing maps; mixture model; EM algorithm
Abstract :
[en] We present an expectation-maximization (EM) algorithm that yields topology preserving maps of data based on probabilistic mixture models. Our approach is applicable to any mixture model for which we have a normal EM algorithm. Compared to other mixture model approaches to self-organizing maps (SOMs), the function our algorithm maximizes has a clear interpretation: it sums data log-likelihood and a penalty term that enforces self-organization. Our approach allows principled handling of missing data and learning of mixtures of SOMs. We present example applications illustrating our approach for continuous, discrete, and mixed discrete and continuous data. (C) 2004 Elsevier B.V. All rights reserved.
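The objective described in the abstract, data log-likelihood plus a penalty term that enforces self-organization, can be approximated with an EM loop whose E-step responsibilities are smoothed over a latent grid of components. The sketch below is only illustrative, not the paper's exact algorithm: the Gaussian components, the 1-D grid, and all function and parameter names (`som_mixture_em`, `sigma_h`) are assumptions made for this example.

```python
import numpy as np

def som_mixture_em(X, n_components=10, sigma_h=1.0, n_iter=50, seed=0):
    """Toy EM for a 1-D self-organizing Gaussian mixture.

    Responsibilities are smoothed with a neighborhood kernel over the
    component grid; this smoothing plays the role of the
    self-organization penalty described in the abstract.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, n_components, replace=False)].astype(float)
    var = np.full(n_components, X.var())
    # Neighborhood kernel between grid positions of the components,
    # normalized so each row sums to one.
    grid = np.arange(n_components)
    H = np.exp(-0.5 * (grid[:, None] - grid[None, :]) ** 2 / sigma_h ** 2)
    H /= H.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        # E-step: isotropic-Gaussian log-densities, then responsibilities.
        sq = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        logp = -0.5 * (sq / var + d * np.log(2 * np.pi * var))
        R = np.exp(logp - logp.max(axis=1, keepdims=True))
        R /= R.sum(axis=1, keepdims=True)
        # Smooth responsibilities across neighboring grid cells: this
        # couples nearby components and encourages topology preservation.
        Q = R @ H
        # M-step: weighted updates of means and variances.
        w = Q.sum(axis=0) + 1e-12
        means = (Q.T @ X) / w[:, None]
        sq = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        var = (Q * sq).sum(axis=0) / (d * w) + 1e-6
    return means, var
```

Fit on points along a curve, neighboring components tend to end up with nearby means, which is the "topology preserving" behavior the abstract refers to.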
Disciplines :
Computer science
Identifiers :
UNILU:UL-ARTICLE-2011-727
Author, co-author :
Verbeek, J. J.
Vlassis, Nikos; University of Luxembourg > Luxembourg Centre for Systems Biomedicine (LCSB)
Krose, B. J. A.
Language :
English
Title :
Self-organizing mixture models
Publication date :
2005
Journal title :
Neurocomputing
ISSN :
0925-2312
Publisher :
Elsevier Science, Amsterdam, Netherlands
Volume :
63
Pages :
99-123
Available on ORBilu :
since 17 November 2013

Statistics



Scopus citations®: 66 (64 without self-citations)
OpenCitations: 62
WoS citations: 52
