Learning local factor analysis versus mixture of factor analyzers with automatic model selection

L Shi, ZY Liu, S Tu, L Xu - Neurocomputing, 2014 - Elsevier
Abstract
By considering Factor Analysis (FA) for each component of a Gaussian Mixture Model (GMM), clustering and local dimensionality reduction can be addressed simultaneously by the Mixture of Factor Analyzers (MFA) and Local Factor Analysis (LFA), which correspond to two different FA parameterizations. This paper investigates the performance of Variational Bayes (VB) and Bayesian Ying-Yang (BYY) harmony learning on MFA/LFA for the problem of automatically determining the component number and the local hidden dimensionalities (i.e., the number of FA factors in each component). Analogous to the existing VB learning algorithm for MFA, we develop a VB algorithm for LFA with a similar conjugate Dirichlet–Normal–Gamma (DNG) prior on all LFA parameters. The corresponding BYY algorithms are also developed for MFA and LFA. A wide range of synthetic experiments shows that LFA is superior to MFA in model selection under either VB or BYY, while BYY outperforms VB reliably on both MFA and LFA. These empirical findings are consistently observed in real applications, covering not only face and handwritten digit image clustering but also unsupervised image segmentation.
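To make the model concrete, each MFA component models the data with a low-rank-plus-diagonal Gaussian, giving the mixture density p(x) = Σ_k π_k N(x | μ_k, A_k A_kᵀ + Ψ_k), where A_k holds the factor loadings and Ψ_k is a diagonal noise covariance. The NumPy sketch below, which is not taken from the paper and does not implement its VB or BYY learning algorithms, simply evaluates this standard MFA log-density for hypothetical parameters; the function and variable names are illustrative only.

```python
import numpy as np
from scipy.stats import multivariate_normal

def mfa_log_density(X, pis, mus, loadings, noise_vars):
    """Log-density of a Mixture of Factor Analyzers.

    Component k models x ~ N(mu_k, A_k A_k^T + Psi_k), where A_k (d x m_k)
    holds the factor loadings, Psi_k is a diagonal noise covariance built
    from noise_vars[k], and pis are the mixing weights.
    """
    comp_logs = []
    for pi_k, mu_k, A_k, psi_k in zip(pis, mus, loadings, noise_vars):
        cov_k = A_k @ A_k.T + np.diag(psi_k)  # low-rank + diagonal covariance
        comp_logs.append(np.log(pi_k) +
                         multivariate_normal.logpdf(X, mean=mu_k, cov=cov_k))
    # log-sum-exp over components for numerical stability
    comp_logs = np.stack(comp_logs, axis=0)  # shape (K, n)
    m = comp_logs.max(axis=0)
    return m + np.log(np.exp(comp_logs - m).sum(axis=0))

# Toy usage with hypothetical parameters: 2 components, 3-D data, 1 factor each.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
pis = [0.6, 0.4]
mus = [np.zeros(3), np.ones(3)]
loadings = [rng.normal(size=(3, 1)), rng.normal(size=(3, 1))]
noise_vars = [np.full(3, 0.1), np.full(3, 0.2)]
print(mfa_log_density(X, pis, mus, loadings, noise_vars))
```

The sketch only illustrates the shared generative form that MFA and LFA parameterize differently; determining the number of components and per-component factors automatically is what the paper's VB and BYY procedures address.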