Asymptotic Convergence Rate of the EM Algorithm for Gaussian Mixtures

Jinwen Ma, Lei Xu, Michael I. Jordan

Citations: 26
It is well known that the convergence rate of the expectation-maximization (EM) algorithm can be faster than those of conventional first-order iterative algorithms when the overlap in the given mixture is small. But this argument has not been mathematically proved yet. This article studies this problem asymptotically in the setting of gaussian mixtures under the theoretical framework of Xu and Jordan (1996). It is proved that the asymptotic convergence rate of the EM algorithm for gaussian mixtures locally around the true solution Θ* is o(e^{0.5−ε}(Θ*)), where ε > 0 is an arbitrarily small number, o(x) means a higher-order infinitesimal as x → 0, and e(Θ*) is a measure of the average overlap of the gaussians in the mixture. In other words, the large-sample local convergence rate of the EM algorithm tends to be asymptotically superlinear as e(Θ*) tends to zero.
Journal: Neural Computation (NECO), vol. 12, no. 12, pp. 2881-2907, 2000
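As an illustrative sketch only (not code from the paper), the following NumPy script runs plain EM on a simulated two-component one-dimensional gaussian mixture and prints the ratios of successive parameter errors, a rough empirical proxy for the local convergence rate discussed in the abstract. The simulated data, the quantile-based initialization, and the helper names simulate and em_gmm_1d are assumptions made for this sketch.

# Illustrative sketch (not the paper's code): plain EM for a two-component
# 1-D gaussian mixture, tracking the ratio of successive parameter errors as
# a rough empirical proxy for the local convergence rate.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, means, stds, weights):
    """Draw n samples from a 1-D gaussian mixture."""
    z = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[z], stds[z])

def em_gmm_1d(x, n_iter=500, tol=1e-12):
    """Plain EM for a two-component 1-D gaussian mixture.

    Returns the parameter trajectory (means, variances, weights) at every
    iteration so the empirical convergence rate can be inspected.
    """
    mu = np.quantile(x, [0.25, 0.75])   # crude initialization (an assumption)
    var = np.full(2, x.var())
    w = np.full(2, 0.5)
    history = []
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood updates.
        nk = resp.sum(axis=0)
        mu_new = (resp * x[:, None]).sum(axis=0) / nk
        var_new = (resp * (x[:, None] - mu_new) ** 2).sum(axis=0) / nk
        w_new = nk / len(x)
        history.append(np.concatenate([mu_new, var_new, w_new]))
        if np.abs(mu_new - mu).max() < tol:
            break
        mu, var, w = mu_new, var_new, w_new
    return np.array(history)

# Well-separated components, i.e. small overlap in the sense of the abstract.
x = simulate(20000, means=np.array([-2.0, 2.0]),
             stds=np.array([1.0, 1.0]), weights=np.array([0.5, 0.5]))
traj = em_gmm_1d(x)
theta_hat = traj[-1]                    # treat the final iterate as the limit
err = np.linalg.norm(traj - theta_hat, axis=1)
# Ratios ||theta_{t+1} - theta_hat|| / ||theta_t - theta_hat||: values well
# below 1 suggest the fast local convergence attributed to small overlap.
ratios = err[1:-1] / err[:-2]           # skip the final (zero-error) entry
print("error ratios over the first EM steps:", np.round(ratios[:5], 4))

With well-separated components (small overlap) the printed ratios fall well below 1, which is the qualitative behavior the abstract formalizes; the asymptotic o(e^{0.5−ε}(Θ*)) statement itself concerns the exact EM mapping around the true solution and is of course not established by a simulation like this.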
Citation Context:
    • ...Recent theoretical investigations on the asymptotic convergence properties of the EM algorithms for gaussian mixtures and the mixtures of densities from exponential families (Xu & Jordan, 1996; Xu, 1997; Ma, Xu, & Jordan, 2000; Ma & Xu, 2005; Ma & Fu, 2005) provide a new view on the EM algorithm for ME...

    Yan Yang et al. Asymptotic Convergence Properties of the EM Algorithm for Mixture of E...

    • ...With a particular configuration and algorithm, convergence properties are known [29,36], yet such a specific result does not apply to the more general use of ME, which has proven of benefit for different cognitive simulations [14,15,26]...

    Mike W. Shields et al. A theoretical framework for multiple neural network systems

    • ...In the past decades, the EM algorithm (Dempster et al., 1977) has become the most widely used method for learning the Gaussian Mixture Model (Ma et al., 2001; Jordan & Xu, 1995; McLachlan & Krishnan, 1996)...
    • ...The convergence properties of EM algorithm over GMM have been extensively studied in (Xu & Jordan, 1996; Ma et al., 2001)...

    Zhenjie Zhang et al. Estimating local optimums in EM algorithm over Gaussian mixture model

    • ...For instance, there is still a small error even after 62 iterations in Fig. 3. The convergence rate of iterative fixed posteriori approximation, a type of EM-algorithm, is generally linear [13], while the conjugate gradient approach has at least a linear rate...
    • ...This may be because the Gaussian approximation usually makes the initialization close to the true result, and thus echoes the conclusions in [13] that state "for Gaussian mixtures locally around the true solution and when the overlap in the mixture is small the convergence rate for the EM-algorithm tends to be asymptotically superlinear." That is, the iterative fixed posteriori approximation here shares this nature (see the sketch below)...

    Zhi-Yong Liu et al. Investigations on non-Gaussian factor analysis
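As a quick numerical illustration of the overlap effect quoted above (an added sketch, not an experiment from any of the cited papers), the snippet below fits scikit-learn's EM-based GaussianMixture to a well-separated and to a heavily overlapping two-component sample and compares the number of EM iterations needed to reach the same tolerance. The separations 8.0 and 1.0, the sample size, and the tolerance are arbitrary choices for illustration.

# Illustrative sketch: EM iteration counts for a well-separated vs. a heavily
# overlapping 1-D two-component gaussian mixture, using scikit-learn's
# EM-based GaussianMixture.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

def sample(separation, n=10000):
    """Equal-weight mixture of N(-separation/2, 1) and N(+separation/2, 1)."""
    z = rng.integers(0, 2, size=n)
    centers = np.where(z == 0, -separation / 2.0, separation / 2.0)
    return (centers + rng.normal(size=n)).reshape(-1, 1)

for sep, label in [(8.0, "well separated (small overlap)"),
                   (1.0, "heavily overlapping")]:
    gm = GaussianMixture(n_components=2, tol=1e-8, max_iter=1000,
                         random_state=0).fit(sample(sep))
    # n_iter_ is the number of EM steps the fit took to reach `tol`.
    print(f"{label}: {gm.n_iter_} EM iterations, converged={gm.converged_}")

The typical outcome, fewer EM iterations for the well-separated (small-overlap) sample, matches the quoted conclusion qualitatively, although the exact counts also depend on the k-means initialization scikit-learn uses and on the stopping tolerance.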

    • ...A substantial literature has been devoted to the study of the EM algorithm, finding that it is generally a first-order or linearly convergent algorithm [24]-[26]...

    Sin-horng Chen et al. A new duration modeling approach for Mandarin speech
