Keywords (7)
Convergence Rate
EM Algorithm
Expectation Maximization
Gaussian Mixture
Iterative Algorithm
Local Convergence
Higher Order
Related Publications (5)
An Efficient k-Means Clustering Algorithm: Analysis and Implementation
Convergence results for the EM approach to mixtures of experts architectures
On Convergence Properties of the EM Algorithm for Gaussian Mixtures
The EM Algorithm and Extensions
Mixture densities, maximum likelihood, and the EM algorithm
Asymptotic Convergence Rate of the EM Algorithm for Gaussian Mixtures (Citations: 26)
Jinwen Ma, Lei Xu, Michael I. Jordan
It is well known that the convergence rate of the expectation-maximization (EM) algorithm can be faster than those of conventional first-order iterative algorithms when the overlap in the given mixture is small. But this argument has not been mathematically proved yet. This article studies this problem asymptotically in the setting of Gaussian mixtures under the theoretical framework of Xu and Jordan (1996). It has been proved that the asymptotic convergence rate of the EM algorithm for Gaussian mixtures locally around the true solution Θ* is o(e^{0.5−ε}(Θ*)), where ε > 0 is an arbitrarily small number, o(x) means that it is a higher-order infinitesimal as x → 0, and e(Θ*) is a measure of the average overlap of the Gaussians in the mixture. In other words, the large-sample local convergence rate for the EM algorithm tends to be asymptotically superlinear when e(Θ*) tends to zero.
Journal: Neural Computation - NECO, vol. 12, no. 12, pp. 2881-2907, 2001
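The behavior described in the abstract is easy to observe numerically. The sketch below (not from the paper; all names and parameter choices are my own illustration) runs EM on a well-separated two-component one-dimensional Gaussian mixture and prints the ratio of successive parameter steps, which approximates the local linear convergence rate. With small overlap the ratio stays close to zero, consistent with the nearly superlinear behavior the paper proves; with heavily overlapping components the same code would show a ratio close to one (slow, linear convergence).

```python
import math
import random

random.seed(0)

# Synthetic data: two well-separated 1-D Gaussians (small overlap e(theta*)).
data = ([random.gauss(-4.0, 1.0) for _ in range(500)] +
        [random.gauss(4.0, 1.0) for _ in range(500)])

def norm_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_step(data, pi, mu, var):
    """One EM iteration for a 2-component Gaussian mixture.

    pi is the weight of component 0; mu and var are 2-element lists.
    """
    # E-step: posterior responsibility of component 0 for each point.
    r0 = []
    for x in data:
        p0 = pi * norm_pdf(x, mu[0], var[0])
        p1 = (1.0 - pi) * norm_pdf(x, mu[1], var[1])
        r0.append(p0 / (p0 + p1))
    # M-step: responsibility-weighted re-estimates of weight, means, variances.
    n0 = sum(r0)
    n1 = len(data) - n0
    mu_new = [sum(r * x for r, x in zip(r0, data)) / n0,
              sum((1.0 - r) * x for r, x in zip(r0, data)) / n1]
    var_new = [sum(r * (x - mu_new[0]) ** 2 for r, x in zip(r0, data)) / n0,
               sum((1.0 - r) * (x - mu_new[1]) ** 2 for r, x in zip(r0, data)) / n1]
    return n0 / len(data), mu_new, var_new

pi, mu, var = 0.5, [-1.0, 1.0], [2.0, 2.0]
prev_delta = None
for it in range(30):
    pi, new_mu, var = em_step(data, pi, mu, var)
    delta = abs(new_mu[0] - mu[0]) + abs(new_mu[1] - mu[1])
    mu = new_mu
    if prev_delta and delta > 0:
        # delta/prev_delta estimates the local linear rate; near 0 here
        # because the components barely overlap.
        print(f"iter {it}: step size {delta:.3e}, rate estimate {delta / prev_delta:.3f}")
    prev_delta = delta
    if delta < 1e-12:
        break

print(f"estimated means: {mu[0]:.3f}, {mu[1]:.3f}, weight: {pi:.3f}")
```

Shrinking the gap between the two true means (e.g. from ±4 to ±0.5) increases the overlap measure and drives the printed rate estimate toward one, which is the contrast the paper quantifies.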
View Publication
The following links allow you to view full publications. These links are maintained by other sources not affiliated with Microsoft Academic Search.
(www.cse.cuhk.edu.hk)
(www.informatik.uni-trier.de)
Citation Context (11)
...Recent theoretical investigations on the asymptotic convergence properties of the EM algorithms for gaussian mixtures and the mixtures of densities from exponential families (Xu & Jordan, 1996; Xu, 1997; Ma, Xu, & Jordan, 2000; Ma & Xu, 2005; Ma & Fu, 2005) provide a new view on the EM algorithm for ME...
Yan Yang, et al.
Asymptotic Convergence Properties of the EM Algorithm for Mixture of E...
...With a particular configuration and algorithm, convergence properties are known [29, 36], yet such a specific result does not apply to the more general use of ME, which has proven of benefit for different cognitive simulations [14,15,26]...
Mike W. Shields, et al.
A theoretical framework for multiple neural network systems
...In the past decades, the EM algorithm (Dempster et al., 1977) has become the most widely used method for the problem of learning the Gaussian mixture model (Ma et al., 2001; Jordan & Xu, 1995; McLachlan & Krishnan, 1996)...
...The convergence properties of the EM algorithm over GMM have been extensively studied in (Xu & Jordan, 1996; Ma et al., 2001)...
Zhenjie Zhang, et al.
Estimating local optimums in EM algorithm over Gaussian mixture model
...For instance, there is still a small error even after 62 iterations in Fig. 3. The convergence rate of iterative fixed posteriori approximation, a type of EM algorithm, is generally linear [13] while the conjugate gradient approach has at least a linear rate...
...This may be because the Gaussian approximation usually makes the initialization close to the true result, and thus echoes the conclusions in [13] that state "for Gaussian mixtures locally around the true solution and when the overlap in the mixture is small the convergence rate for the EM algorithm tends to be asymptotically superlinear." That is, the iterative fixed posteriori approximation here shares this nature...
Zhi-Yong Liu, et al.
Investigations on non-Gaussian factor analysis
...Substantial literature has been devoted to the study of the EM algorithm, finding that it is generally a first-order or linearly convergent algorithm [24]-[26]...
Sin-Horng Chen, et al.
A new duration modeling approach for Mandarin speech
References (16)
Hierarchical Mixtures of Experts and the EM Algorithm (Citations: 1286)
Michael I. Jordan, Robert A. Jacobs
Journal: Neural Computation - NECO, vol. 6, no. 2, pp. 181-214, 1994
A gradient algorithm locally equivalent to the EM algorithm (Citations: 105)
K. Lange
Published in 1995.
On Convergence Properties of the EM Algorithm for Gaussian Mixtures (Citations: 239)
Lei Xu, Michael I. Jordan
Journal: Neural Computation - NECO, vol. 8, no. 1, pp. 129-151, 1996
Maximum likelihood from incomplete data via the EM algorithm (Citations: 14070)
Arthur P. Dempster, Nan M. Laird, Donald B. Rubin
Published in 1977.
Acceleration of the EM Algorithm by Using Quasi-Newton Methods (Citations: 84)
Mortaza Jamshidian, Robert I. Jennrich
Journal: Journal of the Royal Statistical Society, Series B (Statistical Methodology) - J ROY STAT SOC SER B-STAT MET, vol. 59, no. 3, pp. 569-587, 1997
Citations (26)
Asymptotic Convergence Properties of the EM Algorithm for Mixture of Experts
Yan Yang, Jinwen Ma
Journal: Neural Computation - NECO, vol. 23, no. 8, pp. 2140-2168, 2011
A distributed EM algorithm to estimate the parameters of a finite mixture of components (Citations: 5)
Behrooz Safarinejadian, Mohammad B. Menhaj, Mehdi Karrari
Journal: Knowledge and Information Systems - KAIS, vol. 23, no. 3, pp. 267-292, 2010
Singularity and Slow Convergence of the EM Algorithm for Gaussian Mixtures (Citations: 3)
Hyeyoung Park, Tomoko Ozeki
Journal: Neural Processing Letters - NPL, vol. 29, no. 1, pp. 45-59, 2009
A Single Loop EM Algorithm for the Mixture of Experts Architecture (Citations: 2)
Yan Yang, Jinwen Ma
Conference: International Symposium on Neural Networks - ISNN, pp. 959-968, 2009
A theoretical framework for multiple neural network systems (Citations: 5)
Mike W. Shields, Matthew C. Casey
Journal: Neurocomputing - IJON, vol. 71, no. 7-9, pp. 1462-1476, 2008