# em-tutorial




## A Gentle Tutorial of the EM Algorithm and its Application to Parameter Estimation for Gaussian Mixture and Hidden Markov Models

Jeff A. Bilmes (bilmes@cs.berkeley.edu)
International Computer Science Institute
Berkeley, CA 94704
and
Computer Science Division
Department of Electrical Engineering and Computer Science
U.C. Berkeley

TR-97-021
April 1998
### Abstract
We describe the maximum-likelihood parameter estimation problem and how the Expectation-Maximization (EM) algorithm can be used for its solution. We first describe the abstract form of the EM algorithm as it is often given in the literature. We then develop the EM parameter estimation procedure for two applications: 1) finding the parameters of a mixture of Gaussian densities, and 2) finding the parameters of a hidden Markov model (HMM) (i.e., the Baum-Welch algorithm) for both discrete and Gaussian mixture observation models. We derive the update equations in fairly explicit detail but we do not prove any convergence properties. We try to emphasize intuition rather than mathematical rigor.
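The first application the abstract mentions, fitting a mixture of Gaussian densities by EM, can be sketched in a few lines. The following is a minimal illustrative sketch, not the tutorial's own derivation: it assumes a two-component one-dimensional mixture, and the helper names (`normal_pdf`, `em_gmm`) are hypothetical.

```python
import math
import random

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(data, iters=50):
    """Fit a 2-component 1-D Gaussian mixture (weights, means, variances) by EM."""
    # Crude initialization: place the two means at the data extremes.
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate parameters from the responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return w, mu, var

# Two well-separated synthetic clusters, for illustration only.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(200)] + \
       [random.gauss(5.0, 1.0) for _ in range(200)]
w, mu, var = em_gmm(data)
```

Each iteration first computes, for every data point, the posterior probability that each component generated it (the E-step), then re-estimates each component's weight, mean, and variance as responsibility-weighted averages (the M-step); this is the scalar special case of the general update equations the tutorial derives.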