em-tutorial
15 Pages
English

Description





A Gentle Tutorial of the EM Algorithm
and its Application to Parameter
Estimation for Gaussian Mixture and
Hidden Markov Models
Jeff A. Bilmes (bilmes@cs.berkeley.edu)
International Computer Science Institute
Berkeley CA, 94704
and
Computer Science Division
Department of Electrical Engineering and Computer Science
U.C. Berkeley
TR-97-021
April 1998
Abstract
We describe the maximum-likelihood parameter estimation problem and how the Expectation-
Maximization (EM) algorithm can be used for its solution. We first describe the abstract
form of the EM algorithm as it is often given in the literature. We then develop the EM pa-
rameter estimation procedure for two applications: 1) finding the parameters of a mixture of
Gaussian densities, and 2) finding the parameters of a hidden Markov model (HMM) (i.e.,
the Baum-Welch algorithm) for both discrete and Gaussian mixture observation models.
We derive the update equations in fairly explicit detail but we do not prove any convergence properties. We try to emphasize intuition rather than mathematical rigor.
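As a concrete companion to the abstract's first application, here is a minimal sketch of EM for a one-dimensional Gaussian mixture. It is an illustration under simplifying assumptions (univariate components, deterministic quantile-based initialization), not the report's own derivation, which treats the general multivariate case; all variable names are ours.

```python
# Minimal EM for a 1-D Gaussian mixture: alternate an E-step
# (posterior responsibilities) with an M-step (closed-form re-estimates).
import math

def em_gmm_1d(data, k, iters=50):
    """Fit a k-component 1-D Gaussian mixture by EM.
    Returns (weights, means, variances)."""
    n = len(data)
    # Deterministic initialization: uniform weights, means at spread-out
    # quantiles of the data, global variance for every component.
    srt = sorted(data)
    weights = [1.0 / k] * k
    means = [srt[(2 * j + 1) * n // (2 * k)] for j in range(k)]
    mean_all = sum(data) / n
    variances = [sum((x - mean_all) ** 2 for x in data) / n] * k

    def pdf(x, mu, var):
        return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

    for _ in range(iters):
        # E-step: responsibility of each component for each data point.
        resp = []
        for x in data:
            p = [w * pdf(x, m, v) for w, m, v in zip(weights, means, variances)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        # M-step: weighted re-estimates from the responsibilities.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / n
            means[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            variances[j] = sum(r[j] * (x - means[j]) ** 2
                               for r, x in zip(resp, data)) / nj
    return weights, means, variances
```

Run on data drawn from two well-separated Gaussians, the recovered means and weights approach the per-component sample statistics; each iteration is guaranteed not to decrease the data likelihood, which is the convergence property the report discusses but does not prove.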